Is Peer Assessment in USNWR Rankings based on Undergrad or Grad Reputation?

<p>I checked the National Academies of Science and Engineering member directories and pulled some university faculty membership numbers...Here's the data (I think the database hasn't been updated with new members announced this year).</p>

<p>Institution, # NAS members, # NAE members, total, PA score
MIT, 105, 110, 215, 4.9
Stanford, 126, 87, 213, 4.9
Berkeley, 128, 75, 203, 4.8
Harvard, 150, 18, 168, 4.9
Caltech, 73, 30, 103, 4.7
Princeton, 71, 21, 92, 4.9
Wisconsin, 44, 18, 62, 4.1
UCLA, 28, 17, 45, 4.2
Michigan, 19, 22, 41, 4.5
U Chicago, 38, 1, 39, 4.6
USC, 10, 19, 29, 4.0
WUSTL, 16, 3, 19, 4.1
Rice, 3, 12, 15, 4.0
Vanderbilt, 4, 1, 5, 4.0
Emory, 2, 0, 2, 4.0
Notre Dame, 0, 2, 2, 3.9
Georgetown, 0, 0, 0, 4.0</p>

<p>What are the numbers for Duke and Northwestern? :)</p>

<p>^ Hi Sam...:)</p>

<p>Northwestern, 16, 20, 36
Duke, 18, 3, 21</p>

<p>""The definition and application of class rank varies so much from school to school and state to state it's hardly useful. The average UW GPA for Wisconsin and Michigan is 3.77 and 3.75 respectively but UW reports 60% in top 10% and UM 90%. Something seems off."</p>

<p>Isn't it nice that US News uses the top 10% in their rankings? There is an inherent conflict of interest at worst, and at best a lack of standardization, in using this metric with no effort to standardize the reported scores. The more I learn about the US News rankings, the more I am amazed at their lack of common sense in standardizing the data so a number can be directly and accurately compared between any two schools in the survey. They make no attempt to adjust salaries based on location either. I think if a NEUTRAL statistician were to examine their methodology, they would be forced to the conclusion that it's pretty much "Bush League". It makes you wonder if this is oversight or dubious intentions. It might make US News look less credible if they all of a sudden ranked Wisconsin above Michigan, even though that might well be the result if standardized data were fed into their formulas.</p>

<p>
[quote]
They make no attempt to adjust salaries based on location either. I think if a NEUTRAL statistician were to examine their methodology, they would be forced to the conclusion that it's pretty much "Bush League".

[/quote]

Give 'em a bit more credit, Tom...USNews claims they do adjust salaries for location:

[quote]
Faculty compensation. The average faculty pay and benefits are adjusted for regional differences in cost of living. This includes full-time assistant, associate, and full professors. The values are taken for the 2005-2006 and 2006-2007 academic years and then averaged. (The regional differences in cost of living are taken from indexes from Runzheimer International.)

[/quote]
</p>

<p>The real problem is this:

[quote]
Expenditures per student. Financial resources are measured by the average spending per full-time-equivalent student on instruction, research, public service, academic support, student services, institutional support, and operations

[/quote]

If this is the case and they include research expenditures, universities without medical schools are at a disadvantage.
USNews makes no claim of removing graduate-program expenditures. My guess is that doing so would be too time-consuming, and not enough detail is provided to do it...</p>

<p>The category rewards spending inefficiencies.</p>

<p>Major faculty awards won will vary a bit from year to year, and the count gives more weight to the liberal arts side. For the most recent year:</p>

<p>Top Institutions in Faculty Awards (2006)
Institution / Number of Awards / National Rank / Control Rank / Institutional Control
University of Michigan - Ann Arbor 51 1 1 Public
Harvard University 50 2 1 Private
Johns Hopkins University 49 3 2 Private
Yale University 48 4 3 Private
University of California - San Diego 48 4 2 Public
Stanford University 45 6 4 Private
University of California - Berkeley 44 7 3 Public
University of Wisconsin - Madison 42 8 4 Public
University of California - Los Angeles 42 8 4 Public
Duke University 40 10 5 Private
Massachusetts Institute of Technology 38 11 6 Private
Northwestern University 35 12 7 Private
Columbia University 35 12 7 Private
New York University 34 14 9 Private
University of Pennsylvania 31 15 10 Private
University of California - San Francisco 30 16 6 Public
Pennsylvania State University - University Park 28 17 7 Public
Princeton University 28 17 11 Private
University of Washington - Seattle 27 19 8 Public
University of Texas - Austin 27 19 8 Public
Washington University in St. Louis 26 21 12 Private
University of North Carolina - Chapel Hill 26 21 10 Public
Vanderbilt University 25 23 13 Private
University of Florida 25 23 11 Public
University of Chicago 25 23 13 Private
University of Minnesota - Twin Cities 24 26 12 Public
University of Pittsburgh - Pittsburgh 23 27 13 Public
University of California - Irvine 22 28 14 Public
University of Illinois - Urbana-Champaign 22 28 14 Public
University of Southern California 21 30 15 Private
Cornell University 21 30 15 Private
University of Utah 21 30 16 Public
Emory University 21 30 15 Private
University of California - Davis 20 34 17 Public
University of Massachusetts - Amherst 19 35 18 Public
University of Arizona 18 36 19 Public
Carnegie Mellon University 18 36 18 Private
University of Notre Dame 17 38 19 Private
University of Maryland - College Park 17 38 20 Public
Ohio State University - Columbus 17 38 20 Public
Texas A&M University 16 41 22 Public
University of Virginia 16 41 22 Public
Scripps Research Institute 16 41 20 Private
University of Colorado - Boulder 15 44 24 Public
University of Oregon 15 44 24 Public
Rutgers the State University of NJ - New Brunswick 15 44 24 Public
Boston University 14 47 21 Private
University of California - Santa Barbara 14 47 27 Public
California Institute of Technology 14 47 21 Private
Purdue University - West Lafayette 14 47 27 Public</p>

<p>Awards counted:
• Beckman Young Investigators, 2006
• Burroughs Wellcome Fund Career Awards, 2006
• Cottrell Scholars, 2006
• Fulbright American Scholars, 2006-07
• Getty Scholars in Residence, 2006-07
• Guggenheim Fellows, 2006
• Howard Hughes Medical Institute Investigators, 2006
• Lasker Medical Research Awards, 2006
• MacArthur Foundation Fellows, 2006
• Andrew W. Mellon Foundation Distinguished Achievement Awards, 2006
• National Endowment for the Humanities (NEH) Fellows, 2007
• National Humanities Center Fellows, 2006-07
• National Institutes of Health (NIH) MERIT (R37), FY 2006
• National Medal of Science and National Medal of Technology, 2005
• NSF CAREER awards (excluding those who are also PECASE winners), 2006
• Newberry Library Long-term Fellows, 2006-07
• Pew Scholars in Biomedicine, 2006
• Presidential Early Career Awards for Scientists and Engineers (PECASE), 2006
• Robert Wood Johnson Policy Fellows, 2006-07
• Searle Scholars, 2006
• Sloan Research Fellows, 2006
• US Secretary of Agriculture Honor Awards, 2006
• Woodrow Wilson Fellows, 2006-07</p>

<p>The USNWR PA scores appear to be influenced more by the reputations of graduate programs than by the quality of undergraduate education.</p>

<p>Hence, large state schools like UCB, UMich, UVA, etc. have pretty high PA scores (I'm not saying that state schools can't provide quality undergraduate education, but there's a reason why USNWR takes faculty/student ratio, etc. into its final calculations), and larger Ivies with big graduate programs, such as Cornell and Penn, have higher PA scores than smaller, undergraduate-focused Ivies like Brown and Dartmouth (the PA scores correlate with Cornell and Penn having more top 10/20/30 graduate programs in the major, traditional areas of study than Brown or Dartmouth).</p>

<p>Otoh, PA rankings based on the quality of graduate programs don't seem to be a perfect metric, since some schools have higher PA scores than their graduate programs would suggest.</p>

<p>To show how faculty awards are pretty similar year to year -- the usual suspects.</p>

<p>Top Institutions in Faculty Awards (2005)
Institution / Number of Awards / National Rank / Control Rank / Institutional Control
Harvard University 74 1 1 Private
Johns Hopkins University 63 2 2 Private
Stanford University 46 3 3 Private
Yale University 45 4 4 Private
Columbia University 43 5 5 Private
University of Michigan - Ann Arbor 42 6 1 Public
University of Wisconsin - Madison 42 6 1 Public
Duke University 40 8 6 Private
University of California - Berkeley 40 8 3 Public
Washington University in St. Louis 39 10 7 Private
University of California - San Diego 37 11 4 Public
Northwestern University 36 12 8 Private
University of California - Los Angeles 36 12 5 Public
Massachusetts Institute of Technology 35 14 9 Private
Princeton University 34 15 10 Private
University of California - San Francisco 30 16 6 Public
University of North Carolina - Chapel Hill 30 16 6 Public
University of Pittsburgh - Pittsburgh 30 16 6 Public
University of Washington - Seattle 29 19 9 Public
University of Chicago 28 20 11 Private
University of Pennsylvania 26 21 12 Private
University of Illinois - Urbana-Champaign 26 21 10 Public
University of Texas - Austin 25 23 11 Public
New York University 25 23 13 Private
University of Minnesota - Twin Cities 23 25 12 Public
University of California - Irvine 23 25 12 Public
Cornell University 23 25 14 Private
Vanderbilt University 23 25 14 Private
University of California - Davis 22 29 14 Public
Pennsylvania State University - University Park 21 30 15 Public
University of Illinois - Chicago 20 31 16 Public
University of Maryland - College Park 20 31 16 Public
University of Southern California 19 33 16 Private
University of Florida 19 33 18 Public
University of Colorado - Boulder 19 33 18 Public
Emory University 18 36 17 Private
Ohio State University - Columbus 18 36 20 Public
University of Virginia 18 36 20 Public
Rutgers the State University of NJ - New Brunswick 17 39 22 Public
Stony Brook University 16 40 23 Public
University of Utah 16 40 23 Public
University of Texas SW Medical Center - Dallas 16 40 23 Public
University of Arizona 16 40 23 Public
California Institute of Technology 16 40 18 Private
University of California - Santa Barbara 16 40 23 Public
Oregon Health & Science University 15 46 28 Public
University of Rochester 14 47 19 Private
University of Cincinnati - Cincinnati 14 47 29 Public
North Carolina State University 14 47 29 Public
University of Massachusetts Medical Sch - Worcester 14 47 29 Public
Dartmouth College 14 47 19 Private
Case Western Reserve University 14 47 19 Private
Boston University 14 47 19 Private
Michigan State University 14 47 29 Public
Indiana University - Bloomington 14 47 29 Public</p>

<p>One of the many, many sources of bias in the PA is school size.
xiggi wonders about Harvey Mudd and Berkeley.
Berkeley is the world leader in producing numbers of PhDs (correct me if I am wrong) and their graduates populate academia in mammoth numbers. I would respect the PA a tad more if the raters were not allowed to rank their schools of employment OR the schools where they obtained their final degree. This is the same source of bias that leads to The Ohio State University, Purdue, Indiana, and Minnesota having PAs above Tufts, Wake Forest and Brandeis. Size even trumps northeast regional bias!
Same with rating schools by numbers of "distinguished faculty" by some honor. Giantism wins out.</p>


<p>This is true. I've actually heard college administrators discuss ways they might increase expenditures to improve their USN rankings. One example: raise tuition and recycle the incremental revenue into increased financial aid, keeping net costs to the student constant but showing higher nominal expenditures in the form of increased financial aid, thereby boosting "expenditures per student." Or, raise faculty salaries and throw a lot of lavish parties for the students. </p>

<p>The more general point: this is the only enterprise in which you're rewarded for having the highest cost per unit of production.</p>

<p>No matter how you slice it, the schools you mentioned are pretty weak in areas like major faculty awards. Like it or not, that's a major way faculty measure each other, as all of those are national competitions open to everyone.</p>

<p>Tufts had 9 awards.
Everyone's favorites, Rice and Wake Forest, had a big 5, putting them in with the University of North Dakota.</p>

<p><a href="http://mup.asu.edu/Top200-III/2_2006_top200_faculty.xls%5B/url%5D"&gt;http://mup.asu.edu/Top200-III/2_2006_top200_faculty.xls&lt;/a&gt;&lt;/p>

<p>Thanks UCBChemEGrad! Next time some cocky Duke people on CC claim they are as good as Stanford and better than schools like Northwestern/Cornell, I'll show them those numbers. ;)</p>


<p>I think the faculty in a particular discipline actually know a lot more about the strengths and weaknesses of other faculties in their discipline than do the presidents, provosts, and admissions officers US News surveys in its PA rating. It would cost a bit more, but what if US News were to ask the department chairs to rank other faculties in their discipline, perhaps excluding the school where they are presently employed as well as the schools where they got both their undergraduate and terminal degrees (noting that "old school ties" are often stronger to one's undergraduate alma mater). Then compile the results to rate faculties by discipline, and come up with some formula to integrate the results across disciplines to get an overall rating of faculty strength. </p>
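<p>The survey scheme proposed above (each chair rates other departments in their field, skipping their own employer and alma maters, then the ratings are averaged per school) is simple to sketch. All ballot names and numbers below are hypothetical:</p>

```python
# Minimal sketch of a discipline-based peer survey with conflict-of-interest
# exclusions. Each ballot lists the rater's excluded schools (employer plus
# alma maters) and their 1-5 ratings of other departments. Hypothetical data.
from collections import defaultdict

ballots = [
    ({"Michigan"}, {"Berkeley": 5, "Amherst": 4}),
    ({"Berkeley", "Amherst"}, {"Michigan": 5, "Berkeley": 5}),  # this rater's Berkeley rating is dropped
]

sums, counts = defaultdict(float), defaultdict(int)
for conflicts, ratings in ballots:
    for school, score in ratings.items():
        if school in conflicts:  # exclude employer and alma maters
            continue
        sums[school] += score
        counts[school] += 1

averages = {school: sums[school] / counts[school] for school in sums}
print(averages)  # conflicted ratings never reach the average
```

<p>Integrating such per-discipline averages into one overall faculty score would still need a weighting formula, which is where the judgment calls creep back in.</p>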

<p>This would still be biased in favor of scholarship over classroom teaching, of course, but I maintain there's no way to rate the effectiveness of teaching at another institution, or even within your own institution for that matter; it's non-transparent except to the students and to the individual faculty member, and they're not in a position to make inter-institutional comparisons. Consequently, we're only deluding ourselves if we think the quality of the teaching is reflected in the PA rating, and it shouldn't even be included in the questionnaire. </p>

<p>Scholarship is another matter. Opinions may vary by individual, but generally the worth of scholarship is measured by its quantity and its impact, i.e., how much it is read and cited, how much it is respected by other knowledgeable scholars in the field, how much it is seen as original, path-breaking, and influential. Faculty generally know this if they know their field.</p>

<p>Nor would this methodology need to be confined to major research universities. Economists at Amherst know who the other good economists are, both at other LACs and at RUs. And philosophers at Michigan have a pretty good idea which LACs have good philosophy programs. Consider this description of undergraduate philosophy programs by Brian Leiter, a philosopher and legal scholar formerly at Texas, now at Chicago, who regularly puts out his own ranking of graduate programs in philosophy (blogging as "The Philosophical Gourmet"):</p>

<p>The Philosophical Gourmet Report 2006 - 2008 :: Undergraduate Study</p>

<p>
[quote]
Thanks UCBChemEGrad! Next time some cocky Duke people on CC claim they are as good as Stanford and better than schools like Northwestern/Cornell, I'll show them those numbers.

[/quote]
</p>

<p>Don't forget the no. of big, traditional programs ranked in the top 10/20/30.</p>

<p>"Give 'em a bit more credit, Tom...USNews claims they do adjust salaries for location:" </p>

<p>Sorry, I should have been clearer. US News doesn't regionally adjust the starting salaries of graduates. They don't do it for MBA starting salaries either. Makes ya wonder... if they know enough to do it for professors' salaries, why not graduates' salaries, too? I think the reason is that making these adjustments, although it would make the metric fair, would show more parity between good schools that aren't household names and Ivy-caliber schools. That wouldn't sell as many issues. Just my 2 cents.</p>

<p>
[quote]
I think the faculty in a particular discipline actually know a lot more about the strengths and weaknesses of other faculties in their discipline than do the presidents, provosts, and admissions officers US News surveys in its PA rating. It would cost a bit more, but what if US News were to ask the department chairs to rank other faculties in their discipline, perhaps excluding the school where they are presently employed as well as the schools where they got both their undergraduate and terminal degrees (noting that "old school ties" are often stronger to one's undergraduate alma mater). Then compile the results to rate faculties by discipline, and come up with some formula to integrate the results across disciplines to get an overall rating of faculty strength.

[/quote]
</p>

<p>That would, indeed, represent a massive improvement. However, nothing would make the PA better than expanding its categories AND making the entire survey PUBLIC on a searchable website. To put it simply, I'd like to know who is doling out the scores and to whom!</p>

<p>Since the above is utopian in nature, I would settle for a mini Sarbanes-Oxley derived statement that will require every responder to acknowledge having read and understood the instructions as well as having used commonly acceptable data to prepare their answers before signing their name onto the survey. </p>

<p>I am "afraid" that such an onerous restriction would bring the response rate down to an even more dismal percentage, and that between the deans and provosts who have now rejected the reputation survey (and underscored its negative impact) and the ones who could no longer manipulate in impunity, we'd have a pretty small number for Mr. Morse and his troops.</p>

<p>Xiggi, I take it you aren't going to graduate school at Berkeley?</p>

<p>UCBChemEngineer-
Take a look at this article from 2000, when Caltech was ranked #1 by a huge margin. It highlights a lot of the statistically dubious practices of USNews, and shows how they selectively use standardization, creating a methodology that conforms to a preconceived list rather than a list generated by a neutral method.</p>

<p>"This was done in large part by rejecting a common statistical technique known as standardization and employing an obscure weighting technique in the national universities category. Consider the data from the 1997 book, the last year the numbers for overall expenditures were posted publicly. Caltech spent the most of any college at $74,000 per student per year, Yale spent the fourth-most at $45,000 and Harvard spent the seventh-most at $43,000. According to the U.S. News formula applied in every single category except for national universities, the absolute rates of spending would be compared and Caltech would be credited with a huge 40-percent category advantage over Yale. Under the formula used solely in this category the difference between Caltech and Yale (first place and fourth place) was counted as essentially the same as the difference between Yale and Harvard (fourth place and seventh place) even with the vast difference in absolute spending."</p>

<p>According to sources close to the magazine, a bitter internal struggle broke out when it became clear that Caltech was going to come out on top in the late spring of 1999 after the rankings had been changed to count every category the same way. Fallows' replacement Stephen Smith and new Special Projects Editor Peter Cary were both reportedly shocked to see that, under the new formula Graham had recommended, the conventional wisdom of the meritocracy would be turned upside down, and there were discussions about whether the rankings should be revised to change the startling results. (Morse and Cary both deny this.) Eventually, a decision was made to keep the new formula and U.S. News received a hefty dose of criticism from baffled readers. Morse declined to say how the formula has been changed for the rankings that will be printed on September 4th of this year. But if Caltech's ranking drops and one of the three Ivies recovers its crown, read the small print carefully. Caltech's advantage over the second ranked school last year was an astronomical seven points (more than the difference between #2 and #8). The methodology would have to be monkeyed with substantially to drop Caltech out of the top spot.</p>

<p>"Playing</a> With Numbers" by Nicholas Thompson</p>

<p>Again, if they were fair and honest in their methodology, I think they would highlight parity all over the place, and the usual suspects who are enjoying their hubris at the top of the rankings would be placed more fairly, below some schools like Michigan and U-Texas.</p>

<p>
[quote]
One example: raise tuition and recycle the incremental revenue into increased financial aid, keeping net costs to the student constant but showing higher nominal expenditures in the form of increased financial aid, thereby boosting "expenditures per student."

[/quote]
Actually, this is one of the most efficient things a college can do. In a perfect world, everyone would be paying what they can afford for college. In the current system, the family that makes $200,000 annually and the family that makes $1,000,000 annually both pay the same amount because neither qualifies for financial aid (most likely). However, it's pretty safe to say that the family with the $1,000,000 annual income will find a $50,000 price tag much more affordable.</p>