<p>"This ranking is an apples to oranges to comparison, since not all schools have engineering programs, which skew the rankings in favor of those that do.</p>
<p>Generally schools are ranked on the strength of their arts and sciences faculty. If you’re going to add engineering, why not add Law, Business, Social Work, Divinity, Dental, Nursing, whatever…?"</p>
<p>Michigan is exceptionally strong in all of those areas, save Divinity. :-)</p>
<p>Really? Because the methodology they describe for their reputational survey (the one being discussed here, not their overall university rankings which were released last October) sounds very different from that. Or are you describing something they did in the past? They describe this survey as a brand new exercise. They say they got survey responses from 31,000 academics in 149 countries, each of whom was asked to list (not to rank, just to identify) the 15 top universities worldwide in his or her own discipline. The reputational ranking just reflects the frequency with which each university was listed, aggregated across all academic disciplines. </p>
<p>It sounds pretty straightforward to me, and a more thoughtful approach than U.S. News PA which asks university administrators to rate each of up to 261 institutions; here respondents just identify the institutions they’re familiar with that they hold in highest esteem. </p>
<p>Harvard, the most frequently identified university, was given a rating of 100, and all others were ranked by their frequency of listing, expressed as a percentage of Harvard’s score; so MIT at #2 was identified at 87.2% of Harvard’s rate; Michigan at #12 was identified just about 1/4 as often at Harvard; Duke at #33 was identified about 1/10th as often as Harvard, etc. By the time you get down to #50 schools are registering in the low single-digit percentages of Harvard’s frequency of mention, so schools #50 through #100 are grouped in bands of 10, with almost imperceptible differences between them.</p>
<p>Look , I wouldn’t put too much stock in this or any other college and university ranking, but it’s an interesting exercise. Though the methodology certainly would disadvantage any university that didn’t have an engineering program or a medical school</p>
<p>Respondents were distributed this way: physical sciences 20%, engineering 20%, social science 19%, “clinical subjects” 19%, life sciences 16%, arts & humanities 7%. That’s a pretty STEM-heavy distribution, in my opinion, but whatever.; let’s not get our undies in a bunch about this, people, it’s just a reputational survey, for gosh sakes!</p>
<p>The geographic distribution of respondents was: Americas 44%, Europe 28%, Asia/Pacific/Middle East 25%, Africa 4%, which they say, based on UN data, “reflects the demographics of world scholarship.”</p>
<p>Looking at positions doesn’t paint the whole picture imo. you need to do a year-by-year comparison of the rankings by the points (i.e. compare 2011 to 2012) to get the whole picture.</p>
<p>Let’s look at Berkeley and Stanford for example. In 2012 Stanford is ranked in position 4, which is what berkeley was in 2011. However, within that time, Stanford’s reputation barely increased by (71.5-72.1) what made the shift was that berkeley’s reputation dropped significantly (74.7-71.6) Most of the universities seemed to increase in at least a few prestige points. Berkeley was one of the few ones which dropped. Some claim that this was due to the defunding of public universities. But this wouldn’t explain UCLA jump. UCLA increased drastically in prestige points (22.4-33.8) from 11-12. (more than any other university i believe.) Perhaps is just an anomaly?</p>
<p>either way, Berkeley shouldn’t be too worried considering it’s still one of the elites at the 70p range. which still says a lot considering that princeton doesn’t even crack 40.</p>
<p>But if I understand the 2012 methodology, it’s based on adding an entirely new group of academics to the group whose views are reflected in the 2011 results, adding the new 2012 data to the 2011 data. People who responded to the 2011 survey didn’t get to change their minds; what they said in 2011 stands as a subset of the combined 2012 data. So it doesn’t necessarily mean that anyone thinks UCLA is moving up and Berkeley is moving down; only that when a broader group is consulted, their relative positions change.</p>
<p>Notice also that there were some other wild swings from 2011 to 2012. UMass went from #19 in the 2011 ranking, with a score of 14.2 (ahead of, e.g., Penn and Columbia) to #39 in 2012, with a score of 8.7. Even at #39 it’s ranked way too high. Not to hate on UMass, but that much movement from one year to the next, combined with the obvious overrating of UMass even with the lower 2012 ranking, indicates that this is a highly unreliable survey, not to be taken too seriously in the particulars. But it’s interesting nonetheless.</p>
<p>I assume the over ranking of UMASS has more to do with name confusion with MIT than general unreliability of the survey. Perhaps TIMES changed the mechanics of the survey to make people less likely to mistake UMASS for MIT and that’s what caused the drop.</p>
<p>I confess that if I haven’t taken my pill, I really really like measuring stuff. However, once my meds kick in I experience a mild, involuntary retching response at the mention of “branding” in higher education. You may as well count Google hits (in fact, I think there’s a college ranking for that!) Granted, brand awareness might have some bearing on marriage prospects in upper class, third world social circles. For college choices in our own country, it means zilch whether a Pakistani molecular biologist has ever heard of Bryn Mawr College or Brandeis University.</p>
<p>If you do want to measure the impact of a university on global scholarship, bibliographic citation studies are a better (less subjective, at once more focused and comprehensive) way to do it.</p>
<p>Interestingly, this survey among academics closely correlates with citation studies. </p>
<p>Biobliographics citation studies are not necessarily more objective. They are often criticized for the metrics used which fail to capture key aspects of journals such as longevity, popularity, prestige, and periodicity. Should a citation in a an obscure Hungarian journal be guiven as much weight as a publication in Nature. There is also an increased trend towards near-instant online publication without peer review as opposed to the more traditional journal peer review. The fact that a measure is quantitative as opposed to qualitative does not make it more objective: it is just more measurable.</p>
<p>Same for the reputational survey. God know which academics they surveyed. I remember last year they picked academics who had published 20+ or something along those lines regardless of journal quality. Considering the fact that even 20-30 is quite small. And the ranking does not for the most part reflect bibliographic citations. Its no secret that oxford does poorly in citations relative to other universities. This is just a “who is popular ranking.” .</p>
<p>Thanks to the relentless efforts of some, this forum gets to discuss this utterly irrelevant to undergraduates information. How many high schoolers are there who might care about the University of California in … San Francisco?</p>
<p>This is the same garbage as usual but with a different packaging.</p>
<p>"“That has been the crux of my criticism of PA for ages. It cannot make up its mind about what exactly it’s measuring and is therefore wildly inconsistent.”"</p>
<p>Actually, the PA is very clear about what the responders should evaluate. It is the responders who show an uncanny level of dishonesty, lack of attention to simple instructions, and blatant disregard for the integrity of the exercise. </p>
<p>As long as the surveys remain unpublished in their entirety, they will remain worthy of scorn and disdain.</p>
<p>I’ve always known that Harvard, MIT, Cambridge, Stanford, Oxford, Yale, Berkeley and Princeton are the top 8 universities in the world. What surprises me was JHU wasn’t in the top 12 and UCLA was in the top 10.</p>
<p>^Other than medicine, it’s top in some but not others. Actually, if you go by USN graduate rankings in arts & sciences & engineering, Duke and Northwestern are about the same as JHU overall.</p>
<p>From the ranking methodology, i think we can infer that reputation is heavily associated with research. People are asked what universities are prestigious in their field. Not to mention that UCLA probably has way more top grad programs than duke and northwestern do. So if that’s all reputation’s based on i wouldn’t be surprised that UCLA placed higher than duke and northwestern.</p>
<p>Notice that Caltech doesn’t even crack the top 10 although it’s listed no. 1 in the complete ranking. I imagine this has to do with the fact that it’s so tiny, although they probably publish very high quality research, it’s isn’t much.</p>
<p>I imagine a big reason Caltech did so poorly in the reputation rankings is that it got very few responses in the arts and humanities, clinical sciences, and social sciences categories. The complete rankings are mostly adjusted for per capita effects so that obviously allowed Caltech to do better.</p>
<p>To be honest about it, only an ignorant would say Northwestern isn’t prestigious. It has a top 14 law school, top 10 business school, top medical school, top undergrad and postgrad schools. The same can be said to Duke.</p>