<p>With the very public split of Times Higher Education from Quacquarelli Symonds, the methodologies of these rankings may be more debated than ever. The differing methodologies have led to some wild fluctuations in the rankings of individual schools. Out of curiosity, to see approximately how schools fared across a range of international rankings, I took the average ranking for the top 100 schools in each of five different methodologies. A cast of usual suspects appeared that typically did well (top 100-150) in all five ranking methodologies, suggesting that there is a top rung of internationally recognized institutions. None of these methodologies focuses exclusively on undergraduate training, though some do take it into account. These rankings, and their average, can be read as a snapshot of a school's overall international reputation, and they provide views from a variety of locations around the world. The averaged methodologies, each equally weighted, include:</p>
<p>A school had to appear in at least four of the five ranking methodologies to be included in the average, which excluded some graduate-only universities such as UCSF.</p>
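<p>As an illustration (not my actual spreadsheet), the inclusion rule and averaging can be sketched in Python; the threshold follows the rule above, but the ranks in the example are invented:</p>

```python
# Sketch of the averaging rule: a school must appear in at least
# four of the five rankings to receive an average at all.
from statistics import mean

MIN_APPEARANCES = 4  # inclusion threshold described above

def average_rank(ranks_by_source):
    """ranks_by_source maps ranking name -> rank, or None if unranked."""
    present = [r for r in ranks_by_source.values() if r is not None]
    if len(present) < MIN_APPEARANCES:
        return None  # excluded, e.g. graduate-only schools like UCSF
    return mean(present)

# Invented example ranks:
print(average_rank({"THE-QS": 5, "ARWU": 7, "HEEACT": None,
                    "Webometrics": 12, "RatER": 9}))  # → 8.25
```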
<p>Here are the results of the average ranking for the top 50 schools. I did this simply out of curiosity, and thought I'd share the results.</p>
<p>A more popular world university rankings source than RatER is HEEACT from Taiwan. RatER just factors in the rankings of four popular rankings sources (THE-QS, ARWU, HEEACT, and Webometrics) and supplements it with “expert” opinion.</p>
<p>RatER Global does not just use the other rankings services. In fact, it is quite biased toward Russian institutions that do not show up in the other rankings. It uses its own survey, Google search results, patent volume, etc., and its results are typically the most different from the others.</p>
<p>HEEACT doesn’t appear to have been updated in some time, and their rankings are not generally accessible (click on your own link and see). When it has been available, it measures similar outputs to the newer Australian one, namely research output via citation-index measures. I would have included HEEACT if I could have actually accessed it over the last couple of days; instead I replaced it with the Australian RPI ranking. SCImago is another ranking of scientific output (although not restricted to educational institutions) that could be added, as it is at least accessible.</p>
<p>Webometrics is only a ranking of a university’s presence on the web and is a self-described effort to promote open access. It has nothing to do with overall institutional quality, nor does it claim to; it represents a single metric: web presence.</p>
<p>Sorry, I misread Wikipedia regarding RatER. -.-</p>
<p>HEEACT is up to date; their rankings from 2007 to 2010 are available on the web. I have never had problems with their website and accessed it successfully yesterday evening and this afternoon. I clicked on my link and am on their webpage right now. If you tried the link and still could not access the site, it may be an issue with your computer’s settings or the country you live in.</p>
<p>I’ve found out that HEEACT’s website issue is browser-specific: it works in Firefox but not Safari. They’re likely not using web standards, which is a shame. I am glad they’re updating. My goal was to average a few different points of view, so I’ll play around with their rankings and see how they fit in.</p>
<p>Thanks for letting me know that it worked for you, because otherwise I would have assumed it was just continuously off-line.</p>
<p>I understand that some people want to rank universities, and thus there has to be a method to ‘measure’ them, but you always have to keep in mind that each method has its shortcomings. </p>
<p>I had to laugh when I saw this list. I went to ETH Zürich in Switzerland, and I’ve visited MANY colleges here in the US; I have alumni of many different colleges reporting to me, and my son is at Georgia Tech. I believe I can judge somewhat objectively.</p>
<p>This ranking is based on published articles; the funding system in the US almost ‘requires’ professors to publish their results quickly, sometimes/often even as ‘preliminary communications’. This system is NOT the same all over the world. At ETH Zürich, for example, my professor had grants from the National Research Foundation and was under no pressure to publish … and the same was true of all the other professors at ETH!</p>
<p>The consequence? Where his peers would publish 3 or 4 papers, my professor would publish 1, possibly 2. As a result, ETH Zürich (and every other university without a ‘publish or perish’ mentality) is ranked significantly below its true value.</p>
<p>I’m sure it is obvious to most, but the title of this and similar threads really should be: “Ranking of University Ph.D.-devoted Faculties”. While the strength of a faculty is critical to Ph.D. research, it is much less so to undergraduate students.</p>
<p>Further to that, there is a logical irrelevance to whole-university rankings at the Ph.D. level. Who would pick a Ph.D. program in Linguistics based on the strength of the faculty in Physics? They wouldn’t even pick based on the strength of the faculty in Linguistics… they’d pick based on THEIR advisor, not an aggregate of all the advisors.</p>
<p>So what we’ve got here is a Ph.D. Faculty ranking that is not terribly relevant to undergraduates, and completely irrelevant to Ph.D. candidates.</p>
<p>Ok, here are the top 50 average international rankings, sorted by average and then standard deviation, with the HEEACT rankings now included. The list represents the average of six different international rankings from five countries on three continents. I also eliminated any outlier ranks using the quartile method and double-checked them with Grubbs’ test for outliers. With the addition of HEEACT and the elimination of outliers, this is likely a better list than the original one posted above. Enjoy and have at it:</p>
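<p>For the curious, the quartile step can be sketched in Python (a minimal illustration under my own assumptions, not my actual spreadsheet; the Grubbs’ test cross-check is omitted, and the ranks shown are invented):</p>

```python
# Quartile (IQR) method: drop any rank outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]
# before averaging. Survivors were then double-checked with Grubbs' test,
# which is not reproduced here.
from statistics import mean, quantiles

def drop_iqr_outliers(ranks):
    """Return the ranks that fall inside the 1.5 * IQR fences."""
    # 'inclusive' interpolation keeps the fences tight for small samples
    q1, _, q3 = quantiles(ranks, n=4, method="inclusive")
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [r for r in ranks if lo <= r <= hi]

# Invented example: one ranking wildly disagrees with the rest.
ranks = [3, 5, 4, 6, 120]
kept = drop_iqr_outliers(ranks)
print(kept, mean(kept))  # → [3, 5, 4, 6] 4.5
```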
<p>These sorts of rankings are very misleading. I mean, very broadly speaking, is it better to go to a school ranked 1-10 versus 50-60? Yes, but that’s about the extent of it.</p>