<p>Schools like USC, Vanderbilt, Emory, Dartmouth, etc. are nowhere to be found on international rankings. Which one is more accurate?</p>
<p>You mean ARWU? Neither one’s necessarily “more accurate”; they just weigh different criteria. Rankings in and of themselves aren’t that useful because they’re based on subjective decisions about what matters more; check the underlying data they provide instead.</p>
<p>It all comes down to what you’re looking for in a college.</p>
<p>USNWR rankings tend to have a high correlation with undergraduate admissions selectivity.</p>
<p>International rankings of US schools tend to emphasize PhD programs, since PhD programs are the more affordable way for international students to attend.</p>
<p>Because in the rest of the world the bottom line is academic rigor, not Greek life, resort-like dorms, football/basketball, and kumbaya cr*p:
<a href="http://m.fastcompany.com/3031867/whats-wrong-with-higher-education-ivory-tower-dissects-the-answer">http://m.fastcompany.com/3031867/whats-wrong-with-higher-education-ivory-tower-dissects-the-answer</a></p>
<p>The best question to ask is not, “which one is more accurate?”, but “which one is more appropriate for your needs?” Different rankings use different criteria, which reflect different judgements about what is most important to the ranking’s intended market. </p>
<p>ARWU Criteria
“ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, number of highly cited researchers selected by Thomson Scientific, number of articles published in journals of Nature and Science, number of articles indexed in Science Citation Index - Expanded and Social Sciences Citation Index, and per capita performance with respect to the size of an institution”
(<a href="http://www.shanghairanking.com/aboutarwu.html">http://www.shanghairanking.com/aboutarwu.html</a>)</p>
<p>USNWR Criteria
Academic reputation (which probably reflects many of the same factors ARWU measures), plus:
student selectivity, faculty resources (full-time vs. part-time faculty, student-faculty ratio, average class size), graduation/retention rates, graduation-rate “performance”, financial resources, and alumni giving.
<a href="http://www.usnews.com/education/best-colleges/articles/2013/09/09/best-colleges-ranking-criteria-and-weights">http://www.usnews.com/education/best-colleges/articles/2013/09/09/best-colleges-ranking-criteria-and-weights</a></p>
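<p>To make the weighting idea concrete, here is a minimal sketch of how a USNWR-style composite could be computed. The weights and per-school scores below are illustrative stand-ins, not USNWR’s published figures.</p>

```python
# Hypothetical sketch of a USNWR-style composite: each school gets a
# 0-100 score on each criterion, and the ranking is a weighted sum.
# The weights below are illustrative stand-ins, not USNWR's actual ones.

WEIGHTS = {
    "academic_reputation":   0.225,
    "graduation_retention":  0.225,
    "faculty_resources":     0.20,
    "selectivity":           0.125,
    "financial_resources":   0.10,
    "grad_rate_performance": 0.075,
    "alumni_giving":         0.05,
}  # weights sum to 1.0

def composite(scores):
    """Weighted sum of per-criterion scores (0-100 scale)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

school_a = {k: 90 for k in WEIGHTS}          # strong across the board
school_b = dict({k: 80 for k in WEIGHTS},
                academic_reputation=100)      # reputation alone can't close the gap

ranked = sorted([("A", composite(school_a)), ("B", composite(school_b))],
                key=lambda pair: pair[1], reverse=True)
```

<p>Change any single weight and the ordering of mid-pack schools can flip, which is exactly the subjectivity being discussed in this thread.</p>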
<p>Where are the biggest differences in their rankings of top schools?
US Schools That Make the Top Ten in Both Rankings
Caltech, Chicago, Columbia, Harvard, MIT, Princeton, Stanford, Yale</p>
<p>US Schools That Only Make the ARWU Top Ten
Berkeley, UCLA
Why don’t they make the USNWR top 10? Because they have bigger undergraduate classes, lower admission selectivity, and lower graduation/retention rates than the USNWR top schools (according to the measurements USNWR uses). </p>
<p>Schools That Only Make the USNWR Top Ten
Dartmouth, Duke, UPenn
Why don’t they make the ARWU top 10? Presumably because their alumni and staff win fewer Nobels/Fields Medals, and because their faculty research output isn’t as strong as the top 10 ARWU schools (according to the measurements ARWU uses).
<a href="http://en.wikipedia.org/wiki/List_of_Nobel_laureates_by_university_affiliation">List of Nobel laureates by university affiliation - Wikipedia</a></p>
<p>Neither ranking is based on Greek life, dormitory quality, or sports programs (unless spending on such things has some effect on the US News “financial resources” measurement).</p>
<p>Frankly, rankings are only of minor value. It is perfectly safe to ignore them. Better to pay attention to the SAT/ACT scores of entering freshmen and to explore the majors offered.</p>
<p>For an interesting angle on college rankings, read “The Trouble with College Rankings” in The New Yorker.</p>
<p>Nobody needs college rankings to understand that Princeton is better than, say, Ohio State University. Yet, this does not mean that a person would necessarily choose to attend Princeton instead of Ohio State. There are good reasons to choose Ohio State over Princeton (assuming you got admitted). This has to do with priorities. Rankings have their priorities, and you have yours. If you know what your priorities are then you will make a good choice. Don’t accept the priorities of other people – especially people who rank colleges and universities.</p>
<p>They serve different markets. So, say, in Asia, you drop Dartmouth’s name. “What???” </p>
<p>Berkeley. WOW AMAZING HERE’S A JOB! lol.</p>
<p>An interesting article on what the U of Rochester would have to spend to get into the top 20 of USNWR:<br>
<a href="http://www.slate.com/articles/life/inside_higher_ed/2014/06/ralph_kuncl_s_study_in_research_in_higher_education_explains_the_significance.html">http://www.slate.com/articles/life/inside_higher_ed/2014/06/ralph_kuncl_s_study_in_research_in_higher_education_explains_the_significance.html</a></p>
<p>A lot of it has nothing to do with academic quality.</p>
<p>It’s a simple answer really: different rankings use different criteria to rank the universities. Given different criteria, certain schools will perform better or worse than other schools when you compare rankings. </p>
<p>From what I’ve seen, international rankings place a high value on research. Research output tends to be something bigger and richer universities are better at since they’re able to hire more top faculty and provide them with better facilities.</p>
<p>US News tries to evaluate the ‘best undergraduate education.’ It considers a variety of factors from student-to-faculty ratio to selectivity to academic reputation. Just like research, doing well on these factors also requires a lot of money.</p>
<p>So, for a university to do well on both types of rankings, it needs to spend money on having top faculty and facilities; and it also needs to spend money on ensuring things like student to faculty ratios are low, making sure that students have adequate counseling, etc. Doing both of these requires A LOT of money. So it shouldn’t be surprising that universities that do well on both types of rankings tend to be some of the richest and most prestigious schools in the country. (e.g. institutions like: Harvard, Stanford, Yale, Princeton, etc.)</p>
<p>You asked which one is more accurate. Easy: neither. All ranking systems are garbage. You have received good answers on here, except for GMTplus7’s. But the bottom line is that only what is important to you about a school matters, and that goes beyond academics. Little-known fact: when USNWR was first cooking up this travesty of naming “The Best College”, the initial parameters they came up with didn’t put Harvard and Yale at the top. They never revealed which schools did end up there, but they fiddled with the formula until the schools they knew “should” be at the top were. Real scientific, isn’t it? But let me give you an example of how subjective this is.</p>
<p>Suppose one of the criteria was the percentage of classes taught by a tenure-track professor rather than by grad students or adjuncts. After all, at many of these schools, if you start today you could easily pay $250,000 over the 4 years it takes you to graduate, assuming you can even do it in 4 years. A quarter of a million dollars! Think about that, and don’t you think you deserve to have the best profs possible teaching every class? Now throw in another measure that takes into account the percentage of scheduled lectures these great profs actually give, instead of being off consulting with big corporations or the White House and Congress, or appearing on MSNBC and CNN.</p>
<p>Now substitute those factors for the “peer assessment” that is used currently in USNWR. After all, what does someone hundreds or even thousands of miles away know about what really goes on at Princeton or Columbia or Wake Forest or Purdue or UC-Irvine or Trinity (Gee, do they mean the one in San Antonio or Connecticut? And which Loyola or St. Mary’s was that again?) on a regular basis at the undergraduate level? Not much, that’s for sure. And now they give some of that vote to high school guidance counselors! Really? They all just know it is Harvard, so it must be great.</p>
<p>And let’s give those new factors a pretty strong weighting in the formula, because a lot of the reason you chose Elite U was to get taught by the best, right? Maybe the same 22.5% that peer assessment gets. Guess which schools would plummet in the rankings if that were the case, yet these are just as reasonable as variables as what USNWR uses. I am sure something similar can be said for the world rankings, which have to be even more like comparing apples and lava rocks.</p>
<p>I could write a book on all the problems with the USNWR rankings (and I should!!). But as some on here have said, look at the schools based on what you think you would most like to experience for 4 years, including non-academic factors. College is a life experience, not just a classroom experience. What is the #1 school for you will be different than what is the best school for your classmate 99% of the time, even if you knew you could get in anywhere and pay for it no problem.</p>
<p>Throw the rankings away. Free yourself from the shackles of USNWR tyranny!! </p>
<p>Actually, it is. Or, that is to say, it’s probably about as scientific as the state-of-the-art in data modeling allows.</p>
<p>A typical approach to data modeling (as I understand it anyway) is that you use human experts to represent “ground truth” in whatever you’re trying to model (whether that is grammatical sentences for a software translation system, or the best beer for a manufacturing process). Then you try to isolate sets of features that an automatic process could exploit to get closer and closer to that “ground truth”. For example, would average SAT scores, alone, result in a ranking that is about the same as the “ground truth” ranking of colleges by your supposed experts? If not, you add other features, subtract features, and jigger the weightings in your formula. You keep doing this until the features, weights, and algorithm can produce a recipe for beer, or a ranking of colleges, that looks pretty close to what the experts say the best beers or colleges are. Then you apply the same formula to other data that the experts never considered before, and see if the results look plausible. </p>
<p>If your “experts” are American HS guidance counsellors, you may wind up with a very different data model for colleges than if your “experts” are international university administrators. And the actual ranking will not perfectly match even the judgements of your own experts. At best it gets asymptotically closer … but as it does so, maybe the “expert” taste in colleges (dog food, beer, whatever) is drifting from the original model.</p>
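<p>A toy version of that fitting loop, under the assumption that “ground truth” is just an expert ordering and the model is a weighted sum of two made-up features. All schools, features, and numbers here are invented for illustration.</p>

```python
# Toy sketch of the data-modeling loop described above: start from an
# "expert" ground-truth ordering, then search over weightings of two
# invented features (normalized SAT and graduation rate) until the
# weighted formula reproduces the experts' ranking. All data is made up.

SAT  = {"A": 0.90, "B": 0.95, "C": 0.70, "D": 0.50}
GRAD = {"A": 0.99, "B": 0.90, "C": 0.85, "D": 0.60}
EXPERT_ORDER = ["B", "A", "C", "D"]  # the "ground truth" to approximate

def rank_by(w):
    """Rank schools by w*SAT + (1-w)*GRAD, best first."""
    score = {s: w * SAT[s] + (1 - w) * GRAD[s] for s in SAT}
    return sorted(score, key=score.get, reverse=True)

def agreement(order):
    """Count pairs ordered the same way as the experts (max 6 here)."""
    pos = {s: i for i, s in enumerate(EXPERT_ORDER)}
    n = 0
    for i in range(len(order)):
        for j in range(i + 1, len(order)):
            if pos[order[i]] < pos[order[j]]:
                n += 1
    return n

# "jigger the weightings" until the formula best matches the experts
best_w = max((w / 10 for w in range(11)),
             key=lambda w: agreement(rank_by(w)))
```

<p>Swap in a different expert ordering and the “best” weight changes, which is the point: the formula is tuned to the experts, not discovered from first principles.</p>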
<p>On some world rankings, UT-Austin and UW-Madison are higher than Vanderbilt, Emory, and USC.</p>
<p>@tk21769 - I couldn’t disagree more, there is absolutely nothing scientific about it. Modeling data to meet some predetermined “truth” isn’t science in the least, because science is about objective truths, not subjective polling.</p>
<p>I didn’t read the article from The New Yorker that was mentioned earlier until after I made my post, but read it and see if you still believe what you posted (“The Trouble with College Rankings”). Of course, in your scenario everything depends on that “ground truth”, which to me is a very flawed assumption. Mostly because we are talking about the undergraduate experience here, and most people have no idea what that is like at other schools and cannot reasonably separate it from the research reputation (grad school) of these schools. The collective “wisdom” isn’t always a substitute for the “truth”, whatever that is in a highly subjective area like this.</p>
<p>@fallenchemist It’s basically calibrating the results with “accepted” rankings so that the results- as @tk21769 pointed out- basically become an extension of accepted judgments- i.e., a useful list for anyone who buys into the accepted judgments. There’s nothing scientific about rankings to begin with, of course- they <em>can’t</em> be purely scientific, so calling them “unscientific” is a moot point and not a criticism because any ranking of what’s “better” is inherently a subjective judgment.</p>
<p>Except that’s not what happens in practice. US News tweaks and adjusts its methodology just enough to generate controversial ratings, both to sell magazines and to get universities to try to improve their place within the ranking. There’s no ‘ground truth’ in Chicago being ranked above Stanford, and there’s no ‘ground truth’ in WUSTL being ranked above Berkeley.</p>
<p>As I’ve said before, the subjective, non-scientific use of weighting methodology makes the whole undergraduate ranking a joke. What’s the point of paying attention to the ranking if we call it illegitimate if it doesn’t confirm our preconceived notions? That doesn’t sound very scientific to me.</p>
<p>As a website, US News is best used for peer assessment in their graduate rankings, and for data on particular colleges and universities. It’s pretty useless for telling us what the ‘best colleges’ are imo.</p>
<p>@beyphy Chicago’s not ranked above Stanford; it’s ranked at the same spot, and it’s one of the eight universities in both the ARWU and USNWR top 10.</p>
<p>It’s only “legitimate” insofar as it extends our preconceived notions. Suppose you were asked to make a college ranking yourself: you knew that HYPMS, Rice, Caltech, and Berkeley (just throwing names out there) would be in the top 10, but you don’t have the time to rank <em>every</em> school for your research. If you do what USNWR did (in theory), you should be able to perform your research more efficiently by ranking the other colleges in terms of how well they fit your preconceived notions (in this case, how close they are to HYPMS + RCB). It’s a useful tool, but you’re right in that it’s unscientific and subjective. It’s definitely overvalued, but if you find yourself in the segment of the population that agrees with USNWR’s top 10, you probably will find the remainder of their rankings useful and acceptable.</p>
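<p>A rough sketch of that “extend your preconceived top schools” idea: score every other school by how close its feature profile sits to the average profile of the schools you already believe belong on top. The schools, features, and numbers below are purely hypothetical.</p>

```python
# Sketch of ranking-by-closeness to a preconceived elite set.
# Each school has an invented (endowment index, selectivity index) profile.

ANCHORS = {            # schools assumed to belong in the top tier
    "Harvard":  (1.00, 0.95),
    "Stanford": (0.95, 0.96),
    "MIT":      (0.80, 0.97),
}
OTHERS = {             # schools to be ranked by similarity to the anchors
    "School X": (0.85, 0.90),
    "School Y": (0.30, 0.50),
    "School Z": (0.60, 0.75),
}

# centroid (average profile) of the anchor schools
cx = sum(v[0] for v in ANCHORS.values()) / len(ANCHORS)
cy = sum(v[1] for v in ANCHORS.values()) / len(ANCHORS)

def distance(profile):
    """Euclidean distance from the anchor centroid (smaller = 'more elite')."""
    return ((profile[0] - cx) ** 2 + (profile[1] - cy) ** 2) ** 0.5

ranking = sorted(OTHERS, key=lambda s: distance(OTHERS[s]))
```

<p>Note that the whole output is determined by which schools you put in the anchor set, which is exactly the “extension of preconceived notions” being described.</p>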
<p>@dividerofzero - Yes, I understand that they could never be scientific by definition. Which is one reason I disagree with @tk21769 when he says “Actually, it is (scientific). Or, that is to say, it’s probably about as scientific as the state-of-the-art in data modeling allows.” The problem is that USNWR acts like it is some kind of scientific finding, and the public buys into that far too often. I mean many admins and alums sweat bullets over a few spots difference in the ranking position of their school. But in the end it is as you say, just a distorted echo of what the common perception of the general public is, so what use is it really? I completely disagree with your last statement about the way they extend the formula to other schools being acceptable, even if you agree with their top 10. They get the “top schools” to line up as the public expects they would, and then extend the formula that got them there to hundreds of other schools. But as the article in the New Yorker says quite correctly, how do you remotely compare Yeshiva with Penn State?? It’s junk, pure and simple.</p>
<p>BTW, most people wouldn’t put Rice in the top 10 if you are just going off general public perception. Maybe or maybe not Berkeley. That is most likely why neither shows up in the top 10 for USNWR.</p>
<p>@fallenchemist Yeah, those would be the 7 schools I would start the top 10 with. Definitely agree when you say USNWR is overrated. I also hate the focus on general rankings; some schools- especially public flagships- tend to vary a lot between fields- UT-Austin, for example, is a ridiculously great engineering school but not in the top 10 overall.</p>
<p>@dividerofzero - That is interesting, because in a million years I wouldn’t put Berkeley in the top 10 for undergrad. Lots of their grad programs, absolutely. ucbalumnus is mad at me now!! Anyway, it just goes to show how perceptions and valuing of certain aspects of a school can vary so widely. Of course, it is really unfair of me to even phrase it that way, because I would steadfastly refuse to rank schools for the undergraduate experience on principle alone. Not for a million dollars (OK, totally lying in that last line, but I wouldn’t mean it).</p>
<p>Pretty easy. The foreigners who developed the ARWU and THES had to come up with a methodology that would satisfy their readership. They developed a hodgepodge of dubious elements that would list a few big names and still present some of the international schools in a decent place. The result is a ranking that has almost NOTHING to do with … undergraduate education and the concerns of THIS forum. The ARWU and THES are rankings of graduate programs culled from a pre-established list of schools; they ignore baccalaureate colleges, and hence the quality of undergraduate education. </p>
<p>All in all, the international rankings only please the fans of schools that are pegged differently by the USNews, including some awful academic factories where UGs are ignored and poorly educated. People who value an undergraduate education should look elsewhere than at what the pseudo-scientists in China and the UK concocted with glee. </p>