The shortcomings of the US News rankings are legion, and the methodology employed is fatally flawed. Just a few examples: Annual Giving, weighted at 5%, shortchanges public universities (graduates rationally assume the uni is funded by the state, so why should they give?); Graduation and Retention Rates are weighted at 22.5%, so we should discount Stanford because some of its best and brightest students drop out to invent the 21st century; Reputation, weighted at 22.5%, is determined from the responses of only 39% of the academics solicited, plus the responses of only 9% of the high school college counselors solicited (averaged with the 7% who responded last year, etc.); Student Selectivity is way undervalued at only 10% of 12.5%; and Faculty Resources, weighted at 20%, hurts places such as Berkeley, where some students may have to sit in the 8th row to listen to their Nobel Prize-winning professor.
Because numerical values are attached to them, these metrics, some highly subjective (reputation) and others irrelevant or mis-weighted, masquerade as an objective ranking.
I think the most accurate ranking is the one that students themselves in the aggregate do every year: some admixture of number of applications/admit rate/yield, plus a good dose of common sense.
@Prof99 The admit rate doesn’t necessarily mean anything. UVA’s admit rate (28%) is higher than American’s (25%), but we all know that UVA is better and has higher standards of admission than American does.
@Prof99 To add to that, while it’s not as sexy and probably doesn’t make as good clickbait, a set of rankings that broke the schools out into more useful subcategories would help too: rural/urban, size, academic strengths by subject area, etc.
And most important would be some sort of realistic sticker-price estimate for various income levels by school, or even a “need-aid ranking,” since we all know one school’s need is another school’s flush. I wonder if that is out there: a school ranking/guide that calls out broad-strokes aid metrics, e.g., if gross HH income is $80-90k, need aid is generally X? That sounds like a lot of work for very little return. Probably need a foundation/think tank (or university!) to do something like that.
Pricing schools is as hard as pricing a used car.
Not at all related to USNews, but there should be a more consumer friendly way of pricing colleges.
@Prof99, your statement about annual giving could be applied to a lot of private universities, too. Lots of Harvard alumni don’t give to the school because “with a $36 billion endowment, Harvard doesn’t need” their money.
Please also let us know how you would quantify “common sense” as applied to rankings.
If a rankings site truly cared about adapting to what matters to all students, they would measure many categories and criteria, collect the data, and then allow the student to choose how to weight each category. They could then also judge the flaws of each measure along the way. Based on what you think matters, different schools would come out on top. That isn’t very sexy, though.
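The idea in the post above, measure many categories, then let each student choose the weights, can be sketched in a few lines. The schools, category names, and scores below are all made up for illustration; this is just a minimal model of user-weighted ranking, not any site’s actual methodology.

```python
def rank_schools(scores, weights):
    """Rank schools by a user-weighted composite of category scores.

    scores:  {school: {category: score on a 0-100 scale}}
    weights: {category: relative importance chosen by the student}
    """
    # Normalize the student's weights so they sum to 1.
    total = sum(weights.values())
    normalized = {cat: w / total for cat, w in weights.items()}

    # Composite score = weighted sum of that school's category scores.
    composite = {
        school: sum(cats[cat] * normalized.get(cat, 0) for cat in cats)
        for school, cats in scores.items()
    }
    return sorted(composite.items(), key=lambda kv: kv[1], reverse=True)


# Hypothetical data: two schools, three categories.
scores = {
    "School A": {"grad_rate": 95, "class_size": 70, "net_price": 60},
    "School B": {"grad_rate": 85, "class_size": 90, "net_price": 85},
}

# A student who cares most about cost weights net_price heavily,
# and School B comes out on top; equal weights would favor School A.
print(rank_schools(scores, {"grad_rate": 1, "class_size": 1, "net_price": 3}))
```

With different weights, different schools come out on top, which is exactly the point: the "best" school depends on who is asking.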
@CaliDad2020 - your quotes don’t at all refute what I wrote. USN uses plenty of objective measures, and most of its metrics are quantitative. Literally the heading of its methodology page, which I quoted, says, once again:
Objective measures and opinions. Not sure how you get from that to “USN claims objectivity.”
And once more, @CaliDad2020 , I’m not here to defend USN. I haven’t ever even claimed it’s better!
What’s clear, though, is that whereas USN puts its subjective measures in a separate category from its objective ones, WSJ hides inadequate and non-representative salary data beneath a veneer of “ROI” objectivity, which is extremely misleading.
Since few people attend enough schools to be able to contrast schools’ academic quality, we have to rely on things like average class size, grad rates, awards, and (in USNews’ case) the academic rep survey. While obvious biases will exist, is there a better group of people to ask than college deans?
Regarding admit rate, I don’t think it means a great deal either. More important, IMO, is the quality of the class. If School A has an admit rate of 5% and an average GPA/SAT combo of 3.88/1500, while School B has an admit rate of 7.5% but its admits are at 3.93/1540, tell me which really is the better class, and which school really is harder to get into.
@marvin100 I’m not going to bore everyone by going back and forth on this. We can all see what USNews writes, and many of us have read Morse talking about USNews methodology, so we can draw our own conclusions. And none of us really knows whether the world at large believes USNews is objective, subjective, or a combo of the two.
But I think they overweight peer reputation, and that contributes to static rankings: the same peers rate their peers based on existing reputation, which in turn reinforces that reputation. But most folks reading here know enough to look beyond the USNews rankings anyway.
@prezbucky Oh, there are, like the folks that hire the grads (or, at least, the grad-school deans that accept them). I mean, other schools’ deans don’t interact with the students either. And with a <40% response rate, you’re only getting the ones that even bother to respond.
I follow the USN rankings closely, and have for almost two decades, and I haven’t seen claims that its rankings are objective. I’ve seen (unequivocally true) claims that its rankings include objective metrics, of course, and subjective ones.
I think this is a perfectly reasonable belief, and that reasonable people will have varying opinions about the role that peer reputation should play in college decision making.
Yes, they happily say they use both objective and subjective methodology, but when [their own FAQ page](http://www.usnews.com/education/best-colleges/articles/rankings-faq) goes to great lengths to say that (a) better than three-quarters of the ranking is based on objective criteria (without getting into the mess that is that data, but I digress), and (b) that the remainder is based on expert opinion, well, you can see how it comes off as a claim of objectivity.
Plus, people tend to view numerical scales as objective, even if they aren’t, just as part of how we’re wired—and a numerical scale is precisely what USNWR is producing.
So yes, for those that follow the rankings closely, sure, they’re not properly objective scales. USNWR, however, is more than happy if their primary consumers think their scale is objective, whether it is or not, and isn’t about to disabuse anyone of that perception.
It comes off as claims of 3/4 objectivity. Dunno what other conclusion one could reasonably draw, in fact.
I can’t accept claims of “human nature,” given that we know almost nothing about it (including whether it exists beyond the big three: Chomsky/Pinker’s innate language wiring, Dehaene’s innate math wiring, and Wynn/Bloom’s innate moral wiring).
But while I’ve been following the rankings closely for a long time, I’ve also been working with people who haven’t. In fact, I spend a lot of time in person with parents and students, literally thousands of each over the years, and while I’ve met plenty who put too much stock in rankings, I can’t remember a single one who was under the impression that the rankings are truly objective. The title alone, “Best Colleges,” makes it pretty obvious they aren’t: what worldview would be required to imagine that institutions as complex and disparate as colleges and universities could be objectively ranked as “best”?
But again, I don’t think the USN&WR’s rankings are particularly great. I came here to make a point, and if you read back you’ll notice that nobody has refuted it: the WSJ’s rankings are flawed despite its boasts about including “outcomes,” because those “outcomes,” it turns out, are based on woefully inadequate salary data.
And for those of you who don’t think subjective opinion should play a role in the rankings, well, the WSJ has you covered there, too, as it includes subjective “opinions about the quality of education.”
I just think it’s interesting that the survey that accounts for 22.5% of the ranking has an extremely low response rate and is only reinforced by the previous year’s US News rankings. If I were an educator filling out that survey, I bet I would naturally consult a college ranking to rank the colleges myself.
@marvin100, precision bias is a well-known thing. It’s what people do. And yes, people do it widely enough that it seems reasonable to call it “human nature,” even if we don’t know the actual cognitive processes that underlie such phenomena.
(And if you’re going to allow anecdotes to determine whether people think the USNWR rankings are objective or not, might I offer as counterevidence a good number of the discussions that take place on this very forum? Not that I think that’s good evidence, but it’s as strong as your impressions of what people you worked with might think.)
@HappyAlumnus, my suggested dose of common sense is not quantified and is not supplied by a ranking; it is applied by the consumer, as in UVA is, on balance, a better uni than American (admit rates to the contrary notwithstanding).