USNWR ranking methodology: the nuts & bolts... or is it just nuts?

There are definitely some “interesting” GC rankings in the latest from US News, e.g.:

Claremont McKenna (4.5) > Pomona (4.4)
Smith (4.6) > Swarthmore and Amherst (4.5)
Johns Hopkins (4.9) > Brown, Cornell, Dartmouth, Duke and UPenn (4.8)

from:
http://colleges.usnews.rankingsandreviews.com/best-colleges/rankings/national-liberal-arts-colleges/high-school-counselor
http://colleges.usnews.rankingsandreviews.com/best-colleges/rankings/national-universities/high-school-counselor

Don’t want, NEED, to see university rankings done by step 7 of the Onion article (with each variable listed):

[quote]
((Out-of-state tuition)(Number of West African fusion dance troupes)) + (Nobel Prize-winning faculty members - Number of meal plan options)^(Number of nicknames for dining hall)
[/quote]
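Just for laughs, here’s a minimal Python sketch of that step-7 formula; every variable name and number below is made up, obviously.

[code]
# A tongue-in-cheek sketch of the "step 7" joke formula quoted above.
# All inputs are hypothetical; none of this is real USNWR methodology.
def onion_step7_score(out_of_state_tuition, fusion_dance_troupes,
                      nobel_faculty, meal_plan_options, dining_hall_nicknames):
    return (out_of_state_tuition * fusion_dance_troupes
            + (nobel_faculty - meal_plan_options) ** dining_hall_nicknames)

# A fictional school: $52,000 tuition, 3 dance troupes, 10 Nobel laureates,
# 4 meal plans, and 5 nicknames for the dining hall.
print(onion_step7_score(52000, 3, 10, 4, 5))  # 52000*3 + 6**5 = 163776
[/code]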

@hunt there are college counselors who charge big money, mainly to rich families, to help guide their kids and land them at “top” schools.

So these counselors often manipulate the U.S. News list to target a few schools that are “highly ranked” in order to justify their fee. For example, a rich student with a solid profile has a much better shot at Wash U than at an Ivy. And it was ranked higher than Cornell, right? Let’s work on getting your kid in there… Boom.

dfbdfb, the point is not that GCs don’t know anything about colleges. Many of them obviously do know some general things about the schools that are local and/or popular with their students. But what they know is information like what majors are offered, what NCAA division the schools are in, whether or not a school has rolling admissions and ED or EA, and generally how selective admissions are at those schools. However, that has nothing to do with the quality of the schools, which is what the USNWR rankings seek to measure. What I was questioning is the extent to which the opinion of GCs is informed by actual input from past students, which would give them an idea of how well a school does as far as administrative effectiveness, classroom education, and job placement.

@foosondaughter, those differences are so tiny, I’m not sure anyone thinks they matter. In any case, I don’t think what GCs think really matters at all, regardless.

@Hunt My Chinese then-boyfriend at Emory (back when it was #20 instead of an abominable #21) told me he thought the US News rankings accurately ranked schools in the US and used the list to decide where to apply. I knew a couple of students at Emory who also used the list to choose which schools to apply to.

At my current school, ranked 108 by US News, I’ve never even heard mention of US News except by a professor who used it as an example of a previously mainstream printed publication switching to a mostly online format (no mention of the actual rankings, however).

I think the rankings are helpful for getting a general sense of educational quality and reputation, but no way would my child choose one school over another due to a slight difference in ranking. Within a certain range, I think fit is what matters to sensible people. That said, the general public must look at these rankings. After all, they are posted on the front page of MSN and other news sites. Presumably the general public also includes employers, so ranking does matter to us for that reason.

At this point, I think it’s mainly clickbait, like “10 most shocking Miley Cyrus pix.”

Yes to the claim in the first sentence, but the conclusion in the second sentence doesn’t follow from that claim. It is entirely possible—and I think this is the case for at least most of the people on this thread—for people to be very, very interested in who ranks where (and why) in the USNWR list simply because it’s a culturally important phenomenon, not because they’re invested in any particular USNWR ranking number as a personally important social signifier of any sort.

“Here’s a serious question: how do people actually use this list, and other similar lists?”

I like, and have used, the USNWR magazine as a consumer shopping guide for my kids. Short, simple, cheap, with transparent data. More useful to me in building the application list for each of my kids than the narrative college guides the size of a phone book.

I don’t pay too much attention to the summary rating by itself. The components that go into the ratings are mostly solid and useful: side-by-side comparisons of admissions info, freshman retention, grad rates, cost, class sizes, etc. It’s easy to zero in on the things you care about.
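For anyone who likes to tinker, here’s a minimal Python sketch of that kind of side-by-side component comparison; the school labels and every number are invented for illustration, not real USNWR data.

[code]
# Build a small side-by-side comparison table and zero in on a few components.
# All values below are made up purely for illustration.
import pandas as pd

schools = pd.DataFrame(
    {
        "admit_rate_pct":       [18, 25, 30],
        "frosh_retention":      [96, 94, 92],
        "six_yr_grad_rate":     [93, 90, 88],
        "pct_classes_under_20": [60, 55, 50],
    },
    index=["School A", "School B", "School C"],
)

# Keep only the components you care about, sorted by graduation rate.
print(schools[["frosh_retention", "six_yr_grad_rate"]]
      .sort_values("six_yr_grad_rate", ascending=False))
[/code]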

For those of us who have kids who aspire to high-end college educations, there is a need to identify the institutions best able to deliver those educations. Perhaps not to the 0.01% precision level (A is just a touch better than B), but somehow we need to make such identifications.

I suppose that one way to do so would be to closely examine a big chunk of data about Harvard (what they self-publish, plus other sources), take a tour of the campus, interview students and alumni, and then repeat the process for the local flagship U, and see which is better.

But wait, you say, that’s a silly strawman. Everybody knows Harvard is way better than the local flagship U.

Yes, they do. But how do they know that? Because the collective culture has absorbed many, many individual data points over the years and come to the (reasonable) conclusion that Harvard is a lot better (in the academic sense) than the local flagship U. But are those collective-culture data points any more accurate than a statistical methodology such as the one USN implements?

It doesn’t matter too much when making coarse comparisons between schools that either approach would widely separate (Harvard vs. the local flagship U). But what about finer-grained comparisons? Harvard vs. Penn vs. Hopkins vs. Emory vs. Northeastern?

Even if many of us know, in some general sense, that Harvard is the best of those five, and the best or pretty close to it nationally, relatively few (especially OUTSIDE the CC universe) have a great intuitive feel for, say, Hopkins vs. Emory.

I view the USN list as a useful way to form my own lists for our family. We will not visit or research all 3,000 or however many universities there are in the US. We won’t even do it for the 200 or so that make USN’s main ranking list. But given our geographic location and a broad set of other criteria of interest, we will likely pick some number of schools from that USN list (and perhaps other lists and other sources, such as counselor recommendations, college fairs, and the like) for further research, and perhaps visits.

And at some point, as our lists tighten up, we’ll want a sense of which schools are likely reaches, and which schools would likely carry greater academic weight, and yes, prestige, which can matter in various ways post-college. USN’s list is not perfect, but it’s a reasonable shorthand for all this stuff.

We actually liked the Fiske Guide and found it very helpful, but I used the USNWR rankings similarly to you, @northwesty, to help my son come up with a range of schools to apply to and to get some insight into how his high school counselor was leading him.

Once all that was done, however, it came down to what he experienced visiting the individual schools. He never asked about rankings and never cared. A lot of those top-rated LACs that I was so enamored with were pretty underwhelming for him, whereas he found a lot of “mid-tier” publics pretty impressive.

These lists are helpful tools to a point, but they’re absolutely no substitute for doing your due diligence by visiting and thoroughly vetting the specific programs your student is interested in.

My kids loved the Yale Daily News’ Insider’s Guide to Colleges. It’s like Fiske from the students’ perspective: what it’s really like to go there.

So I went and looked up colleges, and found that right now my hyper-college-focused HS junior’s semi-shortlist (still too long, she has 16 on it), which I’m pretty sure she developed without recourse to the USNWR list, includes everything from a couple USNWR top-5s to a couple in the hundred-teens (plus her last-ditch ultimate safety, our state’s flagship, which gets an RNP). And you know what? At all of them (except said ultimate safety, which doesn’t have programs even really related to what she’s most interested in) she’d get a good education. In fact, after visiting most of them with her, I’d suggest that she might actually end up with more possibilities for faculty interaction and meaningful mentored research and such at the lower-ranked end of her pool.

This rush to quantify everything strikes me as silly at best, and possibly damaging. The US still has the best postsecondary system in the world—its dominance is being eroded, yes, but it’s still there. And that system has over 2,000 institutions (even just limiting the count to nonprofits that offer baccalaureates). Really, the top quarter of those are all going to give good educational results, at least as far as they can control things (since the student has to provide, e.g., their own motivation). The best 100 or 200? We’re talking really, really good schools there. In fact, I’d argue that once you get into the top 100 of the US postsecondary system, however you measure that, you’re probably mostly making distinctions without a difference.

However, lists like USNWR’s, where we pretend that there’s a meaningful difference between, say, #23 and #57, create and then reinforce the problematic—to put it mildly—idea that meaningful distinctions can be made between institutions at that fine-grained a level. That, in turn, affects the way institutions plan, who donates how much money, and so on, and, of course, the big one: who decides to apply where. And if the best students decide to apply to #4 rather than #86 because #4 is ranked higher, rather than because it has curricular offerings that look interesting, or faculty who seem interested in actually providing a rich educational experience, or even, yes, a huge climbing wall in the student union, well, then there’s a mismatch between what the student could get out of their experience and what institutions can provide, and that’s IMO a loss for both sides.

Seems like the goal (and market) of the USNWR rankings is to conform to and confirm existing aggregate opinions about college rankings, while filling in the blanks for colleges an individual reader doesn’t already know but would otherwise judge by a ranking-focused opinion.

The Methodology section of the ranking opens with this disclaimer:

So it’s not like people are “inventing” ways to use the ranking.

Overall school rankings may not make much sense for individual students, due to each school having different academic strengths and weaknesses.

Even an undecided student should consider whether a school has strong or acceptable majors in the various subjects (or a broad range of subjects) that s/he may be interested in.

I wonder what percentage of people actually use the rankings this way. There’s definitely at least a sizable minority that absolutely does not use them this way.

I think the USNWR book is easily worth the 99 cents I paid for the online version.

I’d probably pay that or more just for the ranking charts. I find those extremely handy when doing the Vandy vs. Emory vs. Tufts vs. BU vs. Fordham exercise.

You don’t need a guide to tell you about HYPS.

If youse pays your moneys, who cares how it is used? If folks’ primary/sole criterion is preftige, so what?

How is this any different than using a ranking for colleges with best Greek scenes? Or, best food? Best rock wall? Best dorms?

18-year-olds make decisions based on all kinds of factors… why couldn’t the US News rankings be one of them? (Just because you or I might choose differently?)