Yep - for the SLACs, I remember our kid crossed off maybe 7 or 8 of the top 25 simply because a 50/50 (or 60/40) co-ed college experience was non-negotiable. Every student has to make their own list based on what is important. And visit those campuses if you can!!
The rankings are out there because that’s what the public wants. There are rank lists for everything (e.g. cars, hospitals, doctors, restaurants, household products, beaches, vacation destinations, towns to live in, places to retire, jobs, movies, books). Who hasn’t looked at rottentomatoes, consumer reports, or yelp and sorted by ranking/rating to figure out where to go, what to buy, and what to watch? There is obviously some utility. Otherwise, no one would pay attention.
Which would make sense, because I think Emory and Vandy (and Rice, CMU, Notre Dame, Georgetown, and Wash U) are academic peers.
Yep – that’s a head-scratcher. As is Reed ranked in the 60s or 70s, but then they revile the ranking and don’t participate.
And Oberlin at like 49th – ridiculous.
No, the 2013 version also relies on the data from HS class of 2000. So it may be the same paper.
It’s interesting to see Stony Brook ranked 58 and Binghamton ranked 73. Binghamton is often looked at on here as the state flagship. Stony Brook has always had a solid academic reputation, especially in science and tech. I would consider them peer schools.
Well, I think the choice of source is already questionable. I’m not convinced that university presidents (who spend their time with legislators, architects, and major-donor circles), or Deans of Admission (= Directors of Sales & Marketing), would even be the most authoritative on their own university’s academic excellence.
They might not even be academics, and may never have taught at their current employer – if they’ve taught anywhere at all!
I’m not gonna ask the guys in the showroom about the reliability of their cars (whichever marque they just started selling this year). I’ll talk to the service manager.
It’s not just USN’s ranking. Reed ranks #108 in Niche, #326 in Forbes, and isn’t ranked at all in WSJ (which goes to #400 – there’s a couple other weird absences there too, like Harvey Mudd).
And then there are its raw stats. It admits over 40% of applicants and has a very poor yield rate in the mid-teens (compare that with a peer like Bowdoin in the mid-60s). Yield has consistently trended down from a peak of 31% a dozen years ago, and its acceptance rate has hovered more or less flat for the last 15 years while many peers have at least halved their admit rates over the same time frame.
Besides the poor response rate to the survey and the question of whether the people who answer it are authoritative, who is to say they aren’t gaming this system like so many attempt to do with every other aspect of the rankings? It’s hard to imagine that NE isn’t strategic about which schools it rates up or down. Has anyone read whether USN has some way to compensate for manipulation?
Yes, Reed is few students’ first choice school.
I assume they are depending on averages across hundreds (or dozens) of responses from different schools compensating for any aberrations caused by one school trying to manipulate the peer rankings.
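The dilution effect here is just arithmetic. A toy sketch (with made-up scores and rater counts, not USN’s actual survey data or methodology) shows how little one strategically skewed response moves an average over many raters:

```python
# Hypothetical illustration: one rival lowballing a peer in a survey
# barely shifts the mean once enough honest responses are averaged in.
# All numbers below are invented for the example.

honest_scores = [3.8] * 99           # 99 honest raters score a peer 3.8/5
manipulated = honest_scores + [1.0]  # one rater strategically lowballs

honest_avg = sum(honest_scores) / len(honest_scores)
rigged_avg = sum(manipulated) / len(manipulated)

print(round(honest_avg, 3))  # 3.8
print(round(rigged_avg, 3))  # 3.772 -- the outlier moves the mean by <0.03
```

Of course, this only holds if manipulation is rare and uncoordinated; if many schools systematically up-rate or down-rate the same targets, averaging won’t save you.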
Both schools made big jumps, IIRC they were in the 80s last year. The “gap” between them is likely largely explained by Pell rates: SBU is 39%, among the leaders nationally, whereas Bing is 26%. I suspect SBU probably fares better in the faculty research categories, its graduate programs are on a totally different level from Bing’s.
That explains Columbia - HYP must have intentionally given the #7 and #9 twins & triplets extra high scores.
Maybe those things matter to some publications’ ranking formulas, but if we’re talking quality of teaching/academics, Reed belongs much higher IMO.
Even newly minted UC Merced is ahead of Tulane.
Agreed, but this is why these sorts of generic rankings are really pointless.
Like, as we discussed in the context of the WSJ rankings, I completely believe that Florida is a great social mobility investment for kids from lower-income Florida households.
However, the relevance of that observation to non-Floridians is, shall we say, somewhat questionable.
But I don’t want to “punish” Florida for doing a great job at its actual mission either.
A ranking scheme means I have to choose: which colleges should I punish for doing a really good job at what they are doing?
And to me, the obvious answer is that all those colleges are in fact doing a really good job – for certain kids and parents.
This isn’t what’s actually happening, though?
Schools are being “penalized” for having too few Pell-eligible students, who make up 35% of college attendees nationally. But they’re particularly penalized for not graduating such students at rates comparable to their non-Pell students – which doesn’t seem like an incredibly high bar, given the pre-screening that occurs and the resources available.
That’s a really fun visual. I never really thought of college academic rankings in the context of the athletic conferences.
In our feederish HS, a lot of the (not recruited athlete) kids seem to gravitate toward what I think of as certain respective branches on the college family tree. Obviously each kid can see the tree as they like, and it is very common for certain kids to, say, mix together some LACs and some medium-sized universities on their lists.
In this framework, rankings really don’t matter much. I mean, maybe the kid will in some way be influenced when considering the colleges on their tree branch (although even then, I think it becomes more a matter of the data behind the rankings than the rankings per se). But entire large categories of LACs and universities simply aren’t considered. Not because they are “bad” – they just aren’t on the right branch of the tree.