2022 USNews Rankings posted

OK, I lied. I was bored, and a few keyboard shortcuts let me grab the peer assessment scores of the top 100 National Universities.

Top 10
1 - MIT
1 - Stanford
1 - Columbia
4 - Princeton
4 - Yale
6 - Harvard
6 - Northwestern
6 - UC - Berkeley
9 - Chicago
9 - Penn
9 - Caltech
9 - Rice

Significant movers - Peer ranking vs. overall ranking:

Top 20 Peer ranked, up 5 or more spots from National rank - Stanford, Berkeley, Rice, Michigan

Top 20, down 5 or more spots - Duke, Vanderbilt

20-50, up 10 or more spots - Georgia Tech, UT-Austin, Wisconsin, UIUC, Ohio State, Washington, Purdue, Virginia Tech

20-50, down 10 or more spots - WashStL, Cornell, Florida, Tufts, Wake Forest, UC-Santa Barbara

50-100, up 25 or more - Colorado, Iowa, Arizona, Oregon, UC-Santa Cruz

50-100, down 25 or more - Rochester, Lehigh, Santa Clara, WPI, Yeshiva

(As I only recorded the top 103, there may be others that fell further, below those with a National ranking of 100+.)
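If anyone wants to reproduce this, here's a rough sketch of the comparison (the CSV name and layout are my own assumption; paste school, peer score, and national rank into a file first):

```python
import csv

# Assumed input layout: one row per school, e.g. "MIT,4.9,2"
# (school, peer assessment score, overall national rank).
with open("usnews_2022.csv") as f:
    rows = [(name, float(score), int(rank))
            for name, score, rank in csv.reader(f)]

# Competition-style ranking: schools tied on peer score share a rank,
# which is why three schools can all be "1" above.
rows.sort(key=lambda r: -r[1])
peer_rank, prev_score, rank = {}, None, 0
for i, (name, score, _) in enumerate(rows, start=1):
    if score != prev_score:
        rank, prev_score = i, score
    peer_rank[name] = rank

# Movement = national rank minus peer rank; positive means peers
# rate the school higher than its overall rank suggests.
movers = sorted(((nat - peer_rank[name], name)
                 for name, _, nat in rows), reverse=True)
for delta, name in movers[:10]:
    print(f"{name}: {delta:+d} spots, peer vs. overall")
```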

7 Likes

Can you do it for LACs?? :pray:

1 Like

Looks like the publics get more love from their peers than USNWR.

Gee, didn’t see that coming :grinning:

Top 10
1 - Williams College
2 - Amherst College
2 - Swarthmore College
4 - Pomona College
4 - Wellesley College
4 - Bowdoin College
7 - Harvey Mudd College
8 - United States Naval Academy
8 - Claremont McKenna College
8 - Carleton College
8 - Middlebury College
8 - United States Military Academy
8 - Davidson College
8 - Grinnell College
8 - Smith College

Significant movers - Peer ranking vs. overall ranking:

Top 20 Peer ranked, up 5 or more spots from National rank - Harvey Mudd (up 21), Davidson, Grinnell, Smith, USAFA, Vassar, Bates, Bryn Mawr

Top 20, down 5 or more spots - none

20-50, up 10 or more spots - Spelman, Reed, Rhodes, St. Olaf, Bard

20-50, down 10 or more spots - Washington & Lee, Hamilton, Berea, Whitman

50-110, up 20 or more - Lewis & Clark, Wheaton College (MA), University of Puget Sound, Hendrix College, Ohio Wesleyan, Southwestern University, St. John’s (MN)

50-110, down 20 or more - Thomas Aquinas, Juniata, Soka U of America, Hillsdale College, Principia (all but one of these down 50+ positions)

5 Likes

I’ve actually filled one of these out. (At least at the professional school level, administrators aren’t the sole recipients of the survey). You’re right - there is some circularity to the peer assessment. But it also reflects on-the-ground knowledge about faculty hiring. Did one school recently lose all of its superstars? Yeah, that’s a bad sign. Has another school suddenly hired a bunch of new professors? Perhaps because it recently received a transformational gift? Give it a better score. Is one school resting on its laurels, fighting with its donors and not hiring? Really bad sign - that’s how a school falls to the bottom of its tier.

So the peer assessment isn’t perfect, but it can capture momentum in either direction. And it reflects information that insiders are aware of but that hasn’t been fully announced or understood by the consumer market.

2 Likes

Flawed as it is, I prefer the USNews ranking to Forbes’ ranking because at least USNews tries to measure, and give weight to, the academic strength/quality of a school – faculty strength, teaching quality, overall academic strength/rep.

Looking through Forbes’ variables, none of them directly measure academic strength.

2 Likes

Just yesterday there was an article about UC Santa Barbara (ranked 28th) reporting that there are not enough beds for students, forcing kids to live out of their cars and in hotels. And “Many Undergrads Looking for Full 12-Unit Class Load Are ‘Entirely Out of Luck’”!! And the students who can’t get enough classes to qualify for a full load lose their financial aid! And if kids can’t even get 12 units, how many of the units they can get do you think are classes they actually want?! “We are again in our annual fall enrollment crisis, as we have been every fall since 2015,” said Jeffery Stopple, Dean of Undergraduate Education. And UCSB isn’t the only one; there are stories like this about UCLA (ranked 20th), for one. How is this school considered the 28th best school in the country?! Come on! This story came across my radar because I live in CA, but surely this nonsense exists in other places too.

6 Likes

Sounds like a Jaguar. Prestigious but in the shop half the time.

5 Likes

According to US News there is a solution to this problem: the UCs can accept fewer students, dropping their acceptance rate, which in turn will make them more desirable, dropping their acceptance rate even more and helping them climb the rankings ladder. But I don’t think that’s going to happen. My daughter’s friends are freshmen at Purdue, and Purdue also didn’t have enough housing for its freshmen this year; they turned office space into living space for some students. One of her friends landed in an OK situation, one did not. And this probably happened because Purdue is up and coming in these national rankings. It’s always been a good school, but now it’s on most California students’ radar too.

I saw pictures of those cubicle-like “rooms”! Crazy man.

2 Likes

For some schools, their housing problems are exacerbated by the fact that they count on a certain percent of students to be studying abroad or otherwise off campus, but those opportunities aren’t available or have too many hurdles, so everyone wants/needs to be on campus.

3 Likes

Also, in effect they have almost two and a half classes’ worth of students new to campus housing: the freshmen, the sophomores (who, at least at the UCs, haven’t even set foot on campus yet), and the transfers. That’s a lot more students to house than normal. This has been an interesting couple of years. Sigh


1 Like

There are dozens of college ranking lists – USNWR, Forbes, WSJ/THE, QS, Niche, Money, Washington Monthly, Princeton Review, Wallet Hub, Kiplinger, Economist, Payscale, etc. Almost all of them select arbitrary weightings, have no validation of accuracy, and produce output with little meaning – which also makes trying to rank the ranking lists a mostly meaningless exercise.

That said, I don’t think USNWR measures most of the listed criteria well. For example, you mentioned teaching strength. USNWR gives a 7% weighting to faculty salary, but salary is not the same as teaching strength. Faculty salary probably correlates more with hiring higher-profile faculty who have excelled at grad/research-type activities than with the best undergraduate teaching. Teaching quality is also going to vary tremendously across majors rather than be consistent throughout the college. It is a difficult criterion to measure. However, there are other related variables that can be more easily measured involving the undergraduate experience in a particular major at a particular college: things an individual student could look for, but not the type of thing that would work in a simple ranking list meant to be meaningful for all students in the country/world.
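To see how much the weighting choice alone matters, here's a toy sketch (the schools, sub-scores, and weights are all invented, not USNWR's actual formula): the same inputs produce different orderings under different weights.

```python
# Invented schools with normalized 0-1 sub-scores.
schools = {
    "Alpha U": {"peer": 0.90, "salary": 0.70, "outcomes": 0.90},
    "Beta U":  {"peer": 0.96, "salary": 0.95, "outcomes": 0.75},
    "Gamma U": {"peer": 0.82, "salary": 0.60, "outcomes": 0.98},
}

def ranking(weights):
    """Rank schools by a weighted sum of their sub-scores."""
    score = lambda s: sum(w * schools[s][k] for k, w in weights.items())
    return sorted(schools, key=score, reverse=True)

# Same data, two arbitrary weight vectors, two different "rankings".
print(ranking({"peer": 0.20, "salary": 0.07, "outcomes": 0.73}))
# -> ['Gamma U', 'Alpha U', 'Beta U']
print(ranking({"peer": 0.50, "salary": 0.30, "outcomes": 0.20}))
# -> ['Beta U', 'Alpha U', 'Gamma U']
```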

It is not possible for any ranking list to capture what is important for a particular student. For example, price is a key variable for most students in the United States. However, price is highly variable from one student to the next due to merit and FA reductions. Most students do not pay sticker price at Harvard; instead there is wide variation, with ~$0 cost to parents for 20% of students and ~$320k for 4 years for others. At certain other colleges, >90% of students get a large and variable discount from sticker price. A single ranking list may consider sticker price or average price, but it cannot capture this wide variation in price for different students. Instead, an individual student needs to consider their own price to rank colleges.
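To make that concrete, a quick sketch with invented numbers: the same three colleges rank differently depending on whether you sort by sticker price, average net price, or one particular family's actual net price.

```python
# Invented numbers: sticker, average net, and what one hypothetical
# family would actually pay per year after FA/merit.
colleges = [
    ("College A", 80_000, 35_000, 12_000),
    ("College B", 62_000, 40_000, 55_000),
    ("College C", 55_000, 30_000, 28_000),
]

for key, label in [(1, "sticker"), (2, "average net"), (3, "your net")]:
    order = [name for name, *_ in sorted(colleges, key=lambda c: c[key])]
    print(f"{label:12} -> {order}")
# sticker      -> ['College C', 'College B', 'College A']
# average net  -> ['College C', 'College A', 'College B']
# your net     -> ['College A', 'College C', 'College B']
```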

It’s a similar idea for most other variables that would be important to particular students. For example, when I was a HS student, I thought a likely major was electrical engineering, so the quality of electrical engineering was critical for me, yet this criterion was typically not considered in ranking lists. The majority of well-ranked colleges either did not offer EE or had subpar EE in my opinion.

Rather than focusing on a college’s number on an arbitrary ranking list, I’d suggest focusing on the criteria that are important to you.

8 Likes

It’s worth noting, however, that many colleges rely heavily on a rotating cast of adjuncts and non-tenured lecturers about whom professors at other institutions can say little. This is unfortunately the case even at elite colleges – approximately 40% of Chicago’s instructors are non-tenure-track, for instance.

3 Likes

Fair point, and I agree that USN ought to reflect the breakdown of TT vs. adjunct faculty and the breakdown of in-class vs. online instruction. And btw, I do think administrations have some sense of whether schools are overstaffing with adjuncts. That’s the kind of thing a lateral candidate will mention (“I want to move to School X because it’s investing in the TT faculty.”)

Peer assessment is far from perfect but it still can convey something meaningful - it tells you whether an institution’s peers think it’s on the rise, going in the wrong direction, or making a lot of claims that are full of hot air.

1 Like

Barnard’s rise of five places, from 22nd to 17th, seems worthy of mention. Among the other Sisters, Mount Holyoke closed a six-place gap with Bryn Mawr for a tie at 30th.

3 Likes

Yes, the sophomores who did COVID-distance-college frosh year are more likely to choose dorm housing than the sophomores who lived in the dorms frosh year and got familiar enough with the local area to choose off-campus housing. True frosh and new transfers would be the same as before, added on top of that. Seniors who were new transfers last year could theoretically be like those sophomores, although transfer students are less likely overall to choose dorm housing (even as new transfers).

1 Like

It sounds like you tried to fill out the survey well, but not everyone does so. For example, the article at https://www.insidehighered.com/news/2009/08/19/reputation-without-rigor mentions one University of Wisconsin administrator whose Peer Assessment ratings of the 262 “national universities” were as follows.

5 = “Distinguished” – University of Wisconsin (school he works for), New School in NYC (school his son attends)
2 = “Adequate” – 259 schools including Harvard, Yale, MIT, etc.
1 = “Marginal” – Arizona State (he said Arizona State was hit hard by the economy)

The administrator filling out the survey said he was only especially knowledgeable about the University of Wisconsin, the New School, and Arizona State, so those were the only 3 he rated differently from his default of “adequate.” He is quoted as saying, “The problem with an overall ranking of colleges is that without set criteria, you don’t know what it means.” Other administrators have claimed they were instructed to mark competitors low in an effort to boost their own college’s ranking over the competition.

However, even when administrators attempt to fill out the survey fairly, how many of the hundreds of colleges on it do they know well enough to give a fair rating? And even when they are familiar with a college, what does a rating of 5 = “distinguished” even mean? I expect vague language like that means different things to different people filling out the form, producing inconsistent ratings even among knowledgeable raters.
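To put a rough number on how much a single default or strategic response can matter, a toy sketch with invented ratings:

```python
from statistics import mean

# Invented example: nine informed raters score a school 4-5 on the
# 1-5 peer assessment scale; a tenth leaves it at a default
# "2 = adequate" (or strategically low-balls it).
informed = [5, 5, 4, 5, 4, 4, 5, 4, 5]
print(round(mean(informed), 2))        # 4.56
print(round(mean(informed + [2]), 2))  # 4.3 -- one response, ~0.25 drop

# With top schools' published peer scores packed within a few tenths
# of a point of each other, a quarter-point swing is easily several
# rank positions.
```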

In theory, you could look at the historical peer assessment ratings and see which colleges are on the rise or decline. However, almost nobody does this, since you have to pay $40 to see the ratings – and after paying the $40 you can only see the current ratings, not the previous ones, so you can’t see how they have changed.

2 Likes

I agree the gaming is awful (I thought there was some sort of algorithm to check for that but maybe I’m naive). Also, I was filling out the survey for a graduate level program - same problems in terms of what counts as a 5 or 4, but I at least was familiar with 3/4 of the schools surveyed.

I’m not sure what the answer is, but I do value some input from those in the know.

But that’s part of the issue: we have no idea who filled out the survey (we only know that 34% of those distributed were completed). Sure, it could have been someone in a high-level position, but I question how many presidents and deans would spend time filling this out. As noted above, many survey recipients hand it to a staffer to complete. For example, an admissions officer (or other far-from-experienced staffer) might have filled out the survey; I just don’t see how that could lead to worthwhile, or even directional, findings.

1 Like