The Everlovin' Undergraduate-Level University Rankings

BobShaw and theloniusmonk, to add to your posts, both of which are very valid: two of the largest components of the US News ranking (financial resources and faculty resources) are easily manipulated, prone to inconsistencies in data reporting, and impossible to evaluate in absolute terms, and yet that is exactly how US News evaluates them. It compares universities that omit graduate student populations larger than their undergraduate populations from their student-to-faculty ratios. It pretends that the financials of a 50,000-student university can be compared directly to those of a 3,000-student university, completely ignoring the most basic principle of economies of scale. It compares financial aid awards at a university that charges an average of $50,000 in tuition to those at one that charges an average of $14,000. It compares faculty salaries at universities where 60% of the faculty are Business or Engineering professors to universities where 60% of the faculty teach humanities and social sciences, or salaries of faculty in expensive urban areas to salaries of faculty in affordable rural or suburban areas, and so on.

As the saying goes, “garbage in, garbage out.”

Yeah, faculty salaries are so misleading that you would need a thesis to explain them. The best professors in business consult on the side, usually in the summer but sometimes during the year as well, meaning they teach one semester and consult (or visit another university) the other. They accept a lower salary to have this arrangement because they know they’ll make it back up in consulting fees.

Engineering and science professors could have a similar arrangement, in fact the good ones will bring in outside money (e.g. NIH, industry partner) to fund research that can benefit undergrads as well, even though conventional thinking is it only goes to the grads.

Yup, and you can bet an equally talented Stanford and Duke professor are not getting paid the same.

That’s exactly the point I was trying to make.

I took the top 40 (or so) national universities from USNews and used the average of the poll results on undergraduate teaching quality in Niche to produce a ranking of teaching below. The questions were about how much effort professors put into teaching, how passionate they are, how much they care about student success, how easy they are to understand, and how helpful the professors are. While I know everything has to be taken with a grain of salt, I found it interesting that the results largely came out where I thought they would. If you look at Ivy League only, the order is Dartmouth, Brown, Princeton, Penn, Harvard, Yale, Columbia, Cornell. That is what I would have guessed with the exception of Penn, which I thought would be lower.

Stanford did quite well at number 5, behind only Dartmouth of the Ivy schools. Rice did really well, which did not surprise me. The UCs didn’t fare well, and neither did Hopkins.

I didn’t include LACs in the list below, but I looked at poll results for some top LACs (Amherst, Swarthmore, Wellesley, Williams, and W&L [I picked W&L because it did well in value-add analysis]) and found that they did better than all of the national universities, with the exception of Wellesley, which rated only behind Wake. Of the top 4 national universities, I note that 3 of them, Wake, W&M, and Dartmouth, are perhaps the closest in character to LACs of the national universities listed.

The differences between some of the adjacent schools were quite small, but the top-to-bottom difference was considerable.

1 Wake
2 Rice
3 W&M
4 Dartmouth
5 Stanford
6 WashU
7 Brandeis
8 Tufts
9 CMU
10 Chicago
11 Brown
12 Notre Dame
13 Princeton
13 MIT
15 Duke
16 Caltech
16 BC
18 Northwestern
19 Penn
19 Vanderbilt
21 USC
22 Emory
22 Tulane
24 UVA
24 Michigan
24 Case Western
27 Georgetown
28 Rochester
29 Harvard
29 NYU
31 UNC
32 Yale
33 Northeastern
34 Columbia
35 Cornell
36 UCSB
37 BU
38 RPI
39 Berkeley
40 UCLA
41 UC Irvine
42 Hopkins
43 Georgia Tech
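The averaging behind the list above can be sketched in a few lines: for each school, take the percent-agreement on each Niche teaching question and average them into a single score, then rank schools by that score. The question names and percentages below are made up for illustration; only the method (a plain mean, then a sort) comes from the post.

```python
# Hypothetical sketch of the ranking method described above.
# All poll numbers here are invented, not real Niche data.

def teaching_score(poll_results):
    """Average the percent-agreement across all poll questions."""
    return sum(poll_results.values()) / len(poll_results)

# Made-up percent-agreement figures for three placeholder schools.
schools = {
    "School A": {"passionate": 96, "caring": 90, "engaging": 88},
    "School B": {"passionate": 93, "caring": 80, "engaging": 75},
    "School C": {"passionate": 97, "caring": 94, "engaging": 91},
}

# Rank schools by their average score, highest first.
ranking = sorted(schools, key=lambda s: teaching_score(schools[s]), reverse=True)
```

With these invented numbers, School C (average 94) ranks first, ahead of School A and School B.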

Very interesting ranking. It will, however, definitely work against large state schools; this could be a ranking where you combine LACs and privates into one ranking and have publics as the other. Not saying you should do it, but that could reveal more. Columbia being that low is a little shocking, though the atmosphere there is very competitive, so maybe the teaching quality contributes to that.

For the top 5 schools in the above teaching quality ranking, the average number of Niche responses was 58 (with a range from 28 to 79 responses.) Evidently, 27 of 28 surveyed W&L students agree that “professors are passionate about the topics they teach”. OK … but what’s their basis for this conclusion? What does “passionate” (“caring”, “engaging”, or “approachable”) even mean?

I checked responses for a few other colleges, more or less at random. The “professors are passionate” statement gets 90% or higher agreement among students responding from NYU, Case Western, UMd-CP, and Frostburg State. It gets 100% agreement among students responding from Dordt College in Sioux City, IA. Reality, or Lake Wobegon effect? How would we know?

Here’s an example of another approach to assessing the quality of student-faculty engagement (and the overall undergraduate learning experience):
http://nsse.indiana.edu/pdf/survey_instruments/2017/NSSE17_Screenshot_US_English.pdf
(However, AFAIK, the NSSE does not publish scored and ranked results of this survey. )

I enjoy some of the banter here in this spirited discussion.

Let me simply add I am not sure how one could rank the best undergraduate programs without distortion from the graduate level Programs.

Perhaps it would be most meaningful for prospective students (that should be the goal) if this was instead several undergraduate lists which were rankings for:

  • business programs
  • STEM Programs
  • humanities programs
  • Performing Arts
  • etc

What this would highlight is that a school like Harvard, which does not have an undergrad biz program, wouldn’t show up in the business program ranking.

The most innovative programs and the ones with the best outcomes - say for STEM would top that list.

In this way, without a doubt, these lists would look quite different from the US News ranks that some here have totally bought into…

Yes, it’s very challenging to rank undergraduate programs.
I’m not sure it becomes less challenging by breaking it down into broad subject areas.

US News already does undergraduate business and engineering program rankings. Rugg’s Recommendations gets more detailed (although it doesn’t rank recommended schools against each other). All these assessments rely on opinion surveys. On what basis would a Harvard professor/dean/president rank, say, Cornell’s undergraduate humanities programs against Penn’s?

If we’re going to use outcome metrics, which ones? How do we measure humanities outcomes?
Not by alumni earnings, I hope.

@tk21769 it’s above my pay grade. I think there should be a non-US News entity to rank in some very top-level breakouts. I have seen the “music” sample page for Rugg and it doesn’t seem to make any sense.

ClarinetDad, in order for a ranking to be viable, it must:

  1. Identify a relevant methodology
  2. Audit data for consistency and accuracy
  3. Make allowances and adjustments for the different types of institutions (size, affiliation, location etc...)

“undergraduate teaching quality in Niche to produce a ranking of teaching below.”

You need to correct for the school’s dropout/graduation rate. Also, comparing STEM universities and liberal arts universities is apples and oranges.

The schools that have 95+% graduation rates… that simply means the school is not difficult, usually low-STEM, and the students are buying a degree.

When you have a combination of high scores, high STEM, and high dropouts you have a difficult college. You will also have lower GPA, lower satisfaction and a desire to blame someone. I’m not saying that a higher dropout rate means it’s a better college, however all these schools have high scoring smart students. If kids are not finishing, it’s because of the workload.

Ranked by Dropout rate

rank / Niche rank / university / dropout % (*** = high-STEM university)

  1. 33 Northeastern 22
  2. 24 Case Western 21 ***
  3. 43 Georgia Tech 20 *** ==== less than 20% dropout rate
  4. 36 UCSB 19
  5. 28 Rochester 16
  6. 38 RPI 15 ***
  7. 37 BU 15 ==== less than 15% dropout rate
  8. 41 UC Irvine 14
  9. 29 NYU 13
  10. 16 Caltech 12 ***
  11. 1 Wake 12
  12. 9 CMU 11 ***
  13. 40 UCLA 10
  14. 31 UNC 10
  15. 24 Michigan 10 ==== less than 10% dropout rate
  16. 7 Brandeis 9
  17. 39 Berkeley 9 ***
  18. 21 USC 9
  18. 22 Emory 9
  19. 8 Tufts 9
  20. 3 W&M 9
  22. 19 Vanderbilt 8
  21. 16 BC 8
  22. 42 Hopkins 8
  23. 10 Chicago 7
  24. 2 Rice 7
  25. 35 Cornell 6
  26. 34 Columbia 6
  27. 27 Georgetown 6
  28. 18 Northwestern 6
  29. 6 WashU 6
  30. 24 UVA 6
  31. 13 MIT 6 ***
  32. 15 Duke 5 ==== less than 5% dropout rate
  33. 19 Penn 4
  34. 13 Princeton 4
  35. 11 Brown 4
  36. 4 Dartmouth 4
  37. 12 Notre Dame 4
  38. 32 Yale 3
  39. 5 Stanford 3
  40. 29 Harvard 2
  41. 22 Tulane 2

I’ve long thought the USNews undergraduate rankings:

  1. Are open to manipulation in almost every category by the institutions, and the manipulations do not typically benefit undergraduate education (e.g. counting graduate faculty who do not teach undergraduates in ratios, emphasizing SAT over holistic admissions, abuse of class ranking, which is only provided in about 30% of cases, etc.)
  2. Have an undergraduate reputation rating which is probably distorted by graduate programs and research reputations and is increasingly perpetuated from prior-year rankings
  3. Result in differences between ranked schools (e.g. 15 and 16) that are too small to be significant but may yet influence applicant behavior by leading them away from going for best fit
  4. Have provided some fodder for the huge increase in cost, because they emphasize resources over efficiency and provide an obvious marker for ambitions for institutional prestige

That said, it is here to stay.

Regarding the Niche results, what I noticed is the most consistently high answer was the passion of the professor for the topic. The widest differences were in how helpful, caring, and understandable the professors were.

@Greymeer , good points. (Just to clarify, I wasn’t really going for how I would rank - I got eviscerated earlier for trying – I was just looking to combine the Niche and USNews views to see what it would tell us.)

I do think there is more nuance in the STEM vs. non-STEM (or more broadly difficult vs. easy) than what you showed. Schools like Cornell, Rice, Hopkins, Michigan, and Princeton are pretty high in STEM (perhaps higher than Berkeley for instance), but not marked as such in your list. Others may not be very far off. Non-STEM study can also be challenging depending on the institution.

You also say high graduation rate = easy and buying a degree. Perhaps in part, but it is also pretty highly correlated with selectivity.

There is also another likely element to the correlation of STEM and teaching ratings. STEM professors, at least at larger research universities, may be more involved in research that distracts from a focus on undergraduate teaching. They may also not be as good at communicating clearly.

https://www.usnews.com/education/blogs/college-rankings-blog/2013/06/18/top-ranked-universities-that-grant-the-most-stem-degrees

I imagine it’s (grad rate) a function mostly of selectivity/quality of students, rigor, difficulty of satisfying distribution requirements, the motivation of those students to graduate, their ability to pay, and the academic support offered by the school. Grading policies – tough to say. That might have a bit to do with it, depending on the school.

Rigor at any school can be a school-wide thing – like a veneer covering all the furniture; or a major-specific thing – some of the furniture is teak, some is pine, some is cork, some is oak…

I personally don’t think the USNews ranking can be manipulated to boost your ranking that easily when you get to the top 20 schools.

Even beyond the top 20, it’s hard to make huge gains across tiers. Northeastern is the only school I know of that has made huge gains. If changing your rank by playing with a few metrics were that easy, there would be wild swings in the rankings each year. One of the things I like about the US News rankings is that they have remarkably stable tiers.

http://publicuniversityhonors.com/2015/06/13/u-s-news-national-university-rankings-2008-present/

Now one can dispute if they measure the right things, but it isn’t easy for Universities to just game a few metrics and make huge gains in the rankings.

The easiest thing to manipulate, and with relatively strong weight, would be to increase the test scores of your admitted class. That’s worth around 8% of the total USNews formula. If those kids are also in the top 10% of their class, that’s another 3% of the formula. So you could improve your performance by becoming less holistic. BUT the schools in the top 20 already admit mostly high-stat kids, so upward progress based solely on admissions changes would be slow.
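The arithmetic behind that claim can be made concrete. If test scores carry 8% of the composite and top-10%-of-class carries 3% (the two weights mentioned above), then even a school that maxes out both components moves its overall score only a little. The component scores below are invented, and lumping the rest of the formula into a single "everything else" bucket is a simplification; only the 8% and 3% weights come from the post.

```python
# Back-of-the-envelope sketch: how much can admissions-stat manipulation
# move a weighted composite? Weights for test scores (8%) and top-10%
# class rank (3%) are from the post; the rest is collapsed into one
# hypothetical bucket, and all component scores are made up.

WEIGHTS = {"test_scores": 0.08, "top_10_pct": 0.03, "everything_else": 0.89}

def composite(scores):
    """Weighted sum of 0-100 component scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

before = composite({"test_scores": 80, "top_10_pct": 70, "everything_else": 75})
after = composite({"test_scores": 100, "top_10_pct": 100, "everything_else": 75})

# Maxing out both admissions metrics gains only
# 0.08 * 20 + 0.03 * 30 = 2.5 points on a 0-100 scale.
gain = after - before
```

This is why upward progress based solely on admissions changes would be slow for schools that already admit mostly high-stat kids: the headroom in those two components is small.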

prezbucky, I think universities can manipulate student to faculty ratios, class size reporting and financial statistics.

I just thought I’d combine dropout rank and niche rank. Maybe this means something…

Ranked by Dropout rate + Niche - “It’s REASONABLY difficult and satisfying?”

rank / dropout rank + Niche rank / university / dropout % (*** = high-STEM university; publics marked “(public)”)

1 11 Wake 12
2 19 W&M 9 (public)

3 21 CMU 11 ***
4 23 Brandeis 9
5 24 Tufts 9
6 26 Caltech 12 ***
6 26 Case Western 21 ***
8 27 Rice 7

9 33 WashU 6
9 33 Rochester 16
11 34 Northeastern 22
12 35 Chicago 7
12 35 USC 9
14 37 Michigan 10 (public)
15 38 BC 8
15 38 Emory 9
15 38 NYU 13
18 39 Dartmouth 4

19 40 MIT 6 ***
19 40 UCSB 19 (public)
21 41 Vanderbilt 8
22 43 UNC 10 (public)
22 43 BU 15
24 45 Stanford 3
24 45 Northwestern 6
26 46 RPI 15 ***
26 46 Brown 4
26 46 Georgia Tech 20 *** (public)
29 47 Notre Dame 4
30 48 Princeton 4
31 49 UC Irvine 14 (public)
31 49 Duke 5

33 51 UVA 6 (public)
34 53 UCLA 10 (public)
35 54 Georgetown 6
35 54 Penn 4
37 55 Berkeley 9 *** (public)

38 61 Columbia 6
39 62 Cornell 6
40 64 Hopkins 8
40 64 Tulane 2

42 71 Harvard 2
43 72 Yale 3
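The combination above is a simple rank-sum: add each school's Niche teaching rank to its dropout-rate rank and re-rank by the total, lower being better. A minimal sketch, using three (school, Niche rank, dropout rank) entries read off the two lists in this thread:

```python
# Rank-sum combination as used in the table above.
# (school, niche_rank, dropout_rank) values are taken from the
# two rankings posted earlier in this thread.
entries = [
    ("Northeastern", 33, 1),
    ("Case Western", 24, 2),
    ("Georgia Tech", 43, 3),
]

# Sum the two ranks and sort ascending (lower total = better combined rank).
combined = sorted((niche + drop, school) for school, niche, drop in entries)
```

Note that Case Western (24 + 2 = 26) comes out ahead of Northeastern (33 + 1 = 34), matching the combined table: a strong dropout rank alone doesn’t overcome a weak Niche rank under this scheme.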

@Alexandre re: 136

Easily, though? To lower S/F ratios they’d have to hire additional faculty or reduce the number of students. Either will hit them in the wallet.

Changing class sizes would require some planning – additional labor.

Bending financial stats would require additional accounting labor (right?) or at least dishonesty.

Meanwhile, it would be fairly easy to admit kids based on stats, I would think. In fact, if they are concentrating less on the qualitative parts of the app and more on the quant parts, it would very likely reduce the amount of time necessary to evaluate an applicant. Numbers? Boom, done.

“Easily, though? To lower S/F ratios they’d have to hire additional faculty or reduce the number of students. Either will hit them in the wallet.”

prezbucky, not at all. Most private universities went from 11:1-14:1 ratios in 1992 to 5:1-7:1 ratios in 1994. They did not hire additional faculty; they simply stopped including graduate students in their calculations.

“Changing class sizes would require some planning – additional labor.”

Again, not really. Universities started flooding their course catalogs with seminars. Hundreds of them. They require virtually no additional effort (or labor) on the part of faculty or the students.

“Bending financial stats would require additional accounting labor (right?) or at least dishonesty.”

Definitely. That is the point of “manipulation”…it is indeed dishonest. But it happens when existential threats lurk about. How else does a university’s financial resources rank leap dozens of spots over a period of 2-3 years?