Offhand, it feels to me like that measure is subject to possibly extreme compression among the more selective schools, making exact rankings on this measure relatively meaningless at that end.
In contrast, this might get more meaningful when you start looking at colleges where doing better than expected at graduating students could be a serious consideration.
So let's say Tufts' raw graduation rate is 94%, but based on the demographic profile of its students (say it has a high proportion of students from mostly upper-middle to upper-class, two-parent families with both parents holding college or higher degrees), it should have been closer to 97%. It would be dinged relative to another school with a 93% rate but whose demographics suggest 89%.
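If it helps, here is that arithmetic as a tiny Python sketch, using the made-up numbers above (the real models regress on many demographic inputs; this only shows the actual-minus-predicted step):

```python
# Hypothetical illustration: on this measure, a school's "performance" is its
# actual graduation rate minus the rate predicted from its demographics.
schools = {
    "Tufts (hypothetical)": {"actual": 0.94, "predicted": 0.97},
    "Other school (hypothetical)": {"actual": 0.93, "predicted": 0.89},
}

for name, rates in schools.items():
    performance = rates["actual"] - rates["predicted"]
    print(f"{name}: {performance:+.0%} vs. expectation")

# Tufts (hypothetical): -3% vs. expectation
# Other school (hypothetical): +4% vs. expectation
```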
I know the person who is in charge of publications for a well-known brand that does college publications and lists. He told me colleges overtly try to bribe him all the time. In his words, he's lost track of the number of new cars he's been offered.
The "outcome" portion of their methodology is 70%, so high average salaries versus low net prices are going to dominate the results over everything else. The only reason those other factors come into play at all is that there are a lot of colleges with relatively similar economic outcomes based on their data.
I wouldn't count on it. They previewed changes to their new methodology a few months ago, and I don't recall seeing that change mentioned. The one change I recall is that they will no longer factor class size into their ranking.
It all depends on how one defines important. Important as in whether it objectively should matter? Probably not. Important in that it motivates major strategic and tactical actions by colleges? Then the USNWR ranking is unfortunately very important, whether it should be or not. Many colleges are making major decisions about how to spend their capital, what programs to offer, and what to emphasize in admissions at least partially on the basis of how they think it will affect them in the USNWR methodology. Some of these colleges are open to bribing, cheating, and other bad behavior because they deem it important. And they do all this because they believe that a meaningful subset of the student and parent population thinks it's important.
Yes, but they are transparent about that. They are the first to note that there is no ranking that makes sense for everyone. They have consciously differentiated themselves with a heavy emphasis on economic outcomes, under the premise that college is a form of investment in one's economic future. For those with the desire and ability to invest in college without a critical need for it to pay off with a higher income than if they hadn't gone to that college, this is the wrong list.
Interesting. Does it make sense that two colleges in the same location in the same consortium (Pomona and Claremont McKenna) differ so much in terms of student experience and outcomes?
How did Pitzer, Scripps, and Harvey Mudd fare in comparison? (I canceled my WSJ subscription in a fit of pique and refuse to renew.)
If a national ranking list requires the reader to study the methodology in detail in order to gain a basic "understanding" of many of the results, I think that is a problem.
So, looking at the 2023 Washington Monthly rankings (which let users download the full data set), Tufts was expected to have a 94% graduation rate, and it had a 94% graduation rate. According to the Washington Monthly data, Tufts actually performed better than many other Top X schools in terms of how its Pell graduation rate compared to its regular rate. (Data sorted by graduation rate rank.)
| Rank | Name | 8-year grad rate | Grad rate rank | Predicted grad rate (based on % Pell recipients, incoming SATs, etc.) | Grad rate performance rank | Pell/non-Pell grad gap | Pell grad gap rank |
|---|---|---|---|---|---|---|---|
| 1 | Harvard University (MA) | 98% | 1 | 88% | 21 | -2% | 41 |
| 5 | Princeton University (NJ) | 98% | 2 | 94% | 136 | -5% | 119 |
| 8 | Yale University (CT) | 98% | 3 | 96% | 173 | -3% | 68 |
| 4 | University of Pennsylvania (PA) | 97% | 4 | 94% | 153 | -3% | 67 |
| 2 | Stanford University (CA) | 96% | 5 | 95% | 176 | -3% | 70 |
| 12 | University of Notre Dame (IN) | 96% | 6 | 93% | 131 | -4% | 92 |
| 43 | Brown University (RI) | 96% | 7 | 94% | 172 | -4% | 86 |
| 28 | Dartmouth College (NH) | 96% | 8 | 92% | 126 | -4% | 106 |
| 3 | MA Institute of Technology (MA) | 96% | 9 | 99% | 324 | -4% | 89 |
| 6 | Duke University (NC) | 95% | 10 | 97% | 262 | -3% | 58 |
| 10 | Cornell University (NY) | 95% | 11 | 96% | 256 | -4% | 85 |
| 31 | Northwestern University (IL) | 95% | 12 | 94% | 209 | -3% | 63 |
| 32 | University of Chicago (IL) | 94% | 13 | 98% | 332 | -2% | 47 |
| 15 | Georgetown University (DC) | 94% | 14 | 90% | 115 | -3% | 78 |
| 42 | University of Virginia (VA)* | 94% | 15 | 90% | 127 | -3% | 84 |
| 99 | Tufts University (MA) | 94% | 16 | 94% | 222 | 0% | 15 |
| 95 | Rice University (TX) | 93% | 17 | 97% | 334 | -3% | 76 |
| 9 | University of CA–Berkeley (CA)* | 93% | 18 | 95% | 268 | -5% | 130 |
| 27 | Washington Univ. in St. Louis (MO) | 93% | 19 | 93% | 202 | -1% | 24 |
| 35 | California Institute of Tech. (CA) | 93% | 20 | 100% | 385 | 1% | 14 |
| 13 | Johns Hopkins University (MD) | 93% | 21 | 96% | 292 | -2% | 32 |
| 7 | Columbia Univ. in the City of NY (NY) | 93% | 22 | 90% | 141 | -2% | 44 |
| 18 | Vanderbilt University (TN) | 93% | 23 | 93% | 232 | -3% | 77 |
| 23 | University of MI–Ann Arbor (MI)* | 93% | 24 | 90% | 143 | -6% | 140 |
| 47 | University of Southern CA (CA) | 93% | 25 | 92% | 220 | -2% | 46 |
| 16 | University of California–LA (CA)* | 93% | 26 | 93% | 246 | -5% | 117 |
| 50 | Emory University (GA) | 92% | 27 | 93% | 248 | -2% | 35 |
| 41 | Boston College (MA) | 92% | 28 | 90% | 178 | -2% | 36 |
| 69 | William & Mary (VA)* | 92% | 29 | 88% | 119 | -1% | 29 |
But when we reorder the list based on the graduation rate performance rank (i.e., percentage graduating vs. percentage expected to graduate), the list looks a lot different, and we see some of the surprise schools, like NJIT, from the WSJ's list.
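For anyone who wants to replicate the reordering from the downloaded data set, a rough pandas sketch (the file and column names below are placeholders; the actual Washington Monthly export may label them differently):

```python
import pandas as pd

# Hypothetical file and column names; adjust to match the actual
# Washington Monthly export.
df = pd.read_csv("washington_monthly_2023.csv")

# The table above: sorted by raw 8-year graduation rate rank.
by_grad_rate = df.sort_values("grad_rate_rank")

# The reordering described here: sorted by how far each school's actual
# graduation rate exceeds its demographically predicted rate.
by_performance = df.sort_values("grad_rate_performance_rank")

print(by_performance[["name", "grad_rate", "predicted_grad_rate"]].head(20))
```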
So it looks like what pulled Pomona down was actual salary vs. expected salary, and the Pomona students surveyed rated the career-prep resources lower than the Claremont McKenna students did.
I don't understand. Although Pomona seems to have a lower actual-vs.-expected salary comparison, its net price is less than half of Claremont McKenna's, and its Years to Pay Off is shorter: 9 months vs. 1 year 1 month. It also has a higher graduation rate. That accounts for a 40-place difference in ranking?
The weighting of the salary impact is 33%, years to pay off 17%, and grad rate 20%. Also, the raw score for Claremont McKenna was 88 vs. 78.8 for Pomona, and the raw-score difference between #1 and #20 is only 8.2, so there is a high degree of raw-score compression at the top.
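For a back-of-the-envelope feel for how that plays out, here is a sketch using the weights quoted above; the subscores are entirely invented for illustration and are not the WSJ's actual inputs:

```python
# Weights as stated above (part of the WSJ's outcomes portion). Subscores are
# made up purely to show how a heavily weighted salary component can open up
# a raw-score gap despite better numbers elsewhere.
WEIGHTS = {"salary_impact": 0.33, "years_to_pay_off": 0.17, "grad_rate": 0.20}

def weighted_score(subscores: dict) -> float:
    """Weighted sum of 0-100 subscores over these outcome components only."""
    return sum(WEIGHTS[k] * v for k, v in subscores.items())

# Hypothetical subscores on a 0-100 scale.
cmc = {"salary_impact": 95, "years_to_pay_off": 80, "grad_rate": 90}
pomona = {"salary_impact": 60, "years_to_pay_off": 90, "grad_rate": 95}

gap = weighted_score(cmc) - weighted_score(pomona)
print(f"raw-score gap from these components alone: {gap:.1f}")
# With scores compressed so that #1 to #20 spans only 8.2 points, a gap of
# this size can translate into dozens of ranking places.
```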
That makes sense to me because from what I understand, people looking for high-paying finance and consulting jobs are more likely to choose CMC, people looking for graduate school would be more likely to choose Pomona (either might be fine for law school).
Of course it is fair to say that doesn't make CMC a better college than Pomona for all people. For aspiring subscribers to the WSJ, however? Sure, makes sense to me.
From what I understand, people don't always apply to both Pomona and CMC; they sometimes choose to apply to only one or the other. For such people, the one they choose could be ranked very high, and the one they don't choose is essentially unranked. And then some people do apply to both, and for them some sort of ranking might matter if they get into both.
And again, it looks to me like a PhD-intending kid could easily fill up a whole list with Pomona and other schools but no CMC, and an IB-aspirant kid could easily fill up a whole list with CMC and other schools but not Pomona.
To me, this ranking is speaking to the IB-aspirant sort of kid, at least in part. Because, you know, it is the WSJ's ranking.
Is that insane? To me it is no more insane than that IB feeder list I linked. It is just data, and what you do with that data is up to you. It may be relevant, it may be irrelevant, and so on.
Now I think you are right that things like this are subject to self-selection, and that is a real complicating factor. Take the same individual with the same goals: is the road more traveled by similar individuals with similar goals always easier than the road less traveled? Not at all clear.
But while I think that is an important point, people who choose the road more traveled are not, to me, insane. There is comfort in knowing that a college obviously works well for a certain purpose, that likely there are courses and programs and professors and advisors who regularly help people like you achieve goals like yours. And if you decide that comfort is worth enough to put some schools on your list, and exclude others, then again I think that is sane. Not necessarily the only sane approach, but one of them.
This is where the aforementioned ranking compression issue is very relevant.
I think we are primed to think a difference of 40 spots in a ranking is supposed to mean a lot.
But there are something like 2800 four-year colleges in the United States. In that context, 40 spots might mean very little.
Personally, I would love it if people looked at situations like this, thought about them, and realized that this sort of ranking difference between CMC and Pomona is really not particularly significant, that it is well within the range where rational people could prefer one or the other for their own reasons.
Because I would hope they could then apply the same sort of logic to, say, the US News rankings, and realize the same issue applies to similar differences in those rankings.