Aggregated College Ranking

Agree 100% with NiceUnparticularMan.

Especially that it’s not just academic careers (folks on CC seem to assume that every smart kid is going to end up with a PhD-- including the kids themselves!-- which is ludicrous, based both on statistical outcomes AND our own observations about life and people).

Once an ROI measurement is in place that is tightly tied to Big Tech and finance, any college which attracts the “life of the mind” type of kid is going to slip-- with the notable exception of U Chicago and its ilk. They have done a good job of sending their “classically trained econ majors” who study under Nobel-prize-winning theorists straight to hedge funds and banking (without ever taking a finance course, imagine that!).

4 Likes

And even Chicago is arguably underrated by these new-phase rankings: 14th by median, 12th by average, and at least some would argue it should really be up with Columbia/Penn in the next group after HYPSM.

But . . . Columbia and Penn are more a classic Ivy mix: some “life of the mind” kids, but also plenty of pre-professional kids. Chicago is also a mix, but perhaps a notch or two more toward the “life of the mind” side.

And almost no undergrad engineering either . . . .

For example, while agreeing these are just slices for illustrative purposes: Chicago is #11 on the per capita IB feeder list, whereas Penn is #1 by quite a distance (hello, Wharton), and Columbia is #6 (behind Penn as well as Harvard, Georgetown, Princeton, and Claremont McKenna).

But on the per capita PhD feeder list, holding aside LACs, Chicago is third after Caltech and MIT. Columbia is #14 (again, universities only), and Penn doesn’t even make the per capita list (although it is on the totals list).

Of course this dynamic can only drag Chicago down so far, but still, I think this is basically yet another case of the “life of the mind” v. pre-professional thing intersecting with the new ROI/social-mobility era in ways that help the likes of Columbia and Penn and hurt Chicago.

For the same reasons, I think this era has helped Claremont McKenna (again, #5 on that IB list, #1 among LACs; not ranked on the PhD feeder list) and hurt Carleton (#5 overall on the PhD list, #2 among LACs after Swarthmore; not ranked on the IB feeder list).

2 Likes

I’ll point out the obvious-- no college “places” students into I-banking roles (as fascinating as that article is). This phenomenon is misunderstood by HS kids, parents, and the guy at the local bagel store. There are Wharton students who end up-- horrors!-- at an insurance company because they were rejected by the top 12 places they interviewed with (and didn’t even get an interview at a dozen more).

So it saddens me-- as it does many Talent/Recruiting professionals-- to see a bright young person essentially spend the four years of their undergrad on the hamster wheel, as if college were nothing more than a glorified workforce-upskilling program to supply an endless stream of I-banking analysts. And the kids at the “non-Whartons” who are grinding away to get into competitive extracurricular clubs and “banking feeder activities”-- yikes.

Kids-- go develop an interest in oceanography and in building better predictive models for hurricanes and tsunamis. You’ll be just as marketable when you graduate (financial institutions love ANYONE out of the sciences with predictive-modeling experience), and you’ll have learned something that might actually benefit humanity.

I remember when Swarthmore started to become attractive to the financial sector. Those kids were actually interesting-- philosophers and historians and ethicists and all the rest.

3 Likes

Yeah, my wife eventually got high up in a major bank (not an investment bank), after first going to her local flagship for one of those derided humanities degrees. She then got a job with a (non-financial) business, went back for a quant-heavy MBA, and then started as an analyst at that bank.

I am not in business directly, but I deal with a lot of business people, and there are so many C-level execs and such who followed some sort of broadly similar path. For that matter, I know plenty of people from my college who did something along those lines.

So it really surprised me when I started getting serious about college admissions again and found all these people online talking about business careers in a way that was completely alien to my real-world experience. Like, going to a good college, getting a good education, and maybe doing some networking is all good. But most business careers really develop after college-- because of course they do. It is a hands-on, experience-based, people-based sort of field, and young adults fresh out of college have typically done no more than barely start on all that, if at all.

1 Like

This has been litigated extensively before in other posts, so I’ll phrase it a bit differently here. Perhaps there is a perception that some schools are feeders for finance because there is a small minority of very sought-after companies that really do recruit almost exclusively from a small subset of schools. So perhaps people over-extrapolate, or just want anything that even slightly improves the odds and keeps their options open. I can tell you firsthand (from a company that does this) that this selective recruiting happens. I’m not suggesting it’s common. But it may not have to be for people to pick up on it.

Yup, I get it.

People have a poor understanding of labor markets in general. And you and I can argue until we’re blue in the face that spending 8 years (torturing yourself in HS to take BC Calc in 11th grade so you can load up on Diff Eq as a senior) to get into Harvard, and then torturing yourself MORE to get a job at D. E. Shaw… I mean, what a waste of an education.

BUT-- I don’t think that absolves people who understand how it works from suggesting (gently, or with gusto) that there are other careers in the world besides I-banking. And that people who major in history or poli sci don’t end up painting fences for a living (unless they want to, in which case, bravo and you do you…).

So, the matter at hand-- aggregated rankings: if a ranking takes fallacious reasoning and then compounds it by aggregating bad data, I’m going to push back. Bates is Bates, no matter where the rankings push it. And an EE from Missouri S&T with a 3.9 GPA still rocks compared to an EE from a college nobody ever heard of until it “broke through” the rankings thanks to a charismatic fundraiser of a president who can’t be bothered to get their small, underfunded engineering program accredited.

1 Like

What exacerbates this is that many of these high-entry-barrier careers are much harder to break into once you’ve missed the opportunity at the entry level. So the point of entry at the undergrad level becomes even more critical, which of course further feeds the narrative.

Yep, 100%

We may be talking about two different things here. I completely agree (and have myself pointed out) that many of the rankings are biased toward high-earning careers, and that this is not a good thing, except for the subset actively seeking those roles as their top priority. I also agree that there is more to life than banking, LOL. I myself did not pursue a STEM degree or a finance career, so I certainly appreciate that. None of my kids have thus far pursued finance either. Two of them had double majors that included theater, because they enjoyed it.

But…

My problem with this statement is that it implies kids do this simply because they are chasing a payday at the end, despite it being torturous and at the expense of all else. Perhaps some do, but more do not. It’s possible to seek out higher-level courses because you enjoy them or want the challenge. It’s possible to have an amazing time at Harvard that is independent of wherever you end up in your career. It’s possible to go to Harvard and have just as much fun as you would at your state public college. Perhaps you end up in finance, perhaps you don’t. Either way, it’s not a “waste of an education.”

Let’s even play out the case of the student who does this, then tries to get into a top bank and doesn’t succeed. Is that any more of a waste than for the elite athlete who worked hard to make the Olympics and just missed the cut? I’d argue there was nothing wasteful about their pursuit of excellence in their sport, or the many experiences they had along the way. The same goes for the student who succeeds in hard classes and earns their way into a highly selective college they aspired to. It is still an amazing experience, whether they got the job at Goldman or not.

2 Likes

I agree with you up to a point. Even administrators at Harvard are concerned about the large number of students using Harvard as an on-ramp to the financial industry. I’m not suggesting that those students aren’t getting a fine education (if that is in fact what they choose); but I’d be hard-pressed to describe gunning for an offer from Goldman as “striving for excellence.” And the students who end up shut out of the “highly prestigious financial career” and have to “settle” for a significantly less prestigious one-- that’s where I think the real gap lies. Are you really a loser because you are working in municipal bonds at a regional bank, helping finance bridges, civic improvement projects, and other infrastructure for state government? Are you a loser because you are an analyst on the pricing team at the largest vaccine maker in the world? Or because you are on the strategic planning team at a large manufacturer of heart valves?

Listening to the kids who get shut out of their shiny prestigious options and end up with these “loser” jobs-- it’s sad. But at least for the ones I know, their mantra is “two years of this, and there’s always law school.” Yet another brass ring to grab: prestigious law school, prestigious firm, federal clerkship, back to the prestigious firm. And so it goes!!!

Thank you for sharing your aggregate data. USNWR certainly has the most credibility and influence in academia, and I believe it is the only publication that gets peer-assessment responses from the colleges directly. Still, I agree it’s a good idea for folks to check more than one source when evaluating a college. I would encourage people to read through the methodology sections and put more weight on the sources most aligned with their priorities.

As much as the discussion below might imply otherwise, I’m not a big fan of overall rankings. I prefer more granular rankings or ratings, as they offer more transparency into how an overall rank was determined and allow the consumer/family more flexibility in devising their own assessments, based on what matters most to them. When practical and available, raw data can be even better.

I also think it’s wiser to pick a school based on fit rather than perceived prestige. Fit to me encompasses pretty much everything: from finances to academic calendar, from campus feel to graduation requirements, and much more. When the fit is right, I think a student is more likely to do their best work and walk away with the best education and personal habits, which in the long run matters more than the school name on the diploma.

Still, since Carleton has been singled out as underachieving in your aggregation, I feel compelled to respond with some data that might be of use to those considering it. Allow me first to say that all of the schools you ranked are great, and a student could walk away with a terrific education from any of them. I believe US News considers there to be 211 National LACs, so any school placing in the top 25 is already in good company!

I noticed that you indicated that Carleton’s high rank with USNWR was an outlier within the group you chose. Actually, it isn’t-- it’s not even the highest among the rankings that have yet to be updated. That distinction goes to Times Higher Education, which last ranked liberal arts colleges in 2021 and placed Carleton 7th. As THE is based in the UK, it made sense for them to partner with the WSJ for marketing purposes. While the WSJ now relies on College Pulse and Statista for the surveying and number crunching, that doesn’t mean THE’s list from 2021 is now the same as the WSJ/Statista/College Pulse one. If you are including THE in your aggregate, their most recent rank for Carleton should be 7th.

Also, quite a few other ranking sites that were omitted have Carleton quite a bit higher than the mean or median from your list of LACs: WalletHub (9th), Education Corner (10th), CollegeVine (12th), and CollegeSimply (12th).

There are other worthwhile sources as well. The oldest college guidebook is probably Fiske, which rates academics on a scale of 1-5; there are eleven 5-star LACs, one of which is Carleton. A more recent book using the same 5-point scale, “Colleges Worth Your Money,” has been published by the group that runs the College Transitions site; Carleton is again a 5-star school there, along with 15 others that I found. It appears nine LACs received 5 stars for academics from both (Amherst, Barnard, Bowdoin, Carleton, Haverford, Pomona, Swarthmore, Wellesley, and Williams).

But the guide that appears to have the most positive assessment of Carleton is probably Princeton Review. The college lists in the public section of their site are based purely on surveys, but that survey data is combined with other sources to create a book’s worth of content you can either purchase in print or access for free from their website after creating an account. They rate colleges on a 99-point scale across seven measures: academics, professors interesting, professors accessible, admissions selectivity, financial aid, quality of life, and careers. (There are some other ratings for things like green tech/practices on campus and fire safety, but not every school has those scores.) The same ratings are used for universities and LACs, permitting comparisons across school types if one is so inclined. The ratings are accompanied by interesting written descriptions of the schools and a variety of other data. It’s quite a useful site, and, per Inside Higher Ed, Princeton Review is the fourth most-used resource for college selection (fifth if you include a guide to online colleges).

Princeton Review doesn’t publish an overall rank, but if one grants oneself the liberty of using their ratings (which seems less dubious to me than averaging different ranks from publications of widely varying credibility), one can create a rank oneself. If, for example, we use a straight sum of the aforementioned seven measures, Carleton is the #2 LAC, after Williams. If we look only at the Academics rating, it is in a three-way tie for third with Reed and Haverford, following Williams and Middlebury. If one uses academics plus careers and none of the others, Carleton ties Pomona and Middlebury for third, after Williams and Harvey Mudd. I did this relatively quickly by hand, so I might’ve missed a school (apologies if so), but I believe I checked the usual suspects.
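For anyone who wants to replicate this, here is a minimal sketch of the method in Python. The rating values are made-up placeholders rather than Princeton Review’s actual numbers, so treat it purely as an illustration of how a DIY rank can be built from their seven measures:

```python
# Minimal sketch: building a DIY rank from per-measure ratings.
# The rating values below are placeholders, NOT Princeton Review's real numbers.

ratings = {
    # school: (academics, profs_interesting, profs_accessible,
    #          selectivity, financial_aid, quality_of_life, careers)
    "Williams": (99, 97, 96, 99, 95, 92, 96),
    "Carleton": (97, 98, 98, 96, 94, 93, 94),
    "Pomona":   (97, 95, 95, 98, 96, 94, 93),
}

# Straight (unweighted) sum across all seven measures, highest first.
by_total = sorted(ratings, key=lambda s: sum(ratings[s]), reverse=True)

# Or use only the measures you care about, e.g. academics + careers.
by_acad_careers = sorted(
    ratings, key=lambda s: ratings[s][0] + ratings[s][6], reverse=True
)

for rank, school in enumerate(by_total, start=1):
    print(rank, school, sum(ratings[school]))
```

The nice part of doing it this way is that the weighting is yours: drop selectivity entirely, double careers, whatever matches your family’s priorities.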

If someone were to average across all the sources I’ve mentioned here, in addition to those in the original aggregate, I believe Carleton’s median comes to 12th. (For Fiske, I used the midpoint of the number of 5-star academic schools, i.e., 6th, and likewise for CWYM, i.e., 8th. I did not use Fiske’s star ratings for social life and quality of life, as giving 2/3 of the weight to things unrelated to academics seemed too large a departure from how everyone else evaluates colleges.) But I think it’s more useful to understand the differences between the sources than to just average across them, so one can better decide how to weight the different takes.
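Here is a toy version of that median calculation, for anyone curious about the mechanics. The USNWR value is a placeholder and the ranks from the original aggregate are omitted, so the output won’t match the 12th figure above:

```python
# Toy version of the aggregation described above. Each source contributes one
# rank; where a source only awards stars, substitute the midpoint of its
# 5-star group (e.g., eleven 5-star schools -> each counts as 6th).
from statistics import median

carleton_ranks = [
    9,   # USNWR -- placeholder, not the actual value
    7,   # Times Higher Education (2021)
    9,   # WalletHub
    10,  # Education Corner
    12,  # CollegeVine
    12,  # CollegeSimply
    6,   # Fiske: midpoint of eleven 5-star LACs -> (1 + 11) / 2 = 6
    8,   # CWYM: midpoint of sixteen 5-star LACs -> ~8
]

print(median(carleton_ranks))  # -> 9.0 for this partial list
```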

The three longest-running publications looking at undergrad education (USNWR, Fiske, and Princeton Review) all consider Carleton one of the top ~12 LACs, as do at least a half dozen newer ones. Perhaps the outliers are actually Niche, WSJ, Forbes, and Washington Monthly. Is there something common to their approaches? Yes.

Niche ranked Carleton in the top 3 LACs as recently as 2017. I don’t know exactly how their methodology has changed since then, but I think it started to include College Scorecard data. WSJ, Forbes, and Washington Monthly use College Scorecard too. (Forbes also uses PayScale.)

College Scorecard is a valuable service, and I’m glad it exists. The problem is that those using its data, or the rankings heavily influenced by that data, aren’t taking the time to understand its limitations.

About a decade ago, Brookings ran a sophisticated analysis of the value added by each college to the earnings of its alumni. (I am deliberately putting aside the persuasive Dale and Krueger studies on this topic, which basically say that once SAT scores, grades, and application choices are controlled for, there is no appreciable income impact from where one goes to college. Because earnings data has only grown in influence since then, and these matters are hard to settle, I am exploring the scenario where those studies are at least partly wrong, which is certainly possible.) When Brookings ran their study using College Scorecard data, Carleton was near the bottom. When they ran it using PayScale data, Carleton was near the top (5th of all colleges, including universities, as I recall). If nothing else, this points to the influence of data sources on such research! So how are PayScale and College Scorecard different?

The median earnings reported by College Scorecard are based on data from students who received federal funds, measured 10 years after starting college-- i.e., roughly what a federal-funds recipient is earning about 6 years after graduating. It does not include students who are back in school at that 6-year mark. PayScale is an open survey that anyone can plug their own data into at any point in their career. Neither source accounts for cost-of-living differences between cities, states, or regions.

So, College Scorecard is best matched to federal-funds recipients who want to know what their earnings will be 6 years after college and don’t intend to be in grad school at that point. PayScale is going to be more accurate further out, especially if you go to grad school. Over 3/4 of Carleton alumni go to grad school. If you are considering Carleton, intend to go to grad school, and are wondering what you will be making at your career midpoint, PayScale is the better match.

College Scorecard reports median earnings for Carleton of $70.3K-- again, roughly 6 years after graduation. PayScale reports mid-career earnings for all participating alumni (including those who attended grad school) of $153.7K, defining “mid-career” as 10+ years of experience.

Colleges tend to attract a disproportionate number of students from their own state and region, so the absence of any cost-of-living control is significant when comparing schools. My advice is to compare a given school to others in its region. Carleton’s PayScale mid-career median for “all alumni” is 4th in the entire Midwest region, behind only Notre Dame, Wabash, and Kettering (an engineering school). It’s slightly ahead of Vanderbilt, Wash U, Northwestern, UChicago, Michigan, UIUC, and Wisconsin, and more significantly ahead of other LACs like Grinnell, Macalester, Kenyon, and St. Olaf. Note that the universities listed all offer engineering (though UChicago’s offering is very limited).

I will add that as well as Carleton did in PayScale’s latest salary report for its region, the ROI report is a wildly different story: I found schools with lower mid-career earnings, similar prices and net prices after aid, and greater debt placing hundreds of spots higher (e.g., Hamilton), so maybe there’s a bug or something I don’t see. Incidentally, the ROI calculation that Princeton Review performed for its careers measure used PayScale, not College Scorecard. Of the 25 or so top LACs I checked, Carleton had Princeton Review’s 6th-highest ROI score.
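If anyone wants to run the region-matched comparison themselves, here is a rough sketch. Apart from Carleton’s figure (the PayScale number above), the data are invented for illustration:

```python
# Sketch of a region-matched earnings comparison. Since the raw figures have
# no cost-of-living adjustment, rank a school only against same-region peers.

schools = [
    # (name, region, mid_career_median) -- illustrative numbers only,
    # except Carleton's, which is PayScale's reported figure.
    ("Carleton",   "Midwest",   153_700),
    ("Notre Dame", "Midwest",   160_000),
    ("Grinnell",   "Midwest",   130_000),
    ("Amherst",    "Northeast", 165_000),
]

def region_rank(name: str) -> int:
    """Rank `name` by mid-career median among schools in its region (1 = highest)."""
    region = next(r for n, r, _ in schools if n == name)
    peers = sorted(
        (s for s in schools if s[1] == region),
        key=lambda s: s[2],
        reverse=True,
    )
    return 1 + [n for n, _, _ in peers].index(name)

print(region_rank("Carleton"))  # -> 2 within this toy Midwest sample
```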

It’s interesting that Carleton is (barely) ahead of UChicago on PayScale, while UChicago is (more significantly) ahead of Carleton on College Scorecard. Both are in the Midwest and both send a ton of students to grad school, so why aren’t their College Scorecard values ($70.3K vs. $78.4K) closer? My best guess is that it has at least something to do with Carleton’s student body being twice as likely to come from (and probably return to) the Midwest, while the share of UChicago’s student body from the nation’s most expensive regions (the West and Northeast) is about 60% greater. Dale and Krueger fans would argue that UChicago’s higher SAT scores (the highest for a non-tech school, last I checked) are also a factor.

This post is too long already, but I must squeeze in one last comment regarding Niche’s methodology. Niche weights acceptance rate more heavily than any other factor in its Academics score (which is itself the most heavily weighted component of its overall score). USNWR stopped using acceptance rates in its rank calculations after realizing they are a poor measure of anything of value and contributed to some of the practices colleges used to drive up applications.

Acceptance rate doesn’t directly tell us about the quality of instruction or outcomes. It doesn’t even directly tell us about the quality of the inputs; grades and/or test scores do that much more directly. Mostly, it tells us about brand awareness in the total applicant pool. When an influential publication rewards a college for rejecting more applicants, it creates a runaway feedback loop: more students getting rejected makes the rank go up, making even more students want to apply next year only to get rejected, and so on. It would be one thing if it were clearly a measure of academic quality, but it’s not. It’s a measure of exclusivity, at the expense of wasted hours, fees, and tears.

Sadly, a number of top colleges have apparently given in and are no longer even requiring supplemental essays, including several of Carleton’s peers like Grinnell, Wesleyan, Middlebury, Hamilton, and Colby. To my surprise, Williams joined their ranks this year. The supplement is where a student historically explains why they want to attend a particular school-- where they demonstrate they have researched the curriculum and campus life, given some thought to their own priorities, and can articulate the fit as they see it. Abandoning the part of the application that is most directly sensitive to fit seems profoundly misguided to me.

I would much rather my child go to a college where fellow students fit the school, and have spent real effort and time evaluating how the school fits them, than to a place that simply rejected a greater proportion of applicants. A student body that understands and intentionally seeks the rigor, curriculum, grad-school preparation, close interaction with profs, student involvement with research, humility, Midwestern vibe, immersion in nature, participatory emphasis on physical recreation, and goofy traditions of a place like Carleton matters more to our family than a student body that boasts of exclusivity or overall rank. I think it says a lot about Carleton that they prioritize fit and campus culture over low-admit-rate bragging rights or rank in publications that use it. I don’t know how long they will be able to resist the supplement-free trend, but I think their doing so is currently one of many positive differences in how they go about things.

6 Likes

That was epic (in the good senses of the word). Thank you for a very interesting and informative read.

2 Likes

… and they use it as the most important ranking factor-- which makes their entire ranking rather suspect to me.

For LACs, “the peer assessment response rate … was 28.6%,” and each respondent was asked to respond only about LACs they were “familiar with.” So for a single LAC, that means n% of 28.6%-- and that subset produces the most important ranking factor.

The question asked is:
“rate the academic quality of peer institutions with which they are familiar on a scale of 1 (marginal) to 5 (distinguished).”

I’m sorry-- but even if you are a “president, provost or dean of admissions” (their respondents), how much do you objectively, personally know about the academic quality of another college/university unless you worked there within the past 10 or 15 years? And how much of your “knowledge” is really just a feedback loop of public perception, which now gains the clout of authority because of who is spewing it out?

Also, I’m not convinced that university presidents (who spend their time with legislators, architects, and major-donor circles), or deans of admission (= directors of sales & marketing), would even be the most authoritative on their own university’s academic excellence.

For one, they might not be academics, nor have ever taught at their current employer-- IF anywhere at all!

I’m not gonna ask the guys in the showroom about the reliability of their cars (whichever marque they just started selling this year); I’ll talk to the service manager. So I personally treat the list less as an objective ranking and more as a rumor-- knowing that some rumors can contain some element of truth.

1 Like

So this is a bit freeform, but from what I understand of what those sorts of college administrators do, I agree that a lot of how they perceive other colleges, and indeed their own, is just going to reflect the things they hear in their common interactions. That is going to include donors, other such officials at various conferences, their own college’s stakeholders (including but not limited to faculty), and so on.

So on the one hand, I don’t think that is just public perception. It is perception among those sorts of people, who likely skew better-informed than the general public.

But on the other hand, I completely agree it is not some sort of carefully investigated and rigorously analyzed result on an individual level.

But on the third hand, sometimes (not always) when you do this with a lot of people, the result you get is actually more meaningful than what you would get out of each individual. This is a complicated topic, but it can be true when the “signal” you are looking for sits consistently in a narrow range while the “noise” you want to reduce is spread across a wider distribution that can at least mostly offset itself. So: one individual, low signal-to-noise ratio; many individuals, and maybe the signal-to-noise ratio goes up a lot. And sometimes you can play with it a bit further to reduce any more systematic noise issues-- what might also be called systematic biases.
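Since this is easier to see than to describe, here is a quick toy simulation. All the numbers are arbitrary, and it only models the independent-noise part, not the shared-bias part:

```python
# Toy simulation of the averaging argument: each rater reports the "true"
# quality of a school plus independent personal noise. One rater is
# unreliable, but the mean of many raters concentrates near the truth
# (the noise shrinks roughly as 1/sqrt(n)).
import random

random.seed(0)
TRUE_QUALITY = 4.2   # hypothetical "true" score on USNWR's 1-5 scale
RATER_NOISE = 0.8    # std dev of an individual rater's error

def survey(n_raters: int) -> float:
    responses = [random.gauss(TRUE_QUALITY, RATER_NOISE) for _ in range(n_raters)]
    return sum(responses) / len(responses)

for n in (1, 10, 100, 1000):
    print(n, round(survey(n), 3))

# The averages settle toward 4.2 as n grows. Caveat: averaging only cancels
# INDEPENDENT noise; a shared bias (everyone echoing the same reputation)
# survives no matter how many respondents there are.
```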

Of course none of this is being done by these popular magazines in a way that would pass rigorous academic scrutiny. But these are the reasons I am not entirely uninterested in peer reputation data. However, I do try to take it with appropriate caution, including looking for possible systematic biases I might want to correct for.

Unfortunately, the rather low response rate indicates that most don’t bother. Consequently, for many colleges the sample size could be quite low-- possibly too low to be genuinely representative.

(It also makes me wonder what the “mission” is of the few who actually do respond.)

Worse, no one knows who actually does the filling out. I can’t remotely imagine a university president with a million-dollar salary bothering; more likely it gets passed down to some staffer (if it gets done at all)-- someone who won’t have all those contacts/communications/insights that you would think go into the responses.

If it were used as a low-weight supplementary item, I could understand. But by basing their ranking on it (at least substantially), I suspect it’s USN’s way of preserving/satisfying the “natural order” that their readership expects.

But I tend to be cynical…
