Thank you for sharing your aggregate data. USNWR certainly has the most credibility and influence in academia, and I believe it is the only publication that gets peer-review responses directly from the colleges. Still, I agree it’s a good idea for folks to check more than one source when evaluating a college. I would encourage people to read through the methodology sections and put more weight on the sources most aligned with their priorities.

As much as the discussion below might imply otherwise, I’m not a big fan of overall rankings. I prefer more granular rankings or ratings, since they offer more transparency into how an overall rank was determined and give the consumer/family more flexibility to devise their own assessments based on what matters most to them. When practical and available, raw data can be even better. I also think it’s wiser to pick a school based on fit rather than perceived prestige. Fit, to me, encompasses pretty much everything: finances, academic calendar, campus feel, graduation requirements, and much more. When the fit is right, a student is more likely to do their best work and walk away with the best education and personal habits, which in the long run matter more than the school name on the diploma.
Still, since Carleton has been singled out as underachieving in your aggregation, I feel compelled to respond with some data that might be of use to those considering it. Allow me first to say that all of the schools you ranked are great, and a student could walk away with a terrific education from any of them. I believe US News considers there to be 211 National LACs, so any school placing in the top 25 is already in good company!
I noticed you indicated that Carleton’s high rank with USNWR was an outlier from the group you chose. Actually, it isn’t; it’s not even the highest among the rankings that have yet to be updated. That distinction goes to Times Higher Education, which last ranked liberal arts colleges in 2021 and placed Carleton 7th. As THE is based in the UK, it made sense for them to partner with the WSJ for marketing purposes. While WSJ now relies on CollegePulse and Statista for the surveying and number-crunching, that doesn’t mean THE’s 2021 list is now the same as the WSJ/Statista/CollegePulse ranking. If you are including THE in your aggregate, its most recent rank for Carleton should be 7th.
Also, quite a few other ranking sites that place Carleton well above the mean or median of your list were omitted. Those include WalletHub (9th), Education Corner (10th), CollegeVine (12th), and CollegeSimply (12th).
There are other worthwhile sources as well. The oldest college guidebook is probably Fiske, which rates academics on a scale of 1-5. There are eleven 5-star LACs, one of which is Carleton. A more recent book using the same 5-point scale, “Colleges Worth Your Money,” has been published by the group that runs the College Transitions site; Carleton is again among the 5-star schools, along with 15 others that I found. It appears nine LACs received 5 stars for academics from both guides (Amherst, Barnard, Bowdoin, Carleton, Haverford, Pomona, Swarthmore, Wellesley, and Williams).
But the guide with the most positive assessment of Carleton is probably Princeton Review. The college lists in the public section of their site are based only on surveys, but the survey data is combined with other sources to create a book’s worth of content that you can either purchase in print or access for free on their website after creating an account. They rate colleges on a 99-point scale on seven measures: academics, professors interesting, professors accessible, admissions selectivity, financial aid, quality of life, and careers. (There are some other ratings for things like green tech/practices on campus and fire safety, but not every school has those scores.) The same ratings are used for universities and LACs, permitting comparisons across school types if one is so inclined. Ratings are accompanied by interesting written descriptions of the schools and a variety of other data. It’s quite a useful site, and, per Inside Higher Ed, Princeton Review is the fourth most-used resource for college selection (fifth if you include a guide to online colleges).

They don’t publish an overall rank, but if one takes the liberty of using their ratings (which seems less dubious to me than averaging different ranks from publications with widely varying credibility), one can create a rank oneself. If, for example, we use a straight sum of the seven measures above, Carleton is the #2 LAC, after Williams. If we look only at the Academics rating, it is in a three-way tie for third with Reed and Haverford, following Williams and Middlebury. If one uses academics plus careers and none of the others, Carleton ties Pomona and Middlebury for third, after Williams and Harvey Mudd. I did this fairly quickly by hand, so I might have missed a school (apologies if so), but I believe I checked the usual suspects.
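For anyone who wants to replicate the straight-sum approach I describe above, here is a minimal sketch. The school names and ratings are made-up placeholders for illustration, not Princeton Review’s actual published values; plug in the real 99-point ratings from their site if you try this yourself.

```python
# Hypothetical example of ranking schools by a straight sum of
# seven 99-point ratings. The numbers below are invented, NOT
# the actual Princeton Review figures.
ratings = {
    "College A": [95, 92, 90, 97, 93, 91, 94],  # seven measures per school
    "College B": [93, 95, 88, 96, 90, 94, 92],
    "College C": [97, 90, 91, 95, 92, 89, 93],
}

# Sum each school's seven ratings, then sort high-to-low to get a rank.
totals = {school: sum(vals) for school, vals in ratings.items()}
ranked = sorted(totals, key=totals.get, reverse=True)

for place, school in enumerate(ranked, start=1):
    print(f"{place}. {school} ({totals[school]})")
```

Weighting some measures more than others (say, academics and careers only, as in my third cut above) is just a matter of summing a subset or multiplying before summing.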
If someone were to average across all the sources I’ve mentioned here in addition to those in the original aggregate, I believe Carleton’s median comes to 12th. (For Fiske, I used the midpoint of its eleven 5-star academic schools, i.e. 6th; for CWYM and its sixteen, 8th. I did not use Fiske’s star ratings for social life and quality of life, since putting 2/3 of the weight on non-academic factors seemed too large a departure from how everyone else evaluates colleges.) But I think it’s more useful to understand the differences between sources than to simply average across them, so one can better decide how to weight the different takes.
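The midpoint convention above can be sketched in a few lines, if anyone wants to redo the aggregation with their own set of sources. The numeric ranks in the list are placeholders, not the actual figures from each publication; only the two tier sizes (eleven 5-star schools for Fiske, sixteen for CWYM) come from my counts above.

```python
from statistics import median

# Aggregating ranks across sources, substituting the midpoint of a
# 5-star tier when a source gives a tier instead of a rank.
# The numeric ranks below are illustrative placeholders.
ranks = [4, 9, 10, 12, 12, 19, 7]

fiske_tier_size = 11   # eleven 5-star LACs -> midpoint 6th
cwym_tier_size = 16    # sixteen 5-star LACs -> midpoint 8th
ranks.append((fiske_tier_size + 1) // 2)
ranks.append((cwym_tier_size + 1) // 2)

print(median(ranks))
```

The median is less sensitive than the mean to a single outlier ranking, which matters when the sources disagree as widely as they do here.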
The three longest-running publications covering undergrad education (USNWR, Fiske, and Princeton Review) all consider Carleton one of the top ~12 LACs, as do at least a half dozen newer ones. Perhaps the outliers are actually Niche, WSJ, Forbes, and Washington Monthly. Is there something common to their approaches? Yes.
Niche ranked Carleton in the top 3 LACs as recently as 2017. I don’t know exactly how their methodology has changed since then, but I think it started to include College Scorecard data. WSJ, Forbes, and Washington Monthly use College Scorecard too. (Forbes also uses Payscale.)
College Scorecard is a valuable service and I’m glad it exists. The problem is that those using the data, or the rankings that are heavily influenced by that data, aren’t taking the time to understand its limitations.
About a decade ago, Brookings ran a sophisticated analysis of the value added by each college to the earnings of its alumni. (I am deliberately setting aside the persuasive Dale and Krueger studies on this topic, which basically say that once SAT scores, grades, and application choices are controlled for, there is no appreciable income impact from where one goes to college. Because earnings data has only grown in influence since then, and these matters are hard to settle, I’m exploring the scenario where those studies are at least partly wrong, which is certainly possible.) When Brookings ran their study using College Scorecard data, Carleton was near the bottom. When they ran it using Payscale data, Carleton was near the top (5th of all colleges, including universities, as I recall). If nothing else, this points to the influence of data sources on such research! So how are Payscale and College Scorecard different?
The median earnings reported by College Scorecard are collected from students who received federal funds, measured 10 years after starting college; i.e., for a four-year degree, roughly what a federal-funds recipient is earning about 6 years after graduating. It does not include students who are back in school at that 6-year mark. Payscale is an open survey that anyone can plug their own data into at any point in their career. Neither source accounts for cost-of-living differences across cities, states, or regions.
So, College Scorecard is best matched to federal funds recipients who are interested in what their earnings will be 6 years after college and aren’t intending to be in grad school at that time. Payscale is going to be more accurate further out, especially if you go to grad school. Over 3/4 of Carleton alumni go to grad school. If you are considering Carleton and intend to go to grad school and are wondering what you will be making at your career midpoint, Payscale is the better match.
College Scorecard reports median earnings for Carleton of $70.3k. Again, this is ~6 years after graduation. Payscale reports mid-career earnings for all participating alumni (including those who attended grad school) of $153.7k; they define “mid-career” as 10+ years of experience.
Colleges tend to attract a disproportionate number of students from their own state and region, so it’s significant that there is no control for cost of living when comparing schools. My advice is to compare a given school to others in its region. Carleton’s Payscale mid-career median for “all alumni” is 4th for the entire Midwest region, behind only Notre Dame, Wabash, and Kettering (an engineering school). It’s slightly ahead of Vanderbilt, Wash U, Northwestern, UChicago, Michigan, UIUC, and Wisconsin, and more significantly ahead of other LACs like Grinnell, Macalester, Kenyon, and St Olaf. Note that the universities listed all offer engineering (though UChicago’s offering is very limited). I will add that as well as Carleton did in Payscale’s latest salary report for its region, the ROI report is a wildly different story. I found schools with lower mid-career earnings, similar prices and net prices after aid, and greater debt placed hundreds of spots higher (e.g. Hamilton), so maybe there’s a bug or something I don’t see. Incidentally, the ROI calculation Princeton Review performed for its careers measure used Payscale, not College Scorecard. Of the 25 or so top LACs I checked, Carleton had Princeton Review’s 6th-highest ROI score.
It’s interesting that Carleton is ahead (barely) of UChicago on Payscale while UChicago is (more significantly) ahead of Carleton on College Scorecard. They are both in the Midwest and both send a ton of students to grad school, so why aren’t their College Scorecard values ($70.3k vs. $78.4k) closer? My best guess is that it has at least something to do with Carleton’s student body being twice as likely to come from (and probably return to) the Midwest, while UChicago’s share of students from the nation’s most expensive regions (the West and Northeast) is about 60% greater. Dale and Krueger fans would argue that UChicago’s higher SAT scores (the highest for a non-tech school, last I checked) are also a factor.
This post is too long already, but I must squeeze in one last comment regarding Niche’s methodology. Niche weights acceptance rate higher than any other factor in its Academics score (which itself is the most heavily weighted part of its overall score). USNWR stopped using acceptance rates in its rank calculations after realizing they are a poor measure of anything of value and contributed to some of the practices colleges used to drive up applications. Acceptance rate doesn’t directly tell us about the quality of instruction or outcomes. It doesn’t even directly tell us about the quality of the inputs; grades and/or test scores do that much more directly. Mostly, it tells us about brand awareness in the total applicant pool. When an influential publication rewards a college for rejecting more applicants, it creates a runaway feedback loop: more students getting rejected makes the rank go up, making even more students want to apply next year only to get rejected, and so on. It would be one thing if it were clearly a measure of academic quality, but it’s not. It’s a measure of exclusivity, at the expense of wasted hours, fees, and tears.

Sadly, a number of top colleges have apparently given in and no longer even require supplemental essays, including several of Carleton’s peers like Grinnell, Wesleyan, Middlebury, Hamilton, and Colby. To my surprise, Williams joined their ranks this year. The supplement is where a student historically explains why they want to attend a particular school. It’s where they demonstrate they have researched the curriculum and campus life, given some thought to their own priorities, and can articulate the fit as they see it. Abandoning the part of the application that is most directly sensitive to fit seems profoundly misguided to me.
I would much rather my child go to a college where fellow students fit the school, and have spent real effort and time evaluating how the school fits them, than to a place that simply rejected a greater proportion of applicants. A student body that understands and intentionally seeks the rigor, curriculum, grad school preparation, close interaction with profs, student involvement in research, humility, Midwestern vibe, immersion in nature, participatory emphasis on physical recreation, and goofy traditions of a place like Carleton matters more to our family than a student body that boasts of exclusivity or overall rank. I think it says a lot about Carleton that they prioritize fit and campus culture over low-admit-rate bragging rights or rank in publications that reward it. I don’t know how long they will be able to resist the supplement-free trend, but their doing so is currently one of many positive differences in how they go about things.