I definitely see these as potential issues, but a few things suggest they aren't as harmful, or at least shouldn't be as harmful, as is being suggested:
1-when we look at who is going where, this doesn't really apply to 75+% of incoming freshmen. People don't apply to their completely average local LAC because of a ranking. Or their directional state U. Or even their state flagship. Or their religiously affiliated school. Or the school their friends/family went to. This really only applies to the small proportion of above-average students applying to many, many colleges and doing a lot of comparisons.
2-I do agree that what US News quantifies is not necessarily perfectly aligned with how a given school would choose to optimize its resources. However, when we consider what broadly makes a school “good”, resources probably do play the biggest role.
3-what is astonishing to me is that as much as schools complain about these rankings, they have done roughly zero to combat them. I’m not saying they should boycott. But people instinctively love to rank things. No one is going to change that. Best colleges, best cities to visit, top 50 basketball players of all time, presidents. These enormously wealthy schools—with all of their scientific and statistical experts—have basically done zero while a tiny shell of a company (US News) has been allowed to corner the market on “the” rankings.
If you want to reduce the importance of those rankings, the easiest way to do that is to create your own. Not just one, but several different rankings using different metrics. Credible, sensible rankings that muddy the waters and eliminate the reliance on any one ranking. Create more sophisticated tools that allow students to personalize rankings for their situation. So much intellectual, financial and computing power there and we get…nothing.
They could do so, so much. For example:
- Open vs. core curriculum requirements, depending upon your field of interest, across schools.
- Walk and bike scores, or a series of "pick between two pictures" questions to identify preferred settings rather than generic urban/rural/suburban designations.
- Graduation rates and outcomes tracked by SES, race and HS GPA/test scores, to disclose how students most similar to you fare (grad rate, percent going to grad school, etc.).
- Percent of students taking advantage of internships/research opportunities.
- Percent doing one-on-one work for capstones/honors theses.
- Income of Pell-eligible students x years out.
- Club sports, dietary restrictions, nearest synagogue, percent of days that are sunny during the school year, faculty turnover, percent of students who study abroad for at least one semester, percent who won prestigious grad scholarships, percent going to med school within x years of graduating.
- Similarity scores based upon criteria important to a given applicant (a rough sketch of the idea follows this list).
- Flags that indicate, "if you aren't so particular about [most restrictive metric], here are a few other schools very similar to your top choices."
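Just to make the "personalized rankings / similarity score" idea concrete, here's a minimal sketch in Python. Every school record, metric name, and weight in it is made up for illustration, not real data; an actual tool would pull from IPEDS/Common Data Set style sources.

```python
# Minimal sketch of a personalized score over school data.
# All records, field names, and weights here are hypothetical.

schools = {
    "School A": {"grad_rate": 0.91, "pct_study_abroad": 0.40, "walk_score": 88, "pct_research": 0.35},
    "School B": {"grad_rate": 0.84, "pct_study_abroad": 0.15, "walk_score": 55, "pct_research": 0.50},
    "School C": {"grad_rate": 0.95, "pct_study_abroad": 0.60, "walk_score": 72, "pct_research": 0.20},
}

# The applicant supplies their own weights instead of accepting one
# editorially chosen formula -- that's the whole point of "bespoke" rankings.
my_weights = {"grad_rate": 0.5, "pct_study_abroad": 0.3, "walk_score": 0.1, "pct_research": 0.1}

def make_normalizer(values):
    """Rescale a metric to 0-1 so weights compare like with like."""
    lo, hi = min(values), max(values)
    return lambda v: 0.0 if hi == lo else (v - lo) / (hi - lo)

# One normalizer per metric, built from the pool of schools being compared.
normalizers = {
    metric: make_normalizer([s[metric] for s in schools.values()])
    for metric in my_weights
}

def personal_score(record, weights):
    """Weighted sum of normalized metrics: one applicant's ranking, not 'the' ranking."""
    return sum(w * normalizers[m](record[m]) for m, w in weights.items())

for name, rec in sorted(schools.items(), key=lambda kv: -personal_score(kv[1], my_weights)):
    print(f"{name}: {personal_score(rec, my_weights):.2f}")
```

None of that is hard; it's just weights the applicant chooses instead of weights an editor chooses.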
If these rankings are harmful and trash, then the bar is not that high to produce things that would render them obsolete. They have an incumbent advantage, but it's not that strong. Some of these metrics would take time and resources to collect, but we're talking about the collective talent and resources of academia here.
I mean, if they really wanted to, they could buy the college rankings business line from US News. If it's causing more collective damage than USNWR is able to monetize, that's the obvious answer. Then let the university stats junkies run wild: typology clustering by school and student, bespoke rankings, etc.
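And "typology clustering" isn't exotic either. A rough sketch, again with entirely invented numbers and feature choices:

```python
# Sketch of clustering schools into "types" rather than one linear pecking order.
# The schools, columns, and values are all hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans

# Columns (made up): enrollment (thousands), pct residential,
# pct STEM degrees, net price (thousands), six-year grad rate.
schools = ["Flagship U", "Directional State", "Small LAC", "Urban Research U", "Faith-Based College"]
features = np.array([
    [45.0, 0.30, 0.40, 18.0, 0.82],
    [12.0, 0.25, 0.20, 14.0, 0.55],
    [ 2.0, 0.95, 0.15, 28.0, 0.90],
    [30.0, 0.40, 0.45, 22.0, 0.78],
    [ 3.5, 0.80, 0.10, 20.0, 0.70],
])

# Standardize each column so scale differences (enrollment vs. rates) don't dominate.
z = (features - features.mean(axis=0)) / features.std(axis=0)

# Group schools into a handful of types.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)

for name, label in zip(schools, labels):
    print(f"cluster {label}: {name}")
```

Clustering gives you "schools like X" groupings instead of a single ordered list, which is exactly the muddying-the-waters effect described above.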