How much do USNews college rankings factor into your decision of school choice?

You also have to assume the USNWR methodology is correct, which requires assuming things like:

*All students attending publics pay out of state tuition
*Merit scholarships and other non-need-based aid do not influence college cost
*Average cost among those receiving grant based FA is more important than average net cost among all students
*A high sticker price private college that gives lots of need based aid is a better value than an equal quality lower sticker price college with the same average cost
*All students pay room + board, including those commuting from home
*The USNWR national rankings are an accurate measure of college quality

With these assumptions, it’s no wonder that HYPSM and other highly ranked, high-sticker-price colleges without merit aid do very well. The methodology looks like it was designed to put HYPSM on top.

The obvious outlier is Gallaudet, and why it does well is less obvious. Gallaudet does not rank especially well in the USNWR national list, offers merit scholarships rather than just need-based aid, and doesn’t use the high-sticker-price-with-large-discount approach. Some specific numbers are below, comparing #3 Gallaudet to #4 Yale, based on figures listed on the USNWR website. I also included #124 UCLA as an example of a college that does not rank well in value. As listed below, Gallaudet primarily does well because it has a low average net cost. Had USNWR ranked by average net cost instead of average cost for FA recipients, Gallaudet would likely have been #1.

#3 Gallaudet – Quality = 57, Net cost without merit among those receiving FA = $11k, Quality/Cost = 57/11 = 5.2; Discount = (32-11)/32 = 66%; Need-based grants = 89%

#4 Yale – Quality = 96, Net cost without merit among those receiving FA = $18k, Quality/Cost = 96/18 = 5.3; Discount = (75-18)/75 = 76%; Need-based grants = 53%

#124 UCLA – Quality = 84, Net cost without merit among those receiving FA = $45k, Quality/Cost = 84/45 = 1.9; Discount = (59-45)/59 = 24%; Need-based grants = 13%
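The value arithmetic above is easy to check with a few lines of Python. This is just a sketch: the quality scores, net costs, and sticker prices (in thousands of dollars) are the figures quoted from the USNWR site, and the dictionary layout is my own.

```python
# Recomputing the quality/cost ratios and discounts quoted above.
# Costs are in $1000s; "sticker" is the listed price before need-based aid.
schools = {
    "Gallaudet": {"quality": 57, "net_cost": 11, "sticker": 32},
    "Yale":      {"quality": 96, "net_cost": 18, "sticker": 75},
    "UCLA":      {"quality": 84, "net_cost": 45, "sticker": 59},
}

for name, s in schools.items():
    ratio = s["quality"] / s["net_cost"]                      # quality per $1k net cost
    discount = (s["sticker"] - s["net_cost"]) / s["sticker"]  # average discount off sticker
    print(f"{name}: quality/cost = {ratio:.1f}, discount = {discount:.0%}")
```

Note that by this arithmetic Yale’s quality/cost ratio (5.3) actually edges out Gallaudet’s (5.2), so the #3 vs. #4 ordering evidently rests on more than this single ratio.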

2 Likes

I agree with this generally. Where I think families go off the rails is using information resources as definitive ultimate ranking guides to be followed blindly and without question.

Use any resource to gather information to help make a better decision. There are many such resources, of which USNWR is only one competent example.

DO NOT arbitrarily decide that the schools ranked C-through-H on List Z are the only ones that can provide a great college education/experience for a student.

One interesting tool I found on the USNWR site was the school comparison app that allowed users to add post-college salaries by major into the comparison. It was very useful to discover that similar types of schools (say, mid-sized colleges between 3K-9K enrollment) often had very similar salary outcomes for students in specific majors, regardless of whether they were ranked 30, or 65, or 100, or 150. In fact, many had very similar statistics (graduation rate, etc.) across many factors important to choosing a university. The only metric that appeared to be directly tied to being ranked overall in the top 50 versus outside the top 80 was admission rate. Make of that what you will, as I have very specific thoughts on the subject of acceptance rates.

Personally, I feel the absence of an overall ranking is what helps resources like Fiske, CTCL, and Princeton Review be very useful in the process. What should be most important to families is access to the statistics. Any overall ranking methodology can be, and very likely is, flawed and inappropriate to apply to any individual’s college search situation.

5 Likes

However, that isn’t their ranking, but the information they share. Similarly, Niche provides interesting insights from students on things like the general atmosphere, food, student culture, etc. However, Niche has some of the most ridiculous “best of” rankings I have ever seen. I remember that they ranked MIT as the “best for veterans,” even though, in their entire UG body of around 4,500, they have only 13 veterans. Harvard, which they ranked #2, did “better”: 13 veterans in the class of 2024, up from 6 in the class of 2023.

USNews rankings consistently underrate public universities and overrate universities that become “popular.” They don’t do as badly as Niche, which I think is contractually obliged to fit Harvard, Yale, and Princeton into their top 10 universities, no matter what they are ranking. Still, a large portion of USNews’ ranking methodology is little more than a set of proxies for the income of the students who attend the college.

However, the info that they provide on the college tends to be solid and easily accessible.

What college info sources should do, but do not, is measure the reputation of a college in its area and nationally. Not among university administrators or the financial “elite,” but among the employers outside of NYC and LA who actually employ 99% of the population. Few in Cincinnati are going to employ a Harvard grad over an OSU grad, few in Peoria will hire a Yale grad over a UIUC or NU grad, and few in Tuscaloosa will choose a Princeton grad over an Alabama grad.

5 Likes

I think most of you are lying. Lol :lying_face::joy:


At least for my daughter we were looking at CTCL and looking for small lacs. Rank wasn’t important, fit and culture were.

For my son, yep, right to the rankings. It helped us cultivate our massive spreadsheet of 38 top engineering programs, and wouldn’t he be great at all of them. Lol.

His list was 5. In the end he ended up with 4 acceptances from his 5. (We still made him apply to like 12 schools :thinking:). If only I knew then what I sometimes know now. Hmmm


Anyway, I do think the rankings were actually useful for giving us the names of programs to investigate. But I also made him research and pick colleges from T10, 11-20, and so on to 50. He had to research and apply to at least one in each group. I have said this before, but there are great colleges through T100 and beyond. This exercise really opened up my knowledge of the plethora of great universities/colleges out there.

We used Niche a bit; it’s predictable, but we gained some insight.

4 Likes

What are your thoughts on acceptance rates?

Acceptance rates are poor proxies for determining the educational quality (or any metric of quality) of any university.

We can assume that the vast majority of applicants to any university are qualified for admission to that specific university. It doesn’t matter whether we look at GPA or test scores or whatever; most students who apply generally meet the qualifications.

American U
SAT 1210-1390, 36% accept rate, #76
applicants:spots = 18545 : 2131 = 8.7

Chapman U
1190-1380, 56%, #124
applicants:spots = 14273:1876 = 7.6

Wake Forest
1260-1440, 28%, #28
applicants:spots = 13071 : 1275 = 10.2

SMU
1300-1480, 47%, #66
applicants:spots = 13959 : 1677 = 8.3

Gonzaga
1200-1360, 62%, #80
applicants:spots = 9279 : 1332 = 6.9

Fordham
1240-1450, 46%, #66
applicants:spots = 47930 : 2442 = 19.6

Pepperdine
1230-1450, 32%, #49
applicants:spots = 12764 : 896 = 14.2

St Louis University
1170-1380, 58%, #103
applicants:spots = 15573 : 1804 = 8.6

These schools are more or less the same by these stats, but are ranked from 28 to 124. The greatest spread in the lower bound of the mid-50% SAT range (1170 to 1300) is not even between the highest- and lowest-ranked schools.
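The applicants-to-spots ratios above can be recomputed from the raw figures quoted in the post. A quick sketch (the applicant and spot counts and ranks are from the post; school names are abbreviated, and a few ratios may differ in the last digit from the post’s truncated figures):

```python
# Recompute applicants-per-spot for each school and list them
# from most to least oversubscribed, with USNWR rank for comparison.
data = {
    "American":    (18545, 2131,  76),
    "Chapman":     (14273, 1876, 124),
    "Wake Forest": (13071, 1275,  28),
    "SMU":         (13959, 1677,  66),
    "Gonzaga":     ( 9279, 1332,  80),
    "Fordham":     (47930, 2442,  66),
    "Pepperdine":  (12764,  896,  49),
    "St. Louis":   (15573, 1804, 103),
}

for name, (apps, spots, rank) in sorted(data.items(),
                                        key=lambda kv: kv[1][0] / kv[1][1],
                                        reverse=True):
    print(f"#{rank:>3} {name:<12} {apps / spots:4.1f} applicants per spot")
```

Sorting this way makes the point visually: the order of oversubscription does not track the order of the ranks.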

I think higher ratios of applicants to spots, combined with low accept rates, drive higher rankings; higher rankings drive more applications, which drive down accept rates, which increases the perception of desirability, which fools some ranking organizations into awarding more weight for “prestige,” which drives the ranking up, which feeds the system.

Once we add in other metrics (such as pay ranges of graduates by majors) I found that there was little difference, if any, between a school ranked in the high 20s and one in the 150s. Except for the ranking and perceived prestige.

I chose all private schools because, when applying the other metrics, I’m reasonably confident the socioeconomic status of the students would be very similar and thus wouldn’t affect the results in ways I’d need to account for.

If Chapman is an easier admit than WFU, costs approximately the same, and delivers more or less the same results, should a family hold one in lower esteem because a magazine ranked one nearly 100 spots higher than the other?

And that doesn’t even begin to get into the more personal questions of intended majors, greek importance, location, etc etc etc.

Like most, I entered the process focusing mostly on rankings (well, aside from cost). Partway through, I stopped caring so much about rankings because it seemed they were driven by illogical reasons. It felt like a Moneyball moment: realizing there was great value in following the statistics and ignoring the accepted reasoning.

NOTE: Gonzaga looks like a hidden gem of an outlier when comparing ranking to accessibility.

@MorseLewis , I realize I didn’t get too deep into your specific question about my thoughts on admit rates, but this is sort of a lateral entry into my thoughts on the matter. For me, it’s all wrapped up in a bundle. Sorry I might not have gotten as specific as you might have wished.

8 Likes

Note for above: It will be very interesting to see how Elon climbs these rankings. It was first ranked last year. Unsurprisingly, because of the enhanced publicity of a top 100 ranking, Elon then received twice as many applications as the year before. In a year or two, Elon’s official acceptance rate is sure to drop, which will probably result in Elon rising in the national ranking. That in turn will lead to more applicants, which will lead to a lower accept rate. Same school, higher ranking, enhanced “prestige.”

With two more children in the pipeline, I’ll be watching, and it will be interesting to see how it shakes out. I liked Elon before it cracked the USNWR rankings, but watching it enter the national discussion in this way is something.

IMO College rankings tend to be extremely political. I tend to not use them.

2 Likes

The metrics USNews employ are all reported by the schools in their common data sets, so the rankings are useful in that they consolidate this data into one resource comparing schools in specific categories so that you can see the differences. As college costs have increased, USNews has provided a valuable resource in helping first generation students make very important decisions with financial implications.

Yes, I found it quite useful for statistical purposes only: checking SAT, GPA, cost, etc. I will never use the rating as a measure of the quality of a school in my child’s college journey. I will research the school’s profile, history, etc., and come to my own conclusion based on the findings. It was my assumption the OP was asking how much weight is put on the rating. I will not discount a college because USNews said they are number 45 vs. 5.

The highest-weighted single component of the USNWR ranking is a questionnaire USNWR sends to “academics,” asking them to rate colleges on a scale from 5 = “distinguished” to 1 = “marginal.” This questionnaire is 100% of the ranking on some of the lower-traffic sublists. The questionnaire is one of several USNWR criteria that are not part of the CDS.

More importantly, the USNWR weightings and ranking are completely arbitrary and are not a good reflection of college quality or much else. Many metrics are well correlated with endowment, so the overall ranking tends to follow endowment per student.

I agree that the USNWR can be a good resource to look up information about specific colleges, like one could look up information in the CDS. However, much of the USNWR content is not free, and there are many free alternative resources that provide more useful information.

5 Likes

They’re administrators, not really academics. The response rate from this group isn’t great either. Those who chose to respond probably did so because they wanted (or needed) to stay on the good side of USNWR.


1 Like

I always felt US News rankings became popular because they worked backwards: what criteria should we use within a category to come up with a ranking readers will find logical and believable?

Sadly, when I see new lists with IMO better criteria being used, and the “wrong schools” coming up - I disregard.

1 Like

0% seems like they are using metrics we don’t consider to be useful.

However, it would be mathematically difficult to keep an original group of schools together if any were to change substantively in aspects considered by the original methodology. Also, across categories for which the same methodology has been established, it wouldn’t be feasible to use a “backwards” approach. For example, USN could “assign” the top position in the NU category to Princeton, but Williams would then have to “earn” the top spot in the NLAC category.

I think they took a big hit when the parent company stopped publishing a regular news magazine. Forty years ago you could walk into any dentist’s office and thumb through it for free. And, since it cost nothing, it was easy to overlook the fact that they were essentially re-jiggering the same statistics over and over again every year.

I don’t understand.

So true that they work backwards to justify their list. In 2000, they used new criteria mostly based on academics and Caltech was ranked number 1. The algorithm was promptly altered to put Harvard, Yale, and Princeton back on top the following year.

7 Likes

Pretty much what @merc81 states. Sums up how comparison doesn’t work.

I don’t think it was a change in focus on academics. Instead, USNWR “brought [its] methodology into line with standard statistical procedure,” which involved standardizing the subcomponent ranking variables, as is common in statistical studies. The resulting ranking change was as follows:

Top 5 in 1999 (before statistical standardization)
1. Harvard / Yale / Princeton (tie)
4. Stanford / MIT (tie)


9. Caltech

Top 5 in 2000 (after statistical standardization)
1. Caltech
2. Harvard
3. MIT
4. Princeton/Yale (tie)

Caltech suddenly jumping 8 places to the #1 spot left many readers confused and caused internal strife at USNWR. After a change in the internal leadership of USNWR’s college rankings, the rankings were modified to “apply a logarithmic adjuster to all spending values,” which was clearly targeted to hurt Caltech, whose spending per student was highest by a good margin. The next year, the rankings changed to the following. Note that the 2001 rankings are very similar to the rankings today, 20 years later; the main difference is that Caltech swapped places with Columbia.

Top 5 in 2001
1. Princeton
2. Harvard/Yale (tie)
4. Caltech
5. MIT


10. Columbia

Top 5 in 2021
1. Princeton
2. Harvard
3. Columbia
4. Yale / MIT (tie)


9. Caltech
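A toy illustration of the two methodology changes described above: standardizing a subcomponent to z-scores, and taking a logarithm before standardizing, which shrinks the standardized lead of an extreme-spending outlier. All spending figures here are invented; school “F” just plays the Caltech role of a far-highest spender.

```python
import math

# Hypothetical per-student spending, $1000s. "F" is the extreme spender.
spending = {"A": 30, "B": 40, "C": 60, "D": 80, "E": 100, "F": 300}

def zscores(values):
    """Standardize a dict of values to mean 0, stdev 1 (population stdev)."""
    mean = sum(values.values()) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values.values()) / len(values))
    return {k: (v - mean) / sd for k, v in values.items()}

raw_z = zscores(spending)                                       # plain standardization
log_z = zscores({k: math.log(v) for k, v in spending.items()})  # log first, then standardize

print(f"outlier F: raw z = {raw_z['F']:.2f}, log z = {log_z['F']:.2f}")
```

Running this, F’s z-score drops from about 2.2 to about 1.9 once the log is applied first: the log compresses large relative differences, so the outlier’s standardized advantage over the pack shrinks, which is exactly the effect a “logarithmic adjuster” would have on a spending-heavy school.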