I don’t really visit these sites enough to know, but I’m sure you’re right. It sounds to me like their sponsors didn’t put much effort into those features, probably because they don’t want to highlight them relative to their much more widely read default rankings (where the money is).
I think part of the reason is that NJ has a larger proportion of strong students than most other states, and many of them have options (both private and public) outside the state. If other students see the stronger students leave, they think they should too. Rutgers is actually a much stronger school than the impression it leaves with many students.
But there’s also little question that Rutgers doesn’t have the reputation of UVa, UMich, or Chapel Hill, among others, despite being in a state with a roughly comparable population. I suspect if Rutgers had the rep of those schools, more students would stay. Given that Rutgers was one of the nine original Colonial Colleges, the gap can’t be attributed to a lack of history. It can’t be attributed to state revenue either: NJ collects far more tax revenue per capita than those other states. NJ has the students and the money to support a world-renowned public university system.
Conducting a complete data collection/validation/ranking process for one school, in the middle of developing next year’s ratings, that will be superseded in two months, makes no sense.
The vitriol attached to USNews by many is amusing. Sounds like ‘sour grapes’ by those whose favorites aren’t as high as they’d like. Transparency and shining a light on academics is a good thing and the public needs more of it. Cheating should not be condoned just because you don’t like a survey. It is good these services exist, otherwise kids and parents would be even more in the dark. One can view the methodology and scoring system to determine whether a particular service is better than another for their family’s situation. Colleges are very expensive to attend and need more scrutiny and differentiation, not less.
And I don’t see why people are shifting the blame from Columbia (allegations at present) to US News. If Columbia’s (or its students’) quality is not as good as they have been claiming, the public should know that, especially given that they don’t release CDS (probably more reliable since it is submitted to the government?) like nearly every other institution.
Common Data Set does not appear to be a government related thing, although some of the information overlaps with data given to the government and included in IPEDS and College Navigator.
Then I think US News should just use audited documents of some kind if possible. Clearly, the honor system has broken down here.
What I find remarkable is that the institutions that were ranked in the top 10 in the first edition of the US News college rankings (1983) and still hold top-10 spots today are HYPS, MIT, Duke, and Chicago. Doesn’t this argue that these institutions stand the test of time and exhibit robustness?
There is an alternative explanation. For example, one review found the following.
“Sources within U.S. News claim that, after looking deeply into the methodology of the rankings, Graham found that U.S. News had essentially put its thumb on the scale to make sure that Harvard, Yale, and Princeton continued to come out on top,”
The question to ask is: if some large subset of HYPSM didn’t show up in the top 5, would readers think the ranking is credible? So they essentially calibrate the model to where they think the survey data are likely to end up for the first several rankings.
Scientists at USNWR have determined that the correct formula for “best college” is 17.6% 6-year graduation rate, 10% financial resources per student, 8% class size, 7% faculty salary, … , and this formula is used in USNWR’s ranking. With an accurate formula, USNWR’s ranking accurately reflects the “best college” for readers of the website/magazine.
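A weighted-sum scoring scheme of this sort is easy to sketch. The weights below are the ones quoted above (the remaining criteria are omitted), and the college statistics are entirely made-up illustrative numbers, not real data:

```python
# Hedged sketch of a weighted-sum "best college" score.
# Weights are the percentages quoted above (partial list);
# all college metric values below are invented for illustration
# and assumed to be pre-normalized to a 0-1 scale.
WEIGHTS = {
    "grad_rate_6yr": 0.176,
    "financial_resources": 0.10,
    "class_size": 0.08,
    "faculty_salary": 0.07,
    # ... remaining criteria omitted
}

def score(college: dict) -> float:
    """Weighted sum of the college's normalized metric values."""
    return sum(WEIGHTS[k] * college[k] for k in WEIGHTS)

# Made-up metric values for two hypothetical schools
colleges = {
    "College A": {"grad_rate_6yr": 0.97, "financial_resources": 0.95,
                  "class_size": 0.90, "faculty_salary": 0.92},
    "College B": {"grad_rate_6yr": 0.85, "financial_resources": 0.60,
                  "class_size": 0.70, "faculty_salary": 0.65},
}

ranking = sorted(colleges, key=lambda c: score(colleges[c]), reverse=True)
print(ranking)  # → ['College A', 'College B']
```

The point of the sketch: once the weights are chosen, the ranking is fully determined by them, so whoever picks the weights effectively picks the winners.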
The weightings and criteria in the USNWR “best college” formula are arbitrary. One consideration in their selection is making reader-expected top college names appear near the top of the rankings. If the reader-expected top colleges did not appear near the top, many readers would conclude that the USNWR rankings methodology is not accurate, decreasing the portion that purchases College Compass or other products, and ultimately hurting USNWR’s net income.
What data that colleges consistently provide is audited by a neutral third party? Even the CDS is not. If USNWR did so it would cost them more than they want to spend on the process. Possibly more than can be justified from the revenue they get for the project.
USNews was the first of these “services” to peg its results to each college’s per capita material wealth, as calculated using various blunt formulas, and it became an immediate draw for people who find that sort of thing fascinating. The fact that the results have proven very little over the long run (e.g. none of the T50 colleges or universities it’s been surveying for 30 years have ever become more popular) and are so easily manipulated doesn’t seem to diminish its sway in certain quarters.
Fwiw, Common Data Set has nothing to do with “the government”
is a collaborative effort among data providers in the higher education community … Common Data Set items undergo broad review by the CDS Advisory Board as well as by data providers representing secondary schools and two- and four-year colleges. Feedback from those who utilize the CDS also is considered throughout the annual review process.
I would say “their preferred” formula - I doubt they would claim it as any kind of best/ideal/perfect.
With “a consistent/defined” formula, the rankings represent “the results of these criteria and defined process” to their readers.
There are at least a half dozen other rankings that may have not been around as long as USN, but provide additional valuable data points. Users should collect as much data as possible, understand what they represent, and use them as appropriate.
I don’t blame USNews for being first/successful. But I realize that some similarly hate Amazon for being first/successful. They also solely exist to do what they do.
Are they really using “consistent” formulae? Don’t they “tweak” these formulae all the time to match their (or their readers’) expectations? Didn’t USNWR “tweak” its formula a number of years ago so Caltech wouldn’t come out at the top?
USNWR claims their rankings measure the “best” college. There is no validation or verification that their rankings actually measure this. Instead, it’s just applying arbitrarily selected weightings to a bunch of often poorly selected CDS-type numbers, usually favoring the types of metrics that high-endowment-per-student colleges do well in and avoiding or minimizing the criteria that they do not do well in. It’s essentially “best” at being similar to high-endowment-per-student colleges that spend a lot and admit students who are likely to graduate.
The rankings weightings change most years; they are not consistent. The changes in weightings are rarely enough to cause large swings in rankings, but enough to give readers the impression that there is reason to pay for the updated rankings and see what’s new.
There are a few exceptions to these generalizations. For example, in one year, USNWR changed their statistical calculations to use normalized statistical variables, as should be standard. However, this caused larger changes in rankings for some colleges, including Caltech shooting up from #9 to #1. In the following year, USNWR added a “logarithmic adjuster” in the category in which Caltech was furthest ahead of HYP, and there was talk of some non-voluntary staff changes. HYP returned to the top 3, ahead of Caltech.
According to the page at History of College Rankings - CollegeRank.net, college rankings date back at least to 1900, more than 80 years before USNWR. However, my issue is not with who was first in making college rankings. My issue is instead with the rankings being a formula with an arbitrary selection of numbers that produces near-meaningless results.
I do not have the same problem with Amazon. Amazon produces a valuable service that improves many lives.