“Bad data” suggests an earnest “mistake” and a lack of intent, which should produce a degree of randomness in the resulting errors.
Having read Prof. Thaddeus’s report, it appears that every questionable methodology served to improve Columbia’s rating, never to lower it or leave it unchanged. This consistent pattern seems damning.
Very coincidental and highly improbable, but if so, why hasn’t Columbia fully explained how and why this happened over the course of years, and who specifically was responsible? At a minimum, some individual should be named and held accountable, unless of course that would reveal intent.
It is extremely unlikely Columbia accidentally misinterpreted the questions for the data they submitted. If you haven’t read the Columbia professor’s detailed breakdown of where the data is misrepresented, you may want to take a look. This was a concerted effort to systematically misrepresent data in at least half a dozen categories in material ways. They reported 100% of patient-care costs as “student instruction” costs, despite already having a federal report of their student instruction costs that didn’t do that and came in over a billion dollars lower. Meaning they had to proactively revise their data specifically for the USNWR submission to distort the result.
Yeah, not to pile on (I think the point is now made), but no one should have the “innocent mistakes” misapprehension about Columbia. As I mentioned on another thread, the lie about patient-care costs being instructional costs was a $2.1 billion difference and meant that Columbia’s per-student spending was higher than that of Harvard, Yale, and Princeton combined. It was flat-out fraud.
At best, USNWR and other rankings can be a useful shorthand, but people do take them way too seriously. When I was heading to college in the late ’80s from NYC and told people I was going to Northwestern, a frequent response was “That co-op school in Boston?” Northwestern was then ranked in the high teens or low 20s on the relatively new USNWR chart (so new that USNWR was then still a weekly newsmagazine), and it was attracting students from all over but was still heavily Midwestern. It’s now perennially around 9 or 10, and that reflects some real efforts in fundraising, faculty quality and facilities. But also, a difference of 10 spots is functionally meaningless in a universe of 4,000 colleges, or even a galaxy of 150 top colleges.
Georgetown still ranks in the 20s, and I joke about that with my wife sometimes, but is Northwestern “better” in some absolute way than Georgetown? Of course not. After decades of its own improvements, the same goes for that co-op school in Boston, now an internationally coveted choice. My D19 is getting an excellent education at the New School, which ranks somewhere in the 130s or 150s or something. It’s common but true advice: Focus on programs, outcomes and fit, and let the rankings serve as directional nudges at most.
Agree that most students do not attend private colleges, but these public universities use the US News rankings in their marketing, so you’d have to assume the rankings matter to them and that they think students use them. Berkeley’s engineering home page has this:
“17: Number of Berkeley Engineering undergraduate and graduate programs ranked in the Top 5 nationally by U.S. News & World Report.”
Here’s a press release on Ohio State’s engineering program:
When universities are putting out press releases touting US News rankings, you have to believe they think the rankings are important to their applicants and are being used to decide where to apply, for sure. I think where to attend is based on other factors, like affordability.
As rankings have grown, so has their use in measuring and rewarding admissions officers, senior administrators, and university presidents. People submit data, and most people get paid (either by their current employer or by the next) for “proof” of their abilities. It’s a game, and some look for shortcuts.
The trick is to move before the measurement and reconciliations are done.
In states with many public universities, most undergraduate students do not attend the flagship whose ranking people actually pay attention to. For example, in California, far more undergraduate students attend CSUs than UCs, and far more attend community colleges than both put together.
Columbia and a handful of other selective schools also do not release a Common Data Set (although it seems perhaps Columbia might in response to this?). The Common Data Set is one way to be transparent with information and put that data into the hands of “consumers” of education.
Schools can make their excuses about having other metrics or other ways of informing people, but you can be sure that there is something in the CDS that is less than flattering for the schools that do not release this information directly.
Of course, some information in the CDS has its own issues, such as the various ways colleges calculate the HS GPA of their frosh, which makes that part less useful for comparison. Hence many people’s overreliance on test scores to compare admission selectivity.
Don’t mean to disparage you specifically, but I find this anecdote about Northwestern hard to believe. I personally didn’t attend Northwestern, though I have relatives who did. I went to college around the same time period you did (mid-80s), and even then Northwestern was a highly coveted, “prestigious” brand name. I had never even heard of Northeastern in Boston until much later in my professional life.
It depends on whether Columbia (etc.) produces the asked-for information, and the ranking company trusts and/or verifies the accuracy of that information.
They’d better hurry up with that missing/erroneous data. Can’t imagine too many would be looking for an Ivy University proxy in the female-only, LAC section of the rankings.
This reminds me of the Dean at UIUC’s MBA program asking us to give favorable reviews of the program if contacted by USNWR. He then outlined how schools play fast and loose with submissions to USNWR for both grad and undergrad programs. According to him, some schools included foreign students in their submitted math SAT averages but omitted them from the English SAT averages. Supposedly, some schools were pretty liberal with their employment data, counting graduates with jobs that had nothing to do with their majors (read: retail and bartending). According to the Dean, such manipulation of data was commonplace and well known at the time.
There has been no indication that Columbia et al will not receive numerical rankings in September. U.S. News remains free to rank schools without their participation using publicly available information.
That’s fair — anecdotes aren’t comprehensive data. For whatever reason, Northwestern had a relatively low profile in the circles I was in. I went to a specialized high school that was highly competitive and heavily Ivy-focused, and while a few other people applied to Northwestern, it was by no means a common or popular choice. While people didn’t disparage it once they understood what I was talking about, at least a couple did mix it up with Northeastern, as I recall. I myself had never really heard of it until I started my college search process.