Northwestern and USNews trade insults online over college rankings

Northeastern is, to my mind, the most remarkable example of a university that “gamed the rankings.” http://www.bostonmagazine.com/news/article/2014/08/26/how-northeastern-gamed-the-college-rankings/

Back when I was in college, Northeastern was no one’s top choice, a safety school at best. It played the ranking game, according to the article linked here, and now people think it’s at least a respectable school (although some say it’s still quite ugly; I wouldn’t know, as I haven’t seen it).

And the name “Tufts Syndrome” had to come from somewhere; schools game the system. Two guys from Northwestern have now talked openly about it, but who didn’t already know? Even the good folks at USNWR must know.

USNWR will continue to be the most used and consulted ranking for the foreseeable future. Period. The only college that is ecstatic with it is Princeton. If prospects don’t want to use it, nobody forces them, so this is all an exercise in futility and frustration for every college not ranked #1. It is what it is.

There we go with the clickbait article on Northeastern, lol. The huge bump they got came when US News changed the four-year graduation metric to six years, which makes sense for colleges with five-year programs. Also, the entire college population is now much more focused on internships than ever before (wisely, since internships help land that first job, and getting a job right out of college is hugely important to most students because of college loans). This made NEU a big draw. And yes, they game the rankings with international admits (as do many, many other schools), since those students “pay the bills” and are almost always full pay.

US News should publish the Common Data Set (CDS) submitted by each college alongside its ranking. Many colleges now hide the CDS from their sites and show only the stats they want.

@PengsPhils That turns the various magazine rankings upside down!

Prof99, the general public has no influence over the trajectory of a college student’s future; its opinion is worthless to a college-bound student. I can think of only two opinion polls that would make sense: academe’s (for those who wish to apply to graduate school) and corporate America’s (for those who intend to work after college), although the latter should be broken down by industry.

Anything that can be measured, and hence turned into a metric, can by definition be gamed. The notion that “output-based metrics” cannot be gamed is silly. You create a ranking that gives weight to Nobel Prize winners, National Academy members, or the number of Rhodes and Fulbright scholars among the students? Seems like a game-proof ranking, right? Not really. A rich college with enough determination can use its financial resources to lure those profs away from a college that has them; Stanford built up its strengths in various areas by raiding other colleges. Or it can spend enough resources training its good students to become extremely competitive in those competitions.

Money drives everything in higher education now. In essence, it doesn’t really matter what your metrics are: a “well-endowed” university that decides to focus on them can improve its numbers fairly quickly.

@Alexandre - both of the opinion groups would be self-serving and biased.

Graduate-school academia will overvalue its own experiences and reinforce its own brilliance. Any such ranking would have to be weighted by the number of respondents and the potential universe of undergraduates from each institution. It’s easy to know that Caltech is a great school, but there are so many small pockets that would be ignored (and shouldn’t be). How would Olin do? That’s arguably a top 10 / top 20 engineering school the same size as Caltech, and it would be completely ignored by “those who’ve climbed the academic ladder.” You can probably count the Olin PhDs on two hands right now, but that doesn’t mean it isn’t the best program going.

Ask hiring managers in Philadelphia about Reed, or in LA about Bates, and you (generally) get blank stares. The flagship schools will climb to the top of every chart, as their graduates (in larger numbers) reinforce the value of their own educations.

Rankings only work because they play on the insecurities of a public so worried about making a mistake that it overlooks going where it feels it belongs. All rankings are flawed, but that doesn’t mean we don’t want our own decisions validated by them. The favorite rankings listed above (Economist, Forbes, US News, Washington Monthly, etc.) are the go-to justification when friends and family ask about the colleges our children attend. We all use the ranking that puts our children, and our decisions, near the top of the list.

@pupflier: "You create a ranking that gives importance to “Nobel Prize winners” or “National Academy members, or students with most number of “Rhodes and Fulbright scholars”?”

Well, note that my tiers measure only what alums do, not profs. And yes, colleges could “game” such a ranking by turning out very successful grads, but considering that that is what I consider the main goal of undergrad education, how is that a bad thing?

The gaming is bad only if it is misleading: for instance, massaging the numbers, or reporting test scores for only a fraction of the student body so that the reported averages give a misleading impression of the aptitude of the student body overall.
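
A quick illustration of that second kind of gaming, with completely made-up numbers (not any real school’s data): if only the top three-quarters of scores get reported, the “average” drifts well above the truth.

```python
import statistics

# Hypothetical SAT math scores for ten students (invented numbers).
all_scores = [520, 560, 590, 610, 640, 660, 690, 710, 740, 770]

# Suppose only the top 75% of scores end up reported (e.g., a
# test-optional policy where mostly the high scorers submit).
reported = sorted(all_scores)[len(all_scores) // 4:]

print(statistics.mean(all_scores))  # 649    -- the true average
print(statistics.mean(reported))    # 676.25 -- the reported average
```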

@Much2learn: Note that the biggest component I use is “American Leaders,” a measure of alums who have reached high positions in business, government, the arts, science, media, etc.
Furthermore, you can look up salaries on College Scorecard, but one problem that hasn’t been solved is that salaries can and do differ a lot by occupation, major, and geography (while schools differ in their mix of majors and, obviously, in geography). If someone could normalize by those factors, then comparing by salary might be useful, but I haven’t seen that done anywhere yet.
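
Here is a minimal sketch of what such a normalization might look like, at least for the major-mix part. All of the benchmark salaries and major shares below are invented for illustration; College Scorecard does not publish data in exactly this shape.

```python
# Sketch: mix-adjusted salary comparison. All numbers are hypothetical.

# National median early-career salary by major (invented benchmarks).
benchmark = {"engineering": 70000, "business": 55000, "humanities": 42000}

# Each school's share of grads by major, plus its reported median salary.
schools = {
    "School A": {"mix": {"engineering": 0.6, "business": 0.2, "humanities": 0.2},
                 "median": 62000},
    "School B": {"mix": {"engineering": 0.1, "business": 0.3, "humanities": 0.6},
                 "median": 50000},
}

for name, data in schools.items():
    # Salary we'd expect from the school's mix of majors alone.
    expected = sum(share * benchmark[major]
                   for major, share in data["mix"].items())
    # Ratio > 1 means grads out-earn what the major mix alone predicts.
    print(f"{name}: expected {expected:.0f}, actual {data['median']}, "
          f"ratio {data['median'] / expected:.2f}")
```

On raw medians, School A looks better; but relative to what each school’s major mix predicts, School B’s grads do at least as well (ratio 1.03 vs. 1.01). That is exactly the kind of difference raw salary tables hide.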

Something like this? https://qz.com/498534/these-25-schools-are-responsible-for-the-greatest-advances-in-science/

The weaknesses of the op-ed, in my opinion:

Too many hypotheticals: they say a school theoretically could count a $20,000 donation as coming from 20,000 different people. Or have its top officials call up colleagues at other schools and collude to promote themselves and take down some other school. Or report the SAT math scores of domestic and international students next to the SAT verbal scores of domestic students only. The authors don’t even say these things have happened, just “what if” they did or that “it’s tempting” to do so.

Too much assumption of nefarious intent: yes, one reason schools may go test-optional is to boost reported test scores. But they may also genuinely want to deemphasize test scores. And yes, that one school may “allegedly” have sent $10 bills to alumni to raise alumni participation and thereby its ranking. Even if it did, maybe that’s a way to actually raise money, getting some alumni who wouldn’t otherwise have contributed to send back more than $10. (And are such things really widespread rather than anecdotal? If so, where’s my $10 bill?)

Even some of the things mentioned that really are quite bad don’t seem likely to be common. Have we really heard about numerous selective schools calling up waitlisted students in early April and demanding a commitment before hanging up the phone? To the extent this happens, some unscrupulous schools get a few more high-stats or otherwise desirable students and slightly lower their admit rates. Is this having a big effect on the reliability of rankings?

I do assume that concern for rankings is one major factor (though not the only one) in a lot of what schools do now: encouraging as many applications as possible, emphasizing Early Decision, admitting students for later semesters, etc. And some schools are doing questionable manipulations of data, and a few are doing some of the flat-out dishonest things (even the hypothetical ones) suggested here. I just don’t think college officials are commonly doing the worst things mentioned in the op-ed. If someone asks, “How do you know they’re not?”, I don’t. But it’s up to those accusing someone of something to make the case that it happened, and anecdotes about unnamed or hypothetical schools don’t make a great case.

The authors imply that college officials frequently have the scruples of used-car salesmen, which manages to insult both college officials and those who deal in used cars. There are certainly plenty of problems with rankings and how they affect what colleges (and students) do. I just think the problem is not so much dishonesty as a warping of emphasis toward things that may not be that important.

@Northwesty I guess I don’t see American Leaders as an effective proxy for salary, just as I wouldn’t see counts of billionaires or Nobel laureates as very meaningful. Neither is likely to reflect what happens to the average or typical student, and that is what I want to understand. Average salaries tell the most about that.

As for average salaries and graduate-school acceptances by school or major, a few top schools provide the data: Harvard, Penn, and MIT, and I am sure there are a few others. I know Chicago provides a lot of graduate-school information, but I am not sure it provides salaries. At a minimum, breaking out a few groups like engineering, business, computer science, and nursing would improve the level of information significantly.

This information would be a huge help to students and parents if all schools were required to provide it. I think it should be included in a Monroney-sticker-like disclosure so everyone is clear on current outcomes. That would be a big improvement; currently, only education geeks have even a decent understanding of this.

It is ridiculous that students and parents are forced to make such an expensive purchase with almost no outcome information to compare among schools.

Just in: the Times Higher Education World University Rankings.

https://www.timeshighereducation.com/world-university-rankings/2018/world-ranking#!/page/0/length/25/sort_by/rank/sort_order/asc/cols/stats

The US universities in the top ten are:

Caltech
Stanford
MIT
Harvard
Princeton
UChicago

Penn, Yale, JHU, and Columbia (in that order) are in the 10-20 group.

The problem with this approach is that you are attributing this person’s success almost exclusively to their four years at an undergraduate institution. Say a person who went to Harvard as an undergrad wins a Nobel Prize. What does that really tell you about Harvard? Do you really want to argue that the four years at Harvard are what “made it possible” for this person to win the Nobel Prize?

Correlation does not equal causation. It would be very hard to measure how any undergrad education contributed to that success, and attributing all of it to the university is just silly. Parenting, perseverance, and personality probably have more to do with a student’s achievement than association with a university does.

Also, a student who is good at physics and wants to pursue physics will, in all probability, seek out schools reputed to be good in physics. How does a school get a reputation for being good in physics? Well, spend money to acquire great profs and labs, and use them to do research and publish. In short, whether a school gets good at something in higher education is increasingly determined by how much money it is willing to throw at the problem. It is not a sufficient condition, but it is a necessary one.

So the rich, well-endowed school that finally landed this student benefited from selection bias. He then goes on to win a Nobel Prize. So what? The only way you can give the school the credit is if you can show that this student would not have won the Nobel Prize had they gone to Podunk State University.

You can fool yourself into believing that “output-based rankings” are more accurate. They are just the flip side of the same coin, whose other side is “input-based rankings.”

All of these metrics, whether input-based or output-based, are proxies for measuring parental and institutional wealth in one way or another.

What these rankings really tell me is: it’s good to be born rich and go to a rich school, and if you are not born rich, try to get admitted to a rich school.

Given that, there is no need to get all twisted up about any one ranking. If you don’t like one, try another; if you don’t like any of them, make your own.

Also recent: the QS World University Rankings. In the top 20 in the US: 1. MIT, 2. Stanford, 3. Harvard, 4. Caltech, 9. Chicago, 13. Princeton, 14. Cornell, 16. Yale, 17. Johns Hopkins, 18. Columbia, 19. Penn.

MIT, Stanford, Caltech, Harvard, UChicago, and Princeton are pretty consistently the top six in the US.

I guess Schapiro will be unhappy with the Times ranking as well :slight_smile:

@Chrchill In the latest THE ranking Penn was in the top 10. Last year it was in the 10-20 group.

@Penn95 Yes, congrats. Ahead of Yale and Columbia.

“I guess Schapiro will be unhappy with the Times ranking as well”

NW is at 20, which seems reasonable for a world ranking. There has to be a graduate component to this; otherwise, how are 88 schools ahead of Dartmouth and 49 ahead of Brown?

Saying this is a trash ranking is an insult to trash (if I may borrow from John Cleese in A Fish Called Wanda).

@theloniusmonk PLEASE look at the methodology! The worldwide rankings focus heavily on research output, and they do not adjust for the size of the school.
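
To see why that matters, here’s a toy example (invented numbers, not real citation counts) of how dividing raw research output by faculty size can flip an ordering entirely:

```python
# Hypothetical schools: (name, total citations, faculty count).
schools = [("Big U", 500_000, 4000), ("Small C", 90_000, 400)]

# Rank by raw output, then by per-faculty output.
by_raw = sorted(schools, key=lambda s: s[1], reverse=True)
by_per_capita = sorted(schools, key=lambda s: s[1] / s[2], reverse=True)

print([s[0] for s in by_raw])         # ['Big U', 'Small C']  (125 vs 225/head)
print([s[0] for s in by_per_capita])  # ['Small C', 'Big U']
```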