WSJ College Rankings

Yep - and I think this was @billythegoldfish, whose posts I love and whose process I enjoyed following. He was worried his daughter wasn’t taking the highest-ranked option but got comfortable with UGA at 49 - but if someone were to say it’s - and I don’t know the number - 100 or 200, it’s like, crap - well, we still have US News.

It’s natural.

Solution - send your kids to colleges not ranked high to begin with :slight_smile: OK - they chose them - but in this case, it works.

I remember seeing my daughter’s college the other year - was like 450th or 500th in WSJ. Wasn’t sure I could name that many colleges.

2 Likes

If you don’t like the WSJ rankings, perhaps you’d prefer the ranking from Forbes -

1 Like

I like em all :stuck_out_tongue_winking_eye:

SDSU ahead of Bowdoin, Cal Tech. And many UCs. Hmmmm. Tufts 55 and CMU 59?

Where’s the outrage :slight_smile: NEU 85, Wake 86. The horror.

Honestly - one underdog that is very consistent amongst all three surveys, so perhaps is truly legit - UF.

While it was stunning to see them right there with perceived US News top tier publics - they appear to be top 30 in all three surveys.

While I don’t agree with some placements on this list, it generally aligns with accepted reality as I see it. It’s not the hot mess of nonsense that the WSJ list represents. Of course, that’s just my opinion.

Perhaps you missed my previous note.

My problem with UF is its very large class sizes and the many undergrad classes being taught online. This should be considered in the rankings. That school runs like a factory.

2 Likes

When you say factory, do you mean efficient? :grin:

My son felt the same way - big and impersonal.

I could do the same on this one.

“Rutgers over Middlebury?”

“Florida State over Haverford?”

“San Diego State over Bowdoin?”

Funny thing is, this thread, which seems to be pointing to the WSJ ranking as the worst of all-time, is not that different from the other threads on rankings.

People find a school that they believe inherently is better than another school, see the inverse ranking, and cry out loud, “this can’t be!”

I mean, if you know in your heart that Dartmouth is a better school than, say, University of Florida, why ever look at a ranking? Just know it and be on your way.

If the people complaining had to take truth serum and reveal what they’re actually thinking, it’s this:

“I have an idea in my head about the best schools and their relative ranking. Any ranking that doesn’t comport with that idea is objectively wrong.”

If that’s your thought process, and btw I’m not being critical of that thought process, then just ignore all rankings. Why chase the “right” ranking when you already know what it is? It makes no sense. Is the confirmation that important?

It’s also funny to me to see posters (and again, I’m not hitting on anyone here in particular) who consistently and constantly decry the rankings and the very idea of a pecking order, and then bolster the validity of their disgust by saying things like, “I mean, how can you have an LSU over a Wake Forest?” Even those people have a pecking order in their heads, and they don’t seem to realize that they’re shouting it out loud. So, then, even those who decry the rankings ultimately believe there is a ranking; it’s just not ok for news outlets to publish their own.

7 Likes

Or more selective doesn’t equal better.

2 Likes

The thing that is always interesting about any of these rating discussions is how people compare the results against their expectations. The very act of doing so implies some shared acceptance of a hierarchy that exists independent of any ranking. Yet if that is so, what’s the point of any ranking other than to validate that preexisting perception?

And did US News help shape that perception initially, or was it always independent of any ranking system (and does that answer vary by institution)? If shared perception is truly independent of rankings, then the rankings seem particularly pointless. Even if we accept their own caveat that they reflect the objective results of a particular methodology, but then discount any methodology that doesn’t align with these shared conventional-wisdom assumptions, again there’s no point. Which may be why US News and some of the other rankings seem to keep tweaking their methodologies to assure, within a reasonable margin of acceptance, that their results align with these perceptions, rather than truly letting the objective chips fall where they may. Which raises the question of whether the rankings really impact perception, or whether the rankings are merely tailor-made to double down on and validate preexisting perceptions - and any ranking that does not do so is immediately dismissed.

Putting the Ivies aside because that brand meant something (more than a sports league) long before US News started its ranking in 1983, what makes us all agree that LACs like Williams, Amherst, Swarthmore, Pomona and Bowdoin (among others) are “Little Ivies,” or that UCLA, Berkeley, UMich, UVA, etc. are “Public Ivies”?

Have Wesleyan’s and Reed’s reputations as full peers of Williams and Amherst gradually slipped a little because they get less love from US News, so that one ranking can at least move the needle within a cohort or limited range? Or has it had no effect? In which case, again, are the rankings truly meaningless other than for pointless debate? Clearly the colleges themselves don’t think so, or they wouldn’t invest so much effort into playing the US News ranking game.

Anecdotally, there seems to be a fairly high correlation between admission rates, reputation, and conventional wisdom on quality (and US News ranking). It would be interesting to see this validated more systematically. Certainly there are some exceptions, plus there are colleges that successfully game acceptance rates. Though that itself is interesting and suggests that, like US News, the colleges themselves have determined that acceptance rate is a commonly perceived proxy for quality in many students’ and parents’ perceptions. If so, it sure would save a lot of trouble not to bother with rankings and just look at acceptance rates as the free market speaking. Or perhaps yield rate is even better, or some simple combination of the two, and be done with it.

2 Likes

But for many, selectivity is important, and that is not just due to snobbery. People have very valid reasons for valuing selectivity over other qualities. So when they see ASU ranked well ahead of Vanderbilt, they are naturally going to bristle or, better, ignore it completely. The level of the student body at ASU is not in the same league as that at Vanderbilt. Some of us are skeptical about the relative importance of major research efforts that are exercised, literally and figuratively, at quite a distance from the undergraduate population and experience, but we value the cohort with which our kids are pursuing their education.

1 Like

Don’t disagree. In the end, any ranking is only as good as its criteria.

1 Like

The Little Ivies have more historical substantiation that well precedes what US News thinks, and originally it was a rather specific designation and didn’t include schools outside the NE (just like the Ivy League doesn’t capture Stanford despite its academic reputation).


There is also a connection between that term and the Little Three and the Maine Big Three.

What makes the WSJ ranking wrong is not that it doesn’t consistently follow expected prestige/USNWR/selectivity. It is instead that the WSJ ranking is based on a formula of arbitrarily selected weights of arbitrarily selected criteria that is nearly meaningless to most individuals reading the ranking. Why is the output of such a formula expected to identify the best college? Or even to have any objective meaning?

For example, suppose a poster on the CC website started a thread saying that they have a formula that can tell you what the best college is: 33% * salary increase for federal FA recipients + 20% * graduation rate for federal FA recipients + 17% * (salary increase for federal FA recipients / cost for federal FA recipients) + 20% * surveys + 10% * diversity. I expect most people on this website would be highly critical of the poster, regardless of which schools did best in this formula. Yet when WSJ does the same thing, it is supposed to correctly identify the best colleges?

I could make a similar comment about the Forbes ranking or almost any college ranking. No formula of arbitrarily selected weightings of arbitrarily selected criteria is going to objectively identify the best college. Having a formula that uses numbers and math might look scientific to some people, but that does not mean the output of that formula is meaningful or correct. The person creating such a formula can choose weightings and criteria to make whatever types of colleges they want appear on top. I expect most creating such formulas choose the weightings and criteria to maximize profit for the company/website. They want the output to not look drastically wrong to readers, but also want it to be different enough from USNWR to get readers to purchase/subscribe/click/refer/…
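To make that concrete, here is a minimal sketch with entirely made-up schools, scores, and weights - nothing below comes from WSJ’s actual data or methodology. It just shows that the same inputs crown a different “best college” depending on which arbitrary weights you pick:

```python
# Toy illustration: the same (invented) data ranked under two arbitrary weightings.
# School names, scores, and weights are hypothetical, not WSJ's real inputs.

schools = {
    # name: criterion scores normalized to a 0-1 scale
    "School A": {"salary_boost": 0.90, "grad_rate": 0.70, "value": 0.50, "survey": 0.60, "diversity": 0.40},
    "School B": {"salary_boost": 0.60, "grad_rate": 0.95, "value": 0.80, "survey": 0.70, "diversity": 0.55},
    "School C": {"salary_boost": 0.75, "grad_rate": 0.80, "value": 0.65, "survey": 0.85, "diversity": 0.90},
}

# Two equally arbitrary weightings over the same five criteria.
weights_1 = {"salary_boost": 0.33, "grad_rate": 0.20, "value": 0.17, "survey": 0.20, "diversity": 0.10}
weights_2 = {"salary_boost": 0.10, "grad_rate": 0.35, "value": 0.30, "survey": 0.15, "diversity": 0.10}

def rank(weights):
    """Return school names ordered by their weighted-sum score, best first."""
    scores = {name: sum(w * data[crit] for crit, w in weights.items())
              for name, data in schools.items()}
    return sorted(scores, key=scores.get, reverse=True)

print("Weighting 1:", rank(weights_1))  # ['School C', 'School B', 'School A']
print("Weighting 2:", rank(weights_2))  # ['School B', 'School C', 'School A']
# Same schools, same data - yet the "best college" changes with the weights.
```

Neither weighting is more “correct” than the other; the output just reflects whatever the formula’s author decided to emphasize.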

2 Likes

It’s more like, “My school isn’t high enough so I don’t like the ranking.” At least that’s why I complain about them.

1 Like

You’re basically saying A and not A. WSJ ranking is “wrong” because it’s arbitrarily constructed. You say you could say the same things about “almost any college ranking.” So they’re all crap. Fine. But then are individual criteria somehow “right” when all the rankings are “wrong”? Who’s to say? The robust opinion activity on this forum is fairly clear evidence that all manner of people have their own criteria. Are some “wrong” and others “right”?

If a critiquing person says the ranking is wrong because they don’t like the ranking’s criteria for this or that reason, then whose criteria is right? Yours? A group of philosopher kings on college confidential? Basically, when I say your ranking sucks because your criteria sucks, I’m also implying that there is “right” or at least better criteria, and if you employed it, you’d get the right list of schools.

Or are you under the impression that there are people out there in sufficient number who only care about the criteria and blindly ignore how that criteria will translate to the list? They are one and the same. If I value research above all, I’m going to expect the usual flagships and other research juggernauts to make up the top group on the list. Prestige? An altered list will follow. Finance and management bro production? New list. Class size and undergraduate focus? A completely different list. PhD production? We could go on for days. The criteria begets the list, and the criteria people value corresponds to the schools they value. Criteria = list. List = criteria.

Dude, the difference is that people generally realize that their own ranking is their own opinion. They mostly organize colleges into vague groups of "these are great colleges in general, these colleges are good for these majors, these colleges are really good deals," etc.

These rankings are all trying to claim that their ranking is The Only One True Ranking System, which accurately ranks every college from Best To Worst.

When WSJ, Forbes, US News, Times, or Better Homes and Gardens posts a list, they all make the same claims:

"This list was created using the latest in data analytics methods, the most advanced AI technology, and are the most objective, unbiased, and pure ranking the the WORLD. It accurately and scientifically determines the true quality of every single college.

We used Data and Science and Analytics and Statistics and Models in our methodology, and we included all the Important Data, unlike all the other rankings which forget to put in Important Stuff, unlike US.

Choose OUR rankings, because we are Superior, and the only ranking which will help you find the Very Best College For Your Kid. There is truly only One Way to rank colleges, and that is Our Way. There is only one way to determine which data to use, only one way to determine which factors to include, only one way to weight each factor, and only One True Model for College Excellence, and that is Our Way. There can only be one Best Way to do this.

There Can Be Only ONE!!!"

2 Likes

I didn’t say individual criteria are right in my post, but I would agree that it is important to review the criteria that matter to a particular individual when comparing colleges. For example, it’s important to consider how much a particular college will cost after FA/scholarships/…, whether a particular college offers a quality program in your desired major, etc. However, this does not mean a website formula that is composed of 23% average cost + 17% average salary + … is going to correctly identify the best college.

If a critiquing person says the ranking is wrong because they don’t like the ranking’s criteria for this or that reason, then whose criteria is right? Yours? A group of philosopher kings on college confidential? Basically, when I say your ranking sucks because your criteria sucks, I’m also implying that there is “right” or at least better criteria, and if you employed it, you’d get the right list of schools.

Many of the WSJ criteria do have critical flaws, which have been discussed. However, the root of the problem is not that the ranking criteria is wrong. It’s the concept of expecting a formula using arbitrarily selected weightings of arbitrarily selected criteria to correctly identify the best college.

Some concepts lend themselves well to formulas or ranking by formula. Two important requirements for a successful ranking formula are a clear definition of what you are ranking and a way of verifying or validating whether your ranking is more or less accurate. For example, college football lends itself well to ranking. There is a clear definition of what is being ranked - the ability to win football games - and one can verify how well the formula predicts that ability based on season game results.
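As a toy illustration of that verification step (the rankings and game results below are invented, purely hypothetical), a football ranking can be scored against something observable - did the higher-ranked team actually win? - which is exactly the scoreboard a “best college” formula lacks:

```python
# Minimal sketch of the validation step that college rankings lack.
# The ranking and game results below are invented for illustration only.

ranking = {"Team A": 1, "Team B": 2, "Team C": 3, "Team D": 4}  # lower number = ranked higher

# Hypothetical season results as (winner, loser) pairs.
games = [
    ("Team A", "Team B"),
    ("Team A", "Team C"),
    ("Team B", "Team D"),
    ("Team C", "Team D"),
    ("Team D", "Team A"),  # an upset
]

# The ranking implicitly predicts that the better-ranked team wins each game,
# so we can check that prediction against the actual results.
correct = sum(ranking[winner] < ranking[loser] for winner, loser in games)
print(f"Ranking predicted {correct} of {len(games)} games correctly")  # 4 of 5

# There is no analogous scoreboard for "best college", so a college-ranking
# formula can never be checked this way - only argued about.
```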

However, if you don’t have a clear definition of what you are ranking (how is “best college” defined?) and don’t have a way to verify or validate the accuracy of your ranking formula, the formula should be considered for entertainment purposes… or as a means of increasing profit for your website/company, which I expect is the primary purpose of best-college ranking formulas.