WSJ College Rankings

Just got the WSJ paper this morning (yes I still read the newspaper!) and they have a special section. Here are some highlights:

Top 10 Public Schools

  1. University of Florida
  2. New Jersey Institute of Technology
  3. University of Michigan - Ann Arbor
  4. Florida International University
  5. University of Illinois - Urbana-Champaign
  6. Texas A&M College Station
  7. Georgia Institute of Technology
  8. University of Utah
  9. University of Connecticut
  10. Baruch College

Top 10 Preparation for Career (there are 11 though, lol)

  1. Babson College (three tied for first)
  2. California Baptist University
  3. Washington & Lee
  4. Brigham Young University
  5. Rose-Hulman Institute of Technology
  6. Claremont McKenna
  7. Florida Agricultural and Mechanical University
  8. Texas Christian University
  9. The Master’s University
  10. Samford University
  11. Savannah State University

Edit: there are lots of ties in the WSJ numbering but CC is adding them sequentially.

1 Like

Yeah, it is much healthier to see rankings lists as leads, and not final answers.

Among other things, you can then get different leads from different methodologies.

2 Likes

A lot of the “head scratching” over falls and rises in these rankings versus USNWR’s historically prestige-driven rankings comes down to the WSJ methodology of measuring actual outcomes against expected outcomes for a given demographic. Schools composed of a high percentage of demographically privileged students (kids that start at the 20-yard line) have a much higher hurdle to beat. Brown scored only a 75 on the Salary Impact measure, while Providence scored a 92 and the Univ of RI scored an 80. Maybe this indicates that Brown graduates tend to be less pre-professional than their counterparts at the other Ivies (P: 99; Y: 96; Col: 98; H: 97; P: 99; D: 96; Cor: 88), although the difference is pretty striking.

The COLA of the state also has an impact that is rarely measured in other rankings. The years-to-pay-off calculation is not a straight net price comparison: it compares the median earnings of that college's grads vs. high school grads in that state, and uses that difference as the denominator in calculating how long it takes to pay off four years of that college at the average net price.
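The calculation described above can be sketched in a few lines. WSJ's exact data sources and adjustments aren't published in full, so the function signature and the numbers below are purely illustrative:

```python
# Sketch of the "years to pay off net price" idea described above:
# four years of average net price, divided by the in-state earnings
# premium of college grads over high school grads.

def years_to_pay_off(avg_net_price, grad_median_earnings, hs_median_earnings):
    """Years of the earnings premium needed to cover 4 years of net price."""
    earnings_premium = grad_median_earnings - hs_median_earnings  # the denominator
    return (4 * avg_net_price) / earnings_premium

# Illustrative example: $30,000 avg net price, $40,000 premium over HS grads
print(round(years_to_pay_off(30_000, 75_000, 35_000), 2))  # -> 3.0
```

Note how a high-COLA state with well-paid high school grads shrinks the denominator and stretches the payoff time, even for an identical net price.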

3 Likes

Pure speculation on my part, but I wonder if less pre-professional students might also lead to a somewhat higher benchmark under that Brookings-informed model, squeezing the difference from both ends.

Raise your hand if you’re choosing NJIT over Michigan or FIU over Georgia Tech?

Maybe if you are NMF, but I’m not sure those sets even get the same applications.

Again, not saying WSJ is wrong - I have no clue how to figure out what the right criteria are, or how to adjust them for different cost-of-living scenarios - but of course, no one will trust it.

Doesn’t make it bad. Some people might say - you know - I can save a boatload of money and actually do just fine - and that’s fair.

I suspect the vast majority of readers of the WSJ college rankings are not poring over the methodology to determine how it got its results. They will just take the results at face value, measured against their own notion of the “best college.” If they don’t believe it, they will probably consider the entire ranking list trash. If there is something they like about the list, they will be more inclined to accept the results (or at least a portion of them). I guess that also applies to every other ranking list.

The methodology is BS. There, I said it.

First, because it’s WSJ, “outcomes” are seen purely in dollar signs, because for the good people at WSJ, only money matters. So colleges that produce a huge number of academics are going to be ranked much lower than colleges that produce a large number of hedge fund managers. Does that make the former “worse” by any measure? Only in the money money money money world of the people who created this ranking system.

Second,

Exactly how did they decide “what those students were likely to achieve”? I’m sure they have their “model”, but I’m sorry, any model which claims to know what the alternative present would be for any student is postulating out of their posterior. This is exactly the type of factor that is easily manipulated to get the results that the people who created the rankings wanted, which is keeping the elite colleges at the very top.

Third, I’m interested in where they got their salary data. I could not find that, and this data is always suspect.

Then there is diversity.

How did they get “economic diversity”? They don’t explain, and it’s another factor that I find suspect. I don’t see why Stanford gets a high ranking on this compared to, say, Harvard.

Then of course, the BS of:

Exactly how did they solicit these survey responses, and which kids are responding? I am 100% certain that many of the colleges ranked low this year are there because they didn’t have enough students respond. This is another place, like “student income in an alternative universe,” that is easily manipulated to achieve the results that the people who created the rankings wanted.

But they used DATA, they used POWERFUL STATISTICAL TOOLS, right?

The WSJ, by dint of immense effort and calculation, managed to produce a ranking that is even worse than USNews.

I would trash them more, but I refuse to have to pay for a subscription in order to access that garbage.

10 Likes

Which is exactly the point that drives the USNWR rankings, and a big reason for their overwhelming success. Their internal mandate, from the start, has been to keep those colleges at the top that you’d expect to see at the top based on preconceived notions of which ones are the best. Hence the list has credibility in the minds of people. And then they tweak things every few years to keep things interesting and shuffle a few colleges just a little bit, but always making sure the top 10 doesn’t get affected much.

2 Likes

But it gets to quality of student (and overall learning environment)


1 Like

Perhaps the other way around, since graduates immediately go on to military officer jobs.

It’s why I asked. Do these rankings recognize military service as employment? I honestly don’t know. I think they should.

There is a state-by-state comparison, so my interpretation is that in Louisiana, going to Tulane and working in LA afterwards, the salary bump from a Tulane degree is not as high, whereas Lehigh gets a significant bump relative to PA high school grads and to others in PA. I disagree with most posters saying this is a bogus ranking; it is actually an approach that conveys the relative value of the degree. It offers a different perspective than other ranking methodologies, which is OK. I agree with some posts that the methodology favors computer-science-heavy schools, but no ranking is perfect.

As for the comments regarding political motivation: it may be intentional, to give state schools and others a second look vs. the usual Ivy-driven approach. Again, those are for the 1% anyway.

I would also add another metric, which the NY Times did many years ago, and that is the wealth of parents. Take, for example, Wake Forest or Middlebury, and overlay that onto outcomes, because the parent network (friends, etc.) probably has a spillover effect on their offspring, hence a better outcome, like getting your kid a higher-paying job from a friend who owes you a favor.

It is interesting, though, that for schools like Hopkins vs. a Wash U or Vanderbilt, the disparity is stark.

This methodology is very odd - how is it that schools with fairly low graduation rates are ranked so highly? Is there any consideration of major for salary outcomes? It is such a mixed group of schools that it is hard to compare. And what is the explanation for a school such as Tufts dropping over 200 spots in their rankings?

2 Likes

Tulane is running some kind of racket.

Just because you don’t like the methodology doesn’t necessarily mean it’s wrong. On another topic: money rules the world. We may not like it, but it does.

3 Likes

Military service IS employment. How could anyone think otherwise? Every enlisted/officer is doing a job that aligns with civilian work (mechanic, pilot, accountant, doctor, linguist, law enforcement, translator, cook, chemist, lawyer, logistics, programmer, etc.). Our son manages a team of developers/hackers. On what planet are any of these not “employment?”

(Sorry, off topic, but sheesh)

1 Like

From WSJ regarding methodology:

70% - student outcomes, which includes salary differential/bump and relative debt
20% - learning environment/student experience
10% - diversity

So I would say Middlebury was unaffected by the diversity metric.

Middlebury #131 for overall rank, #179 for salary impact, #285 for social mobility, #195 student experience

Contrast to other Liberal Arts Schools
Amherst #8 for overall rank, #19 for salary impact, #54 for social mobility, #41 student experience

Bowdoin #89 for overall rank, #138 for salary impact, #216 for social mobility, #95 student experience

Williams #31 for overall rank, #76 for salary impact, #116 for social mobility, #171 student experience

Colgate #40 for overall rank, #21 for salary impact, #348 for social mobility, #268 student experience

Hamilton #88 for overall rank, #97 for salary impact, #203 for social mobility, #257 student experience

Pomona #49 for overall rank, #145 for salary impact, #136 for social mobility, #9 student experience

Claremont McKenna #9 for overall rank, #8 for salary impact, #125 for social mobility, #60 student experience

My interpretation: a school with a wealthy student body, as defined by parental wealth, will have a poor social mobility score. If a wealthy kid goes, they stay wealthy; if a poor kid goes, they won’t move up the chain.

Tufts is #318 for salary impact and even lower for social mobility. It’s methodology- and metrics-driven, with the all-important “prestige” factor all but eliminated. Watch those applicant numbers drop for Tufts.

So looking at the Tufts score breakdown vs. Wesleyan (#73 overall):

Under Student Outcomes:
Salary Impact: 50 vs. 82
Graduation Rate: 75 vs. 90
Net Price Payoff: 3Y 2M vs. 2Y 4M
Avg Net Price: $35,683 vs. $26,080
Value added (diff. between grad earnings and HS grads for that state): $44,430 vs. $43,547

So based on the data and the criteria the WSJ was using, Tufts students as a whole were significantly under-indexing expected earnings and graduation rates given their student body demographics. Tufts’ average net price was also significantly higher, while its value added was less than $1,000 more.
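As a sanity check, the quoted payoff figures are consistent with the formula described earlier in the thread (four years of average net price divided by the value-added earnings difference). The truncation to whole months is my assumption about how WSJ rounds:

```python
# Checking the quoted "Net Price Payoff" figures against the formula
# described earlier: payoff = 4 * avg net price / value added.

def payoff_y_m(avg_net_price, value_added):
    years = 4 * avg_net_price / value_added
    whole = int(years)
    months = int((years - whole) * 12)  # truncate to whole months (assumed)
    return f"{whole}Y {months}M"

print(payoff_y_m(35_683, 44_430))  # Tufts    -> "3Y 2M"
print(payoff_y_m(26_080, 43_547))  # Wesleyan -> "2Y 4M"
```

Both computed values match the figures in the breakdown above, which suggests the payoff column really is just net price divided by the in-state earnings premium.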

2 Likes

Just using Tufts as an example, here is the WSJ explanation of the graduation rate methodology; it is not clear what this means: “Graduation rate versus similar colleges (20%): This is a measure of a college’s performance in ensuring that its students graduate, beyond what would have been expected of the students regardless of which college they attended. We used statistical modeling to estimate what we would expect a college’s graduation rate to be on the basis of the demographic profile of its students, taking into account the factors that best predict graduation rates. We then scored the college on its performance against that estimate. These scores were then combined with scores for raw graduation rates to factor in absolute performance alongside performance relative to our estimates.”

Tufts’ raw graduation rate (not its ranking compared to similar schools) is in the mid-90s; how can that then be scored as a 75?
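The quoted methodology can produce exactly that effect. Here is a minimal sketch of the "relative plus raw" blending it describes; the prediction model, scaling, and weights are WSJ's and are not published, so the linear scaling and the 50/50 blend below are entirely assumed, for illustration only:

```python
# Minimal sketch of the blended graduation-rate score quoted above:
# a relative component (actual vs. modeled expectation) combined with
# the raw graduation rate. The scaling and 50/50 blend are assumptions.

def blended_grad_score(actual_rate, predicted_rate, blend=0.5):
    # Relative component: matching the prediction scores 50, and each
    # point above/below expectation moves the score 2 points (assumed),
    # clamped to the 0-100 scale.
    relative = max(0.0, min(100.0, 50 + 2 * (actual_rate - predicted_rate)))
    return blend * relative + (1 - blend) * actual_rate

# A school graduating 94% of students when the model already expects 96%
# scores well below its raw rate under this kind of blending.
print(blended_grad_score(94, 96))  # -> 70.0
```

Under any scheme of this shape, a school whose demographics predict a near-perfect graduation rate gets little credit for a mid-90s raw rate, which would explain a score in the 70s despite the high absolute number.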