Tulane crushes it in the US News Rankings

Tied for 54th in 2015. Tied for 41st in the new 2016 ranking, out today.

Have not studied it, but the jump is most likely because the extremely important six-year graduation rate has finally started to reflect the post-Katrina new normal.

A 76% six-year grad rate in last year’s data set, which was heavily Katrina-influenced and really a bad outlier compared with TU’s other metrics. 83% for this year. That’s a massive move.

I was thinking TU might crack the top 50 this year because of the Katrina effect finally wearing off. Moving to 41 suggests there must be some other good stuff going on in addition.

But we really don’t care about superficial things like rankings…

That is the best news ever. I just ran down my hall to tell other kids about it. This is awesome!

EDIT: And the data only gets better, right? It might slip a position or two next year since it’s tied with so many schools, but it’s not going back.

Look, as Fallenchemist and others have commented over the years, the thing is basically a crock. However, nobody denies that many families rely on it. The upshot is that Tulane will have a very strong freshman class next year. If the yield went up a point this past year, with this it’s going up 3-5 points. It will be good for the school, as shallow, arbitrary, and ultimately irrelevant as the underlying data may be.

@NJDad68 That’s the best way to put it. I don’t care about rankings personally, but this is like a massive PR boost for the school.

Agreed. The rankings are silly, but nonetheless it’s nice to see the improvement in ranking.

All I can say is that @northwesty and I nailed it!! Yeah, we are bragging, but not about the jump itself. As long as USNWR didn’t significantly change the formula (and apparently they did not), it was as simple as 7th grade algebra. USNWR weights graduation rates heavily, as compared to, say, admission rates, which are barely a factor. They also do it in such a way that graduation rates from six years after Katrina were still part of the “averaging” they did on this factor until this year, as @northwesty says. Finally, because of the way they also compute a “predicted” rate and judge against that, graduation rates are almost double counted. In fact I think, but don’t recall for sure, that there is still some Katrina effect in one of the ways they count it. Another year before that goes away totally, so still some slight upside.

But we have been saying for about three years to be patient, that Tulane would make this jump. Two years ago I think I said mid-40s, so this is a bigger jump than I thought, but of course one cannot know how other schools will move.
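The lag the posters describe is easy to see with a toy calculation. This is just a sketch of how a multi-year average delays a one-off shock; the window size and every rate below are invented for illustration, not USNWR’s actual inputs or Tulane’s actual data:

```python
# Illustrative only: how a rolling multi-year average of six-year
# graduation rates lags a one-off shock like Katrina. All numbers
# and the 3-year window are hypothetical, not USNWR methodology.

def rolling_average(rates, window):
    """Average the most recent `window` values available at each year."""
    return [
        sum(rates[max(0, i - window + 1):i + 1]) /
        len(rates[max(0, i - window + 1):i + 1])
        for i in range(len(rates))
    ]

# Hypothetical six-year grad rates by cohort: a dip for the
# Katrina-era cohorts, then recovery to a higher "new normal".
rates = [0.73, 0.60, 0.62, 0.74, 0.80, 0.83]

averaged = rolling_average(rates, window=3)
for year, (raw, avg) in enumerate(zip(rates, averaged)):
    print(f"cohort {year}: raw {raw:.0%}, 3-yr avg {avg:.0%}")
```

Notice that even after the raw rate recovers, the averaged figure stays depressed until the low cohorts age out of the window, which is exactly why the ranking bump arrived years after the school itself had recovered.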

So while I will maintain to my dying day that these rankings do far more harm than good and that I would call them garbage even if Tulane hit #1, I will concede that it is good PR and that it is a relief that it will quiet some people. Congrats @northwesty, you definitely were out front on this.

I just looked at their page on how they did this year’s calculations. The only changes were relatively insignificant. They are now using a two year averaging on the peer assessment score instead of only the most recent, and a three year average of guidance counselor ratings instead of two. Personally I find both these metrics to be very weak and nearly worthless, but that is another story.

The peer & HS counselor poll is almost as objective as voting for Homecoming Queen.

FYI
http://talk.collegeconfidential.com/parents-forum/1809465-usnwr-ranking-methodology-the-nuts-bolts-or-is-it-just-nuts.html#latest

Even though I now know the rankings are meaningless, it’s still nice to rub it in the faces of all those people who bragged about PSU being ranked higher.

I guess the thirsty marketing push for applications is working for a ranking “boost”.

Rankings aren’t meaningless – they measure what they measure – and much of the data they compile are totally solid. They bring a little Moneyball-type analytical rigor to what otherwise is a meandering cocktail party conversation.

Tulane’s case has been interesting for the past several years because of a data anomaly – one very important (and totally legit) metric being severely impacted for a long time by a one-off event. That kind of fluke pretty much never happens. Now that the anomaly is fading out of the data, Tulane is pretty much where you would expect it to be.

If you are a kid/family shopping for a midsize private urban college of a certain selectivity, the ranking list will identify some candidates for you – #37 Case, #41 BU, #41 Tulane, #47 Northeastern, #51 Miami. If you want to reach a bit higher, then the rankings would steer you to considering, say, BC, USC, Emory, NYU or Vandy. Pretty reasonable/logical to me, not random.

FWIW, the acceptance rate is a very small piece of the formula. Much more important are the qualifications of the students who actually enroll, and what the outcomes for those students are (i.e. retention rates and graduation rates).

It’s a shame Tulane was so punished by the Katrina effect. If anything, USNWR should have given them a pass. In fact, USNWR should have given them a boost for the incredible effort to rise from Katrina. The rankings have been frustrating the last few years considering the cost and quality of education my child is receiving at Tulane. I know, I know, it’s meaningless, but it’s still respected and used as a means of comparison. Go down that list and you’ll see Tulane is a top 25 school, and Fitts will get them there. No way USC, Wake, and a horde of others should be far ahead of Tulane…Go Wave!!

“The rankings have been frustrating the last few years considering the cost and quality of education my child is receiving at Tulane.”

SoCal – Be careful what you wish for. Since my kid is about to graduate, it’s all good that TU’s ranking is now going up. It would be more of a mixed bag if I had a kid in HS interested in TU, because as the ranking goes up, the school gets harder to get into and harder to pay for. That nice Tulane merit aid is more of a top-50-school thing; good luck finding merit aid in the top 30.

You and I have always disagreed about that, which is fine of course. OK, I guess I wasn’t the one that used the word meaningless, but as people tend to use the rankings, that description isn’t far off to me. I find it very irritating, at best, that USNWR has gone to a lot of trouble the last few years to say all the right things after the sensational headline of “THE BEST COLLEGE IS…”, such as “just use this as a tool,” “nothing can really measure atmosphere,” etc. Sure, if the USNWR formula just happens to measure the exact parameters you would use on your own, or even comes close, then it is useful. I just find that unlikely. It creates a completely false sense of “prestige whoring” that has infected people worldwide, and it has led to institutions making policy decisions based on what is best for the ranking game and not the school itself (unless you believe that being higher in the rankings is a worthy goal in and of itself, which rather proves my point). Then there are the various instances of actual cheating and lying, including Tulane, though only for the MBA rankings and by one rogue employee. There are documented cases for the undergrad rankings as well, and confessions from others that they purposely game their peer reviews to benefit themselves within their competitive group.

Data is data, so why not take the data they do gather and put it in tables for people to make use of, rather than creating an artificial formula with artificial weighting factors and trumpeting the false notion of a “best school,” or even a general ranking of these schools? I do agree that the list is roughly correlated to selectivity, maybe even more than roughly, with the exception of schools like UC Davis and a few others. For years Davis had a 25/75 SAT range of around 520-620, well below Tulane’s, but they still outranked them. Same for some other UC schools. It is because of the way the UCs choose to report their data, which is not the same as most other schools do it, yet USNWR refuses to do anything about it. I believe their reported scores have improved lately, but I have extreme skepticism regarding what they report. In any case, one can easily look at a list of selectivity by academic scores, percent admitted, or some combination of those two without throwing in everything else USNWR does. If that is all that is important to someone, or it is their primary place to start narrowing down the list, that’s fine.

BTW, the reference above to a “marketing push” is nonsense, for two reasons. First, Tulane started marketing very heavily years ago and it obviously did them no good in the rankings. Second, and more to the point, this increase is virtually all due to the retention and graduation rates going up. A lot. Especially compared to how USNWR was calculating them using Katrina data, when the rates were low to not even reported. Marketing to potential freshmen has nothing to do with that. It is an absurd statement.

So not trying to rain on this parade, but bottom line: is Tulane substantively different today than it was last year when it was 54, or the year before that at 52, etc. etc.? Of course not. Granted it is a particularly singular case because of Katrina and the data issue that USNWR refused to recognize or adjust for. But the things that have truly gotten better about Tulane IMO (which is the whole point, it might not be someone else’s opinion yet it has a ton to do with the school and whether you want to go there) have to do far more with service, with new majors, with changes in the dorms, and hundreds of other things since Katrina and these things are not measured. Sure some of these might get reflected in better retention and graduation rates, and move them in the USNWR rankings some, in theory. Or not. That is hard to predict. The time it takes for these things to get reflected in outside assessment surveys is notoriously long, assuming those people even recognize them as improvements. Said in a simpler way, reputations die hard.

I would still fail to see how a “one size fits all” formula, translated into convenient list form obviously designed to serve some superficial psychological need and to sell magazines, is truly useful to most. The psych need I refer to is that well-known marketing concept of making the consumer feel better about their choices, both pre- and post-sale. But to be PERFECTLY CLEAR, I am not oblivious to the PR reality. If this helps Tulane in some sort of virtuous cycle (better ranking gets even better students, which leads to better rankings, etc.) then great. Not because it keeps moving them up in the ranking per se, but because they get academically more talented students, which, if coupled with their focus on getting a critical mass of students that are also dedicated to service, is IMO a good thing. Not the first time the right thing happens for the wrong reasons. But if I were given the choice between that result or doing away with the rankings, I would choose the latter. Publish the data, by all means. But stop with this ridiculous GIGO that they do with the data.

Here’s why rankings should neither be ballyhooed nor berated:

Let’s say US News has figured out most of the variables we should consider when judging the overall quality of an undergraduate education.

Just grant them that for a second.

Now… how do you weight those variables?

That’s the rub: what I think is most important, you might not. Thus, even if we agreed on all the factors, we might all produce different rankings because we did not consider them to be of exactly the same importance.

So – it would not be terribly difficult for each of us to make our own rankings based on the US News formula’s structure… but tweaking the percentages as we see fit.
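The reranking idea above can be sketched in a few lines. Every school name, metric, metric value, and weight below is invented purely for illustration; this is not USNWR’s actual formula, data, or set of factors:

```python
# Sketch: the same raw metrics produce different "rankings" under
# different user-chosen weights. All values here are hypothetical.

# Each school's metrics, normalized to 0-1 (made-up numbers).
schools = {
    "School A": {"grad_rate": 0.83, "peer_score": 0.70, "resources": 0.60},
    "School B": {"grad_rate": 0.78, "peer_score": 0.85, "resources": 0.75},
    "School C": {"grad_rate": 0.90, "peer_score": 0.60, "resources": 0.55},
}

def rank(schools, weights):
    """Return school names sorted by weighted score, highest first."""
    scores = {
        name: sum(weights[metric] * value for metric, value in metrics.items())
        for name, metrics in schools.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Two families with different priorities get two different orderings
# from the exact same underlying data.
grad_focused = rank(schools, {"grad_rate": 0.7, "peer_score": 0.2, "resources": 0.1})
rep_focused = rank(schools, {"grad_rate": 0.2, "peer_score": 0.6, "resources": 0.2})
print("graduation-weighted:", grad_focused)
print("reputation-weighted:", rep_focused)
```

The two printed orderings differ even though the inputs are identical, which is the whole point: the single published list just bakes in one particular choice of weights.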

Hence my call for publishing the data tables only. Now, not to push too hard on your “theoretical”, but the data would have to be clean, meaning collected and reported in exactly the same manner by and from each school. As far as the surveys, well IMO that is hopeless, but that’s another argument.

@fallenchemist - well said. I would strongly prefer the data points and assign weights myself to make a ranking meaningful for my son and his search.

Northwesty,
Rice gives great merit aid.

“Northwesty,
Rice gives great merit aid.”

Sorta, but not so much.

Although there are always exceptions, the business model is that a school’s merit aid program generally declines as its ranking rises. Also, the higher you go, the stronger you have to be to get an award, since the merit money flows primarily to above-average applicants. As follows:

Case – 58% of students get merit aid; average award is $21k.
Tulane – 39%, $21k.
Miami – 31%, $20k.

One level up:

USC – 27%, $18k
Rice – 25%, $12k
Wake – 18%, $17k
BC – 3%, $18k
Emory – 6%, $18k
WUSTL – 18%, $7.5k

I think USC has a fair amount of merit aid as well, and I do not think it has decreased just because they have risen dramatically in the rankings. In any case, speculating on what Tulane may or may not do should they continue to rise in the rankings is just that, speculation.

To the best of my knowledge, most of the private schools in the upper part of those rankings haven’t really changed their policies over the years; they simply never had much merit aid, if any. To prove your point you would have to show that these schools actually changed their policy from when they were ranked lower, if they ever were.

Clearly, though, most if not all of those schools with huge endowments and high rankings have dramatically improved their need based aid and are far more affordable to middle class and lower families than when these rankings started. Under the spirit of the proposed theory, why would they do that if they are already at the top of the heap?

Edit - we cross-posted and you included USC, and even though you put them a tier above Tulane (rightly so in the ranking scheme), I would say their merit aid is pretty close to the Tulane/Miami model. Again, the key is whether they are now going to decrease it, since they might perceive they no longer “need” this kind of help getting top students. I rather doubt it myself, but we will see.