I’m not going to Tulane, nor did I apply, but I found its situation odd. It’s a selective university (27% acceptance rate I think?), has high admission standards, and in general just has the markings of a higher ranked institution, yet it’s ranked 54 on USNWR. Yes I know that USNWR isn’t a god when it comes to determining universities, and that 54 is still actually pretty high, but still, seems like Tulane should be higher.
Well, I agree in the sense that if rankings had any validity, Tulane should probably be higher. But even by saying that, I am presuming some common definition of what we are trying to measure. In truth, you have to get really philosophical and a bit abstract and ask questions like “What does it mean to be a great university?” and “What exactly is one trying to rank?” That is why USNWR and every other ranking is worthless, unless you are trying to rank something very well defined and easily measured.

To take a trivial example, if I only wanted to rank schools by size of endowment, that would be very easy and quite reliable, but it really wouldn’t tell us much about whether that was the school we wanted to choose. The same goes if we only wanted to measure acceptance rate. Or is that really true? Is it fair to compare a school of 500 per incoming class that does essentially no advertising and so gets maybe 4,500 applications at $75 a pop, accepts 2,250 (a 50% rate) and enrolls the 500, to a school that has free applications, advertises heavily, gets 40,000 submissions, accepts 10,000 (25%) and enrolls 1,500? Is acceptance rate really telling you anything about the quality of either school?

Obviously the second school is almost exactly what Tulane does, and I think there is absolutely nothing wrong with that. In fact, when you look at their strategy in total, including the early notification, the large merit awards for many students, and several other key tactics, I think it is brilliant and has served the school extremely well post-Katrina.
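To put some quick numbers on that hypothetical, here is a tiny Python sketch. The figures are the made-up schools from the paragraph above, not real data for Tulane or anyone else; it just shows how the same-sized entering class can produce very different acceptance rates and yields:

```python
# Illustrative only: these are the two made-up schools from the post above,
# not real data for Tulane or any other institution.

def admissions_stats(applications, accepted, enrolled):
    """Return (acceptance rate %, yield %) for a single admissions cycle."""
    acceptance_rate = 100 * accepted / applications
    yield_rate = 100 * enrolled / accepted
    return round(acceptance_rate, 1), round(yield_rate, 1)

# School A: 500-person class, no advertising, $75 application fee
print(admissions_stats(applications=4_500, accepted=2_250, enrolled=500))
# -> (50.0, 22.2)

# School B: free application, heavy advertising, much bigger applicant pool
print(admissions_stats(applications=40_000, accepted=10_000, enrolled=1_500))
# -> (25.0, 15.0)
```

By acceptance rate alone, School B looks twice as “selective,” and by yield it looks worse, yet nothing about the actual quality of either school has changed. That is the whole point.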
So even the simplest comparisons can get complicated. Then you start trying to use multiple factors, as USNWR does, and you have to ask for each and every one, “Is this really telling me anything about the quality of the undergraduate experience at that school?” Are we only talking about academics? Most college students I know would reject that notion completely. They would say that the campus atmosphere, the social life, the opportunities provided by the location, the service aspects, the research opportunities, the accessibility of the faculty, and many other factors are part of the equation, in whole or in part. And it won’t be the same for any two students. So how does one deal with that? What is the value of going to school in New Orleans versus Davis, California or Winston-Salem, NC? USNWR doesn’t address that at all. And of course it cannot. So I think it doesn’t take long to realize that the entire premise of trying to measure, and therefore rank, something as grand and nebulous as the “best” school for an undergraduate is ridiculous, and it should never have been presented as such.
I used to say that the better students are mostly looking to be around other smart students. I still think that is true in general, but I have come to realize that it oversimplifies things somewhat. To the degree that it is true, you can look at test scores, high school GPA and class rank, AP or similar courses taken, and get some idea of whether that school generally has sharp students who are reasonably serious about academics. In general, on average, because that is all you can say. But again, that is a somewhat one-dimensional way of looking at it, even if that dimension is one of the more important ones.
So after that meandering trip through why your question probably shouldn’t even need to be answered, I will try anyway. Because of course in the current world a fair number do look at the rankings (USNWR anyway), and administrations do pay at least a little attention to it, although I wonder how much Tulane still worries about it since the school seems to be doing quite well with admissions despite the lack of love from those editorial bastards.
(To be continued)
I always thought that graduation rates, more than most other factors, might be dragging Tulane down. Since Katrina, Tulane took a real hit on 6 year graduation rates, and USNWR weights this pretty heavily. I forget the exact percentage it plays in the final calculation, but unlike admission rate (1.25%) it is significant. Much more than 1.25%. Northwesty analyzes this quite well in the thread that NJDad68 recalled: http://talk.collegeconfidential.com/tulane-university/1684431-rankings-p1.html Katrina was in 2005, so any class that entered in the couple of years before that took a hit on graduation rates as people decided to transfer rather than stick it out, and even the next class or two probably suffered to some extent. As USNWR says, this year’s rankings take into account the classes that began in 2004-2007. In Tulane’s case, the 2005 cohort was not just bad, it was an N/A. It really hurt. Anyway, the 2007 cohort was 76%. But we can now see the 6 year rates improving dramatically. The 2008 cohort, which will be counted in the upcoming rankings, was 83%. It will be interesting to see what kind of difference that makes, and of course if Tulane sees similar improvements over the next 2-3 years as the 2005-2007 numbers fade away, the impact should be significant.
Another factor that is heavily weighted is peer assessment. This could still be Katrina affected, because so much of the publicity at that time was about Tulane closing (it was only closed for a semester, which was a miracle in itself, but an amazing number of people thought the school was still closed two years later; there just wasn’t the same kind of national publicity about it reopening so quickly). Also, there was a lot of Sturm und Drang, especially within academia, when Tulane closed several departments completely, such as civil, mechanical and electrical engineering and computer science. They also closed a number of Ph.D. programs. I think this probably hit Tulane hard in the PR sense, although for undergraduates not in those majors the reorganization has proven to be a serious improvement in their experience, since there are more resources for the remaining programs and more focus on them as well.
Also along similar psychological lines, Tulane is a great research institution but in many areas is far more undergraduate oriented than graduate school centric. That is in contrast to most of the top 50 schools on the list. Even though this USNWR survey is supposed to be an assessment of undergraduate reputation, I would be shocked if the people doing the assessment are able to separate one from the other. And as nearly everyone who has gone to graduate school would tell you (not med, law or MBA, which are professional schools, but graduate school), undergrad and grad school are totally different. Like night and day. So I think Tulane might suffer to some degree in this category, more than an objective and honest assessment of the undergraduate experience would indicate it should.
It’s a bit odd, because Tulane undergrads do spectacularly when applying to the top grad schools (and professional schools) precisely because they have such a strong undergraduate experience, especially with a great many of them doing high level research and a thesis. There is a bit of a disconnect there, but nonetheless I think it exists when these people fill out the surveys. Also these surveys have been shown to have regional biases, and Tulane’s relatively isolated location (relative to other major schools that is) is certainly a contrast to the Northeast/Mid-Atlantic and California. Obviously there are other schools that are also a bit alone out there, such as Michigan, but its sheer size and the amount of publicity it gets both academically and non-academically easily overcomes that. Plus of course UM has extensive graduate programs in virtually every field.
Then you get into factors like the endowment, which is healthy at Tulane (about $1 billion) compared to national averages, but well short of schools like WUSTL, Duke, Vandy, and so on and so forth, the very schools it should be somewhat closer to in the rankings. I won’t go into all the issues associated with getting the endowment up; naturally the school is trying to do exactly that. But it is a factor, although again I am being lazy and not looking up the exact weighting USNWR gives to it.
So, my apologies for the dissertation. You got me thinking about this in some detail after not thinking about it much for a while. The bottom line is, rankings be damned, Tulane is not just surviving but thriving. And while there is a human side that likes the affirmation a better ranking would (falsely) give, we have to hold on to the logic and experience that tell us these rankings are, in fact, just so much nonsense that really tells students little about what it is to be an undergraduate at Harvard vs. Notre Dame vs. UC-Irvine vs. Tulane vs. Alabama vs. you name the school. Unless, by some miracle, your own personal value system exactly or nearly matches the way USNWR looks at the world of undergraduate institutions. I would hope no one is that shallow.
Whew!
There was some very good statistical analysis and commentary in this forum around the time that the latest rankings came out. Here it is: http://talk.collegeconfidential.com/tulane-university/1684431-rankings-p1.html
@fallenchemist thanks for the essay! I thought your analysis and actual answer were both very interesting.
@NJDad68 thank you for the link!
Thanks, glad you were able to wade through it. Please note that I significantly changed my assessment of the graduation rate effect in the first paragraph of post #2. In that same paragraph, I also added some more recent data to what you will find in northwesty’s extensive analysis in the link. So I think when you add that to the other things I mention, it is easy to see why the USNWR methodology is not very reflective of the current state of Tulane. Of course, I still stand by my overall disdain for the rankings, even after the Katrina effect has mostly washed out.
The six year graduation rate data goes back to the kids who started college ELEVEN years ago. It is an extremely lagging metric. For purposes of the 2015 USNWR ranking, Katrina just happened this year.
The biggest factors in having a good 6 year grad rate are (i) having selective admissions (which Tulane has) and (ii) having a good freshman retention rate. That data lags much less. For this year’s USNWR formula, the freshman retention data is based on the classes that started in the Fall of 2009, 2010, 2011 and 2012. TU has a much improved 90% frosh retention rate for this group. For this metric, Katrina happened a few years ago and is fading away fast.
“A big piece of the formula (18%) is tied to six year graduation rate. This is what kills TU – 76%. By comparison, USC is 91%, NYU 84%, Miami 82%. This stat currently combines data for the freshmen classes that started in the Fall of 2004, 2005, 2006 and 2007. So very much a trailing/lagging metric. FYI, Katrina was Fall 2005. So it will be 2-4 more years until the lingering effects of Katrina are completely out of the data set.”
Tulane has an absolutely abysmal yield… That’s not a plus.
It’s instructive to understand the weighting of criteria that goes into the USNWR rankings.
Two factors make up 1/3 of a college’s score:
- 22.5% of a school’s rank is determined by the outcome of a poll of HS guidance counselors + college admissions deans, provosts & presidents. It’s about as objective as an election for Homecoming Queen.
- 10% of the ranking is determined by “financial resources per student”.
‘Nerd’ components of a college’s ranking only comprise 1/9 of a college’s score:
- 8.1% SAT/ACT scores
- 3.1% high school class standing (top 10%)
A college’s admit rate only counts for 1.25%.
Just like in high school, the Queen Bee girl with the richest daddy always gets elected Homecoming Queen.
Yield is not a minus, either. Yield is absolutely not a factor in the USNWR rankings. Besides, in this age of the Common App, heavy advertising, and 12-20 applications per student being common, yield has ceased to be even an iota of an indicator. Similar reasoning should tell USNWR to stop using admission rate as well, even at the 1.25% they still give it. But then, virtually every factor they use is bogus for what they claim to be measuring, so why pick on any particular one?
Would it be fair to say that graduation rate counts for even more than the 18% you indicated? According to the USNWR metrics page http://www.usnews.com/education/best-colleges/articles/2014/09/08/how-us-news-calculated-the-2015-best-colleges-rankings?page=4 there is also a “graduation rate performance” factor, which compares a school’s actual graduation rate to the rate USNWR predicts it should have.
I find this to be the most laughable of all the things that USNWR does, because it takes an arbitrary formula completely invented by USNWR and uses it to say that a school should be graduating students at a certain rate. What’s the formula? How are they so omniscient? The arrogance is overwhelming. But as it pertains to this useless annual exercise, Tulane is being punished yet again for the Katrina related shortfall in graduation rate. Presumably this will be less of an issue this year since they will be looking at the 2008 cohort, which improved to 83% as I mentioned. Still, this looks to me like another “magic ingredient” USNWR uses to cook the rankings to meet their preconceived notions of what they should look like. It certainly is hard to understand and entirely invented.
TU’s lower than expected USNWR ranking is 100% COMPLETELY TOTALLY SOLELY ENTIRELY due to its six year grad rate, which is still hugely Katrina affected. It is not due to ANY other factor.
TU was ranked #54 overall with its 76% 6 year grad rate. Which is HORRIBLE. Every other school in the top 54 had a 6 year grad rate that began with an 8 or a 9.
6 year grad rate is 25.5% of the formula. It is the single worst thing to be bad at in the eyes of USNWR. Which TU is, because of the lingering Katrina effects. Which (due to the lag time of the data) are currently peaking in the USNWR formula.
18% of the formula is straight 6 year grad rate. The school’s under/over performance on 6 year grad rate (as FC points out) is another 7.5% of the formula.
TU (based on its student demos) should have an 82% six year grad rate. So it under-performs by 6. Again HORRIBLE. Only TU and GWU had a minus 6 in the top 75 schools.
#54 TU is doing fine with its peers (3.4 score) and HS counselors (4.2 score). Compare #48 Miami (3.2, 4.0), #42 BU (3.5, 4.2), #40 Lehigh (3.3, 4.1).
Once Katrina fades out of the grad rate data set in 3-4 years, my kid’s TU education will be 5-10 slots fancier than it is today.
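For anyone curious how a weighted composite like this behaves, here is a rough Python sketch. The named weights are the ones quoted in this thread (22.5% peer assessment, 18% plus 7.5% for the two graduation rate pieces, and so on); the leftover “everything_else” bucket and the component scores themselves are invented purely for illustration, not real USNWR inputs:

```python
# Rough sketch of a weighted composite score, in the spirit of the USNWR
# formula discussed in this thread. The named weights are the ones quoted
# above; "everything_else" (faculty resources, alumni giving, etc.) and all
# of the component scores below are hypothetical fillers, not real USNWR data.

WEIGHTS = {
    "peer_assessment":       0.225,   # counselors/deans/presidents survey
    "grad_rate":             0.18,    # straight 6 year grad rate
    "grad_rate_performance": 0.075,   # over/under vs. predicted grad rate
    "financial_resources":   0.10,
    "test_scores":           0.081,
    "top_10_percent":        0.031,
    "admit_rate":            0.0125,
    "everything_else":       0.2955,  # filler so the weights sum to 1.0
}

def composite(scores):
    """Weighted sum of component scores, each normalized to a 0-100 scale."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Two hypothetical schools identical everywhere except the two grad rate pieces.
school_a = dict.fromkeys(WEIGHTS, 80)
school_b = dict(school_a, grad_rate=60, grad_rate_performance=60)

print(round(composite(school_a) - composite(school_b), 2))  # ~5.1 points
```

Because the two graduation rate pieces carry 25.5% of the weight between them, being 20 points worse on just those two components moves the composite by about 5 points, which is exactly the argument above about why that one stat is what drags TU down.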
One last swing to make sure the horse is completely dead.
#40 Lehigh has a ranking of 33 on graduation/retention from USNWR.
#42 BU has a ranking of 40 on grad/ret.
#48 Miami has a ranking of 62 on grad/ret.
#54 TU’s Katrina-affected grad rate is 76%, with -6 underperformance versus its expected rate. That translates into a ranking of 86. That stat is very out of spec.
FC reports above that TU’s 2008 starting class had an 83% 6 year grad rate. If that is a normal, sustainable number, here’s where TU would compare on that most important rankings indicator in 3-4 years’ time:
Lehigh is 86% and -5. BU is 84% and even. Miami is 82% and -4.
Tulane would be 83% and +1.
Actually, the way I read the USNWR explanation of that metric, unlike the 6 year graduation rate itself they only use the latest cohort for this measure. So for the current rankings that would mean just the 2007 cohort.
They are pretty specific about only using that one year, while in the 6 year graduation rate explanation they specify that they used the 2004-2007 cohorts. So I would presume that for the upcoming rankings they will use the 2008 cohort for this measure and the 2005-2008 cohorts for the other. The 2008 cohort is the one that schools have reported in their 2014-2015 CDS, so that is the most current data USNWR will have available. So the 83% graduation rate that Tulane reported should count fully toward this 7.5% factor, while only partially helping the 18% weight on the 6 year rate itself. I still don’t know what they do about Katrina and the N/A report for the 2005 cohort. Maybe for Tulane they will just use 2006-2008. No idea. But at least this year will be the last where it is a direct factor in that sense.
FC – That does seem to be how the formula will work. Katrina will be out of the data for the metric that accounts for 7.5% of the formula, but still in the 18% metric to some extent.
So maybe we will see a bit of a rankings bump this September…
Since the 14-15 CDS data is already out, here’s the data USNWR will use for the next round for #54 TU and #48 Miami:
2008 class 6 year grad rate: TU 83% vs UM 81%.
2005-2008 average 6 year grad rate: TU 78% vs UM 80.5%
Frosh retention: TU 92% vs UM 93%
ACT range: TU 29-32 vs. UM 28-32
SAT range: TU 1230-1410 vs UM 1220-1420
Students in top 10% of HS class: TU 56% vs UM 66%
Acceptance rate: TU 28% vs UM 38%
I’d guess that TU moves up a few slots but stays behind Miami. You still have Katrina in 2 of the 3 years for the 6 year grad rates.
If somebody could clarify, what exactly is 6 year graduation rate? The percentage of students that graduate within 6 years?
@gdlt234 - Exactly. The Dept. of Ed. does track 4 year rates as well, but it is generally agreed that this is not the best thing to look at. While most degrees are conventionally set up to be completed in 4 years, several are set up for 5. For either track, when you take into account that people change majors, take a year off for any number of reasons, and other assorted “life happens” reasons, the 6 year metric is considered indicative of the propensity of students who start at a given school to finish with at least a bachelor’s degree at that same school.
@fallenchemist ah I see. Thank you
I couldn’t care less what USN&WR thinks; I’m applying to their history program for graduate school.