<p>Yesterday, Bloomberg published its college rankings based on return on investment. Tulane made a good showing at no. 96. Here is the link: What's Your College Degree Worth? - BusinessWeek</p>
<p>Agreed, that is not bad, but the 74% graduation rate is heavily Katrina-affected. It is unfortunate they caught the worst year of the data resulting from Katrina. I only glanced at the article, so I don’t know how much that affects the results, but it caught my eye right away.</p>
<p>Geez. Hope the bill-paying husband doesn’t take a look at that. Certainly don’t need to spell out that it’s the 2nd most expensive college on that list behind RPI.</p>
<p>Yes, Tulane has been one of the 5 most expensive for a few years now, at least in non-discounted costs. I rather wonder what it actually is after merit and FA scholarships, since on the former at least Tulane is more generous than most. It would be interesting to see that stat (money actually paid) for all schools, but I don’t think there is such a list.</p>
<p>OK, I looked at the methodology used and I have a question.
I must be missing something. Why does the graduation rate affect the overall net ROI? It seems to me there are two problems with this calculation. First of all, why does someone else dropping out (more on that in a sec) affect my ROI? Presumably, if someone drops out of school without a degree, you could argue that their ROI is zero (try telling Bill Gates and Steve Jobs that) or simply not relevant to this survey. Or maybe the Bill Gates and Steve Jobs examples really show that once again a magazine is trying to measure something immeasurable. But even if you take their premise that a college degree is solely linked to long-term earnings as true for argument’s sake, and that must be their premise given the way they did this, I fail to see why people dropping out diminishes my ROI. It seems to me they should only be looking at people who did graduate and comparing those earnings vs. costs between schools, not multiplying by the graduation rate. It would be different if they looked at the earnings of those who dropped out and figured out the difference, but they didn’t.</p>
<p>My second point is that just because someone didn’t graduate in 6 years from School A doesn’t mean they didn’t graduate from School B. As I understand 6 year graduation rates, you not only count complete dropouts but also those who transfer. I think the only ones you are allowed to discount are people that join the military, are deceased or disabled to the point of having to drop out, joined some group like the Peace Corps, or went on a church mission. So even if I am missing something in my first point, why ding School A when the student in fact got a degree from School B? I mean, if the student had attended School B from the start, and let’s say it cost the same, the student’s investment would have been the same. It just makes no sense to me to use 6 year graduation rates to “correct” the ROI.</p>
<p>Also, that 50% figure they used as an example seems to imply they all left right before graduation or something. I mean, even if they were all dropouts and somehow using the 6 year rate makes sense, it can’t make sense to count someone that drops out after one year the same as someone that drops out after 5 years. But of course I still think it makes no sense to talk about my ROI being impacted by someone else dropping out at any time.</p>
<p>The study also failed to take geographical factors into account: schools in the Midwest and the South will, in general, have a higher proportion of graduates stay in those lower-paying areas as compared to the Northeast, Mid-Atlantic, and California. But it is the use of the 6 year graduation rate that has me baffled. I must be missing something.</p>
<p>FC: The list gives figures for ROI overall (taking into consideration the graduation rate) and the ROI for graduates. The overall ROI is 748,000. The ROI for graduates is 1,011,000. </p>
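<p>Those two figures are consistent with simply multiplying the graduates’ ROI by the graduation rate. A quick back-of-the-envelope check, using only the numbers quoted in this thread:</p>

```python
# Figures quoted from the Bloomberg list (as reported in this thread)
grad_roi = 1_011_000   # 30-year ROI for students who actually graduate
grad_rate = 0.74       # six-year graduation rate
overall_roi = grad_roi * grad_rate
print(round(overall_roi))  # 748140, i.e. roughly the $748,000 "overall" figure
```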
<p>One thing (other than tuition) that jumps out at me is the graduation rate. I had no idea that only 74% of enrolled students actually graduate. More than 1 in every 4 enrolled students ends up dropping out or transferring? As a parent, I don’t like that stat at all. My son better not get any ideas!</p>
<p>LOL, he won’t. That figure is going up, partly because the Katrina effect is receding, and partly because Tulane is making stronger efforts in this area. Retention of freshmen has been rising as well. The latter will lead to an improvement in the former, as will the increased attention they are paying in both areas.</p>
<p>I understand about the ROI: they are taking the ROI of the graduates and multiplying by the graduation rate. But why would they use the latter number instead of the former? Isn’t the only thing that matters the earnings of the actual graduates? It still doesn’t make sense to me to count students that transferred in an overall ROI calculation. For example, I knew kids that transferred to Yale and Stanford (and other schools) after freshman and sophomore years. I know some of them went on to excellent careers, and I assume the others I didn’t track probably did also. What sense does it make to lower Tulane’s ROI because those students transferred to a more highly ranked school? There are a lot of flaws in trying to tie overall earnings to the institution where you got your degree anyway, but this just seems bizarre to me. I wish I could track down who did this and see what their reasoning was.</p>
<p>I am going to take the data this weekend and put it in a spreadsheet and see where it comes out if you only counted actual graduates. I’ll post my results.</p>
<p>This calculation is based on the Payscale “study”, which is not a credible source.</p>
<p>@fallenchemist/RisingChemist: Are you two related?</p>
<p>Which part is not reliable? The part that ranks Tulane no. 96 or the part about graduation rates?</p>
<p>^ Payscale relies on self-reported samples and fails to control for cost of living and field of study. It does not provide useful data.</p>
<p>The graduation rates and total cost don’t come from Payscale; they come from IPEDS, which is pretty reliable since it uses data provided by the schools themselves. So at least to the extent the schools are honest, so is that data. From the Bloomberg report:
</p>
<p>
Technically, one wouldn’t “correct” for field of study. I mean, how would you even do that? Some professions pay better than others, sure, but you earn what you earn. However, where you live certainly does make a difference. That and many of the other criticisms are valid.</p>
<p>^ That’s true, but the entire point of the calculation is to measure ROI. If the salary figures are worthless, you aren’t going to get a meaningful result.</p>
<p>I am surprised that you guys take issue with the results. Personally, I was happy to see Tulane in the top 100. Considering the number of schools in the U.S., that is pretty good.</p>
<p>^ I don’t have a position on Tulane. My problem with the study would stand regardless of whether it scored #1 or #300.</p>
<p>@fallenchemist: Some majors tend to lead to more financially profitable careers than others. If one school has a higher proportion of engineering majors and another has lots of English majors, the former will probably have higher median salaries. However, an engineering major at the second school might get just as high of a salary and an English major at the first might not earn much at all.</p>
<p>Data on fields of study are available, so Payscale really should control for this.</p>
<p>OK, I see what you are saying now, and that makes sense. Does an engineering major from Georgia Tech typically earn more than an engineering major from Michigan, after controlling for geography and other correlative variables? At the same time, does a history major from Michigan typically earn more than the same from GT? That would be interesting to know and you are right, far more pertinent. That isn’t really correcting for field of study, it is drilling it down to another level of detail. Correcting for, as commonly used in studies, means adjusting a variable based on some factor, like correcting for inflation. But you are right that by lumping all the fields of study together, it only shows (again, ignoring the other flaws) what the average graduate earns regardless of major. A fairly useless result, I totally concur.</p>
<p>RisingChemist - ditto with noimagination. Tulane’s result is fine if one accepts the study at face value, and even better if one agrees that multiplying by the graduation rate makes zero sense. But it is like USNWR rankings, which I also vehemently disagree with on principle. Ranking #50 is fine as far as it goes, but Tulane could move to #27 in the next survey (where they rank based strictly on SAT scores, give or take a position) and I would still tell people to ignore it. Flaws are flaws, and just because they give me a result I like doesn’t make them correct.</p>
<p>I think the graduation rate is used because this ranking is purely from a financial point of view. It is from Business Week / Bloomberg. Look at it like a financial investment. If you invest in Tulane, you are likely to make $1,011,000 over the next 30 years as long as you graduate. However, there is a 26% chance that you will not graduate. Therefore, the reduction. It is a matter of odds and averages. Therefore, the projected return on the investment is 748,000 over that time period.</p>
<p>An analogy: Suppose you invest in the tech market and learn that the average tech stock that remains solvent provides a return of 8% annually. If 26% of the tech stocks go bankrupt during the year, you cannot say that the average overall return on tech stocks is 8%, because 26% are a loss. To decide whether to invest in that market you should look at the whole picture, not just the average profit margin of the successful companies. </p>
<p>A college education is a large financial investment for most people. The ROI ranking is based wholly on the likely profit margin or ROI. The value of the investment must consider all possibilities including the odds of the stock going bust (i.e. the student not graduating).</p>
<p>FC: I did what you suggested and ranked them based only on graduate earnings, without regard to graduation rate or cost of attending. If they were ranked solely on graduates’ earnings over a 30 year period, Tulane would rank 74th. Not too bad!</p>
<p>Hello again my friend. Hey, thanks for saving me the trouble. That at least takes one major flaw out of the study. A geographic correction should probably be next, along with noimagination’s correct observation.</p>
<p>
I agree that would be the case if two things are true: 1) that not graduating from Tulane means not graduating at all; and 2) that one corrects for someone that drops out after one year as opposed to someone that drops out after 5 years (yikes! What a horror that would be!).</p>
<p>With regard to #1, I could be totally wrong here, but my understanding of the 6 year graduation rate is that it takes your entering class, say the class entering in fall 2003, and computes the percentage of those students that graduate from that same school by spring graduation of 2010, taking out the exceptions I mentioned earlier like military service. If students didn’t graduate from that school because they transferred to another school, you don’t get to correct for that number. Now I could be wrong about that, and if I am then fine, it helps make the number more pertinent to this study. But that is how I understand the number. So let’s say Tulane enrolls a class of 1500 for fall 2003, loses 100 of those to transfer and 50 to “flunking out” (breaking the law, whatever) or dropping out after freshman year, loses another 100 to transfer and 50 to flunking out/dropping out after sophomore year, and another 90 due to flunking/dropping out along the way over the next 4 years. If I am right about how these statistics get counted, those 200 students that transferred continued to make an investment until they graduated, most likely, and so counting that against Tulane makes zero sense, even from a purely financial point of view. Additionally, those other students left at different points in time, so their investment differed. That is not accounted for.</p>
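<p>For what it’s worth, that hypothetical class does arithmetically reproduce the 74% figure from the ranking. A quick check (all numbers here are the made-up ones from my example, not real Tulane data):</p>

```python
# Hypothetical fall 2003 entering class from the example above
entering = 1500
transferred = 100 + 100      # left Tulane but most likely graduated elsewhere
dropped_out = 50 + 50 + 90   # flunked out or dropped out along the way
graduated = entering - transferred - dropped_out
print(graduated, graduated / entering)  # 1110 graduates -> a 74% six-year rate
```

<p>Note that 200 of the 390 “non-graduates” in this example are transfers who probably did earn degrees somewhere, which is exactly why using this rate to discount ROI is so questionable.</p>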
<p>Just to bring it up again, this is just sticker price also. The actual out-of-pocket cost people experience varies widely at these schools, from everything I have read on here. Besides everything else, that is a very serious flaw.</p>
<p>I like rankings like this that, though they certainly have disputable methodologies, at least use actual facts/figures to compute and not opinions. </p>
<p>The biggest problem for me with this one is that it does not include anyone with an advanced degree. </p>
<p>So all of those Tulane kids that go to med school and law school aren’t counted. All of those engineering kids at Ga Tech and MIT that go on to get a masters aren’t counted. Anyone with an MBA isn’t counted. Anyone that teaches at the university level isn’t counted. </p>
<p>In fact, the most impressive thing to me about this is that they found 1000 people from each of these schools with 30 years of experience that only have a bachelor’s. I still think it’s a neat list, but schools that tend to send kids on to professional and graduate schools are always at a great disadvantage.</p>
<p>
This logic is faulty because you assume that stocks and students fail on an entirely random basis. Investing in the market is presumably somewhat smarter than playing roulette because stocks are not randomly selected for failure and a prudent investor can hopefully identify safer picks. The same is true for colleges; students do not drop out when the magical Failure Fairy randomly sneaks up behind them and pokes them with her wand. I would guess that students academically near the bottom of the admitted pool as well as those in financial difficulties are far more likely to drop out.</p>