What's the Value-Added of USNWR Freshman Retention Rate?

<p>What is it about Arizona and Tennessee that they can't even keep 8 out of 10 students?</p>

<p>hawkette</p>

<p>
[quote]
it would probably not be too hard to classify programs as 5-year and measure those students separately. For financial purposes, 4-year grad rates have real value to students & families.

[/quote]
</p>

<p>You could do that, but then you have to come up with a more complicated summary metric: graduation within 150% of the planned time for each degree program. </p>

<p>But people change majors. Some places tell a student who gets through sophomore year and wants to switch to engineering, "Okay, but you cannot get your courses done in the next two years; you are looking at a 5-year program." Other places say, "Sorry, you cannot switch to that major because you will not get your courses done in four years. If you are tired of physics, pick something else, but you have two years left." The second response would keep the 4-year grad rate high, but I am not sure it is better. </p>

<p>It is much simpler to stick with 6-year rates. </p>

<p>The financial implications depend on why someone takes more than four years. If they are out of school and not paying tuition, then there is no financial impact. If they are in a 5-year program, then there is no reason to expect faster graduation. If they switch from a 4-year to a 5-year program, they know they are doing that. Again, why penalize the college for offering these choices?</p>

<p>I think we are agreeing that giving 4% weight to freshman retention makes no sense. I am just pointing out that you have demonstrated that removing it would have nearly no effect on the rankings, both because it is only worth 4% and because the schools at the top of the ranking have high retention rates anyway.</p>

<p>In other words, it is arbitrary, but removing it would not change the ranking.</p>

<p>Who cares about USNews rankings? They don't mean anything.</p>

<p>afan,
It DOES affect the rankings. In fact, the rankings MAGNIFY the impact of very, very small differences in freshman retention. That was and is my point all along. For example, U Chicago scores 96%, which is good for 18th place, but Northwestern scores 97%, which is good for ten spots higher at 8th place. And U Wisconsin and many others are at 93%, which is good for 40th place. Small differences in retention numbers, yet huge differences in ranking points and thus score. </p>
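<p>A minimal sketch of that magnification effect, with an invented field of schools (these are not USNWR's data): when many schools cluster within a few percentage points, a one-point retention difference can translate into a gap of many rank positions.</p>

<p>
[code]
import random

random.seed(0)
# 60 hypothetical schools with retention rates clustered between 90% and 98%
field = [random.choice(range(90, 99)) for _ in range(60)]

def rank_of(rate, field):
    # Competition ranking: 1 plus the number of schools strictly ahead
    return 1 + sum(1 for r in field if r > rate)

for rate in (97, 96, 93):
    print(f"{rate}% retention -> rank {rank_of(rate, field)}")

[/code]
</p>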

<p>As for rankings, while you may not like them, I think you understand that some people find them (and their data) useful and they do play an influential role at different stages of many students' college searches.</p>

<p>U Chicago is 18th place on this metric, but what effect does that have on its mystical overall ranking? I suspect it is close to zero (can't be otherwise, since 4% is close to zero).</p>

<p>U Chicago is actually a good example. It is tied for 9th in the overall USNWR ranking with Dartmouth and Columbia. Each school has 89 points in the tabulation of all of their results. U Chicago had 96% retention (18th place), Dartmouth had 97% (8th place) and Columbia had 98% (1st). It is quite possible, maybe even likely, that the 4% weighting of this factor is what kept Ivy Leaguers Dartmouth and Columbia tied with U Chicago.</p>
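<p>A back-of-the-envelope check of that possibility, under the simplifying assumption (not USNWR's published method) that retention is scored on a 0-100 scale and multiplied by its 4% weight. The gaps are tiny, but tiny gaps can matter when overall scores are rounded to whole points:</p>

<p>
[code]
WEIGHT = 0.04  # assumed weight on freshman retention

for school, retention in [("Columbia", 98), ("Dartmouth", 97), ("U Chicago", 96)]:
    # Points this factor contributes toward a 100-point overall score
    print(f"{school}: {WEIGHT * retention:.2f} points from retention")

[/code]
</p>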

<p>what an excellent point!</p>

<p>In calculating graduation rates for US News and most other sources, students in 4-year programs are allowed 6 years to graduate and students in 5-year programs (e.g. engineering) are allowed 7.5 years to graduate.</p>

<p>
[quote]
In calculating graduation rates for US News and most other sources, students in 4-year programs are allowed 6 years to graduate and students in 5-year programs (e.g. engineering) are allowed 7.5 years to graduate.

[/quote]
</p>

<p>In my experience with data reporting, 6 years is 6 years, and it includes students regardless of program.</p>

<p>hoedown-
In the US Department of Education IPEDS Graduation Rate Survey, the benchmark for all graduation rate reporting, there is a supplement for 5-year programs' graduation rate after 7.5 years. I think it is a federal law. And associate's programs are only allowed 3 years to graduate.</p>
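<p>All three windows are the same 150%-of-normal-time rule applied to different program lengths. A minimal sketch (the function name is illustrative):</p>

<p>
[code]
def completion_window_years(program_length_years):
    """IPEDS graduation-rate window: 150% of normal program time."""
    return 1.5 * program_length_years

for length in (2, 4, 5):
    print(f"{length}-year program: counted if finished within "
          f"{completion_window_years(length)} years")

[/code]
</p>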

<p>So under the current system U Chicago is tied for 9th with two other colleges. If one eliminates freshman retention, then, depending on rounding I gather, it would either remain tied for 9th, move to sole possession of 9th, or perhaps be tied with one other college instead of two. This is the difference you are worried about? </p>

<p>How about the fact that ANY weight given to ANY of the criteria is completely arbitrary? Someone just made it up. Literally picked at random. Based on nothing at all. No outcome data, no evidence that these criteria predict the quality of undergraduate experience, let alone that 4, 40, or 100% is an appropriate weight.</p>

<p>
[quote]
In calculating graduation rates for US News and most other sources, students in 4-year programs are allowed 6 years to graduate and students in 5-year programs (e.g. engineering) are allowed 7.5 years to graduate.

[/quote]
</p>

<p>Not exactly. </p>

<p><a href="http://nces.ed.gov/ipeds/web2000/viewinstruction.asp?instID=164%5B/url%5D"&gt;http://nces.ed.gov/ipeds/web2000/viewinstruction.asp?instID=164&lt;/a&gt;&lt;/p>

<p>They do collect this information, but if you work through the 6-year graduation rate calculation, it does not permit one to average in a 7.5-year grad rate for those in 5-year programs. </p>

<p><strong>SPECIAL INSTRUCTIONS FOR INSTITUTIONS WITH 5-YEAR PROGRAMS</strong>
Institutions with 5-year undergraduate programs are to report on the same cohort of students that is being reported by the traditional 4-year institutions. Section II, column 46 requests the number of students still enrolled in 5-year programs. Be sure to complete the information requested in this item. NCES will also request that institutions with 5-year programs report data and calculate a graduation rate after 7-1/2 years. A special supplementary form will be used in Spring 2009 to collect this information on your 2000 cohort.</p>

<p>So you might find this additional information somewhere in a CDS if the college fills out this section, but the 6-year rate would still penalize places with 5-year programs. </p>

<p>Besides, what is so bad about taking time off during college if it is put to good use?</p>

<p>afan,
Argue if you wish about the usefulness (or lack thereof, in your view) of the rankings and the wealth of data that they provide. And, of course, you are right that the weightings are arbitrary; I have long argued the same. But if rankings are going to happen anyway (and I do accept that they are not going away and that they play a role, often an important one, in the college search process), then I would prefer weights that more closely reflect the importance of what is being measured.</p>

<p>
[quote]
In the US Department of Education IPEDS Graduation Rate Survey, the benchmark for all graduation rate reporting, there is a supplement for 5-year programs' graduation rate after 7.5 years. I think it is a federal law. And, associates programs are only allowed 3 years to graduate.

[/quote]
</p>

<p>Outside of this IPEDS 150% definition, my institution calculates and reports graduation rate on a complete cohort of students regardless of program. This is what is reported in our Common Data Set and to USNews.</p>

<p>hawkette:</p>

<p>small nit: while you and I may not agree with the weightings, I strongly believe that they are not 'arbitrary.' USNews has been publishing this issue for nearly 20 years now, and every year they receive a tremendous amount of feedback, both from the public and academe. And, based on that feedback (and their own interest in selling magazines), USNews tweaks the weightings every so often. While their method might not be scientifically sound, I would hardly call it arbitrary. :)</p>

<p>Blue, </p>

<p>Hawkette hit the nail on the head:</p>

<p>
[quote]
then I would prefer weights that more closely reflect the importance of what is being measured.

[/quote]
</p>

<p>The problem is that no one knows what these weights might be. </p>

<p>No one can even agree on what should be measured. Yes, USNews gets lots of feedback, but until they can present a reason for the factors included and the weights used, those choices remain arbitrary. </p>

<p>Why are these factors included at all? Why not mean MCAT, LSAT, GRE scores of departing students? Why not job placement, later life success, income, professional accomplishment, or other outcome measures? Why not average debt of graduates and loan default rates? Easy to argue that all of the above are more important than the factors they use.</p>

<p>Debating weights implies there is some standard to apply in selecting what weights, including zero, to assign to a factor. Where is the data for this standard? Absent that, the methodology fits even the narrowest definition of arbitrary. </p>

<p>All that said, I remain convinced that a re-ranking of the schools dropping the retention rate would generate a nearly identical list, and therefore that it does not matter whether the retention rate is included or not.</p>

<p>
[quote]
All that said, I remain convinced that a re-ranking of the schools dropping the retention rate would generate a nearly identical list, and therefore that it does not matter whether the retention rate is included or not.

[/quote]
</p>

<p>That's not true. For several years, Caltech dropped way down in the rankings. The President of Stanford pointed out the flaw in the USNEWS rankings that caused this precipitous drop. In addition to graduation rate, USNEWS had also started including performance against a "projected grad rate." The projected grad rate is based principally on incoming SAT scores -- higher scores mean higher projections. The year this measure was added, Caltech plummeted. The combination of the highest SAT scores in the country and one of the most demanding academic programs guaranteed that Caltech would miss its 98% (or whatever) projected grad rate. As the Stanford President pondered in his open letter to USNEWS, do we really want to suggest that Caltech would be a "better" or "higher ranked" university if it replaced its demanding curriculum with basket-weaving courses?</p>

<p>USNEWS still uses the "projected grad rate" measurement, but makes a manual adjustment so that Caltech moves back up where it "belongs." However, any notably demanding academic school with high SATs gets dinged by this measure.</p>
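<p>A toy model of the mechanic being described (the coefficients are invented; USNWR does not publish its fitted formula): predict a grad rate from entering stats, then score the school on the gap between actual and projected.</p>

<p>
[code]
def projected_grad_rate(median_sat):
    # Invented toy model: higher entering SATs -> higher projection
    return min(0.35 + 0.0004 * median_sat, 0.99)

actual = 0.90                          # hypothetical actual 6-year rate
projected = projected_grad_rate(1550)  # very high entering SATs
print(f"projected {projected:.0%}, actual {actual:.0%}, "
      f"performance {actual - projected:+.0%}")

[/code]
</p>

<p>With sky-high entering SATs, the projection sits near the ceiling, so even a strong actual rate registers as "underperformance," which is exactly the Caltech problem described above.</p>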

<p>It's also important to note with the grad rate measures that transfers INTO a college are not counted. Thus, colleges get dinged for students who transfer out, but do not benefit in the rankings from the offsetting transfers in. This is obviously a major factor at places like the UCs that are designed to accept significant numbers of community college transfers.</p>
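<p>In sketch form (all numbers invented), the cohort accounting works like this:</p>

<p>
[code]
entering_cohort = 1000        # first-time, full-time freshmen tracked by IPEDS
graduated_in_6yr = 820        # graduates drawn from that same entering cohort
transfers_in_graduated = 300  # ignored entirely by the metric

# Students who transfer out count as non-completers even if they finish
# elsewhere; transfers in never enter the numerator or the denominator.
print(f"reported 6-year rate: {graduated_in_6yr / entering_cohort:.0%}")

[/code]
</p>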

<p>afan:</p>

<p>I disagree. I think the weights do reflect what is being measured. And, those weights and items measured are easily found in the details of USNews' reports. Again, we may disagree with what is being measured, and the weights assigned to them, but personally I do not see that they are 'arbitrary'.</p>

<p>In doing any type of analysis, something has to be measurable and meaningful, and I think USNews does that. Perhaps their measurements are not meaningful to you (suggestions in your post #35) or me (I think alumni giving is just a way to ensure that the old-line, blue-blood colleges are well represented in the top tier) or Hawkette (who suggests adding employer rankings), but they are meaningful to at least the editors of USNews and their readers.</p>

<p>Is it good social science? Of course not. Does it sell magazines? You bet.</p>

<p>I agree with post #16. The criteria for USNews rankings are strategically compiled so that HYPS always ends up on top. Why do you think Peer Assessment is such a big part? Because the highest quality applicants (like everyone else) read USNews and see that all the best applicants from the prior year ended up at HYPS, so they should go to HYPS in order to be surrounded by smart people. It's a vicious cycle. 1% differences in retention rate serve the same goal: make sure HYPS ends up on top.</p>

<p>
[quote]
USNEWS still uses the "projected grad rate" measurement, but makes a manual adjustment so that Caltech moves back up where it "belongs." However, any notably demanding academic school with high SATs gets dinged by this measure.

[/quote]
</p>

<p>If this is true, i.e., if this "manual adjustment" really happens, then we should really all throw the USNWR rankings into the trash bin.</p>

<p>It's complete rubbish. When they can (and do) manipulate the numbers and rankings to their liking, what is the point? Getting force-fed a list of pseudo-rankings based on flawed methodology?</p>

<p>To add to this, the Peer Assessment score is also a significant factor with huge manipulation vulnerabilities. Who is to say which school got what? Where is the backup data? No one knows. It could all be made up at the end of the day. Sure, they send out the forms to fill out, but who checks those forms? Do they all get recorded and measured 100%? Or are there some missing hanging chads floating around? If one part of the rankings gets a "manual adjustment," who is to say that other numbers don't get the same "manual adjustment"?</p>

<p>It looks like everyone here has been getting a major "manual adjustment" from USNWR.</p>

<p>interesteddad-
I don't think US News makes any manual adjustments of any kind to their graduation rate projection formula. I don't think they override any part of their formula. Where did you hear this?</p>

<p>The US News graduation rate prediction formula is based on SAT scores, high school rank, and expenditures per student. Surprisingly, expenditures per student are negatively correlated with graduation rates. Why? I think it is because the schools with engineering and comp sci have the highest expenditures per student, and they also have the most difficult curricula.</p>
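<p>As a rough sketch of what such a formula might look like (every coefficient below is invented for illustration; the actual fitted model is not public), including the negative sign on spending:</p>

<p>
[code]
def predicted_grad_rate(sat_pctile, top10_hs_share, spend_per_student):
    # Invented coefficients illustrating the three inputs described above
    return (0.35
            + 0.40 * sat_pctile              # entering SAT percentile, 0-1
            + 0.20 * top10_hs_share          # share from top 10% of HS class, 0-1
            - 0.000001 * spend_per_student)  # heavy spending ~ engineering-heavy schools

# A hypothetical high-SAT, high-spending (Caltech-like) profile
print(f"predicted rate: {predicted_grad_rate(0.98, 0.95, 80000):.0%}")

[/code]
</p>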

<p>So, there is an adjustment for places like Caltech with high expenditures per student.</p>

<p>Caltech is demanding, but they also used to do a poor job of educating undergraduates. Students left Caltech not because it was hard but because it was bad. In 1990, the graduation rate at Caltech was 75%. Now it is 90%. There evidently was a lot of room for improvement at Caltech. Did Caltech become a lot easier over the last 17 years? I don't think so. Caltech became better at educating undergraduates. US News had it right when it gave Caltech a lower rank; they deserved it. In the past, MIT's graduation rates were higher than Caltech's by 10-12 percentage points. Now the MIT and Caltech grad rates are closer.</p>

<p>The former President of Stanford criticized the US News rankings many years ago, but he did not understand what he was talking about.</p>