US News rankings are meaningless - from a mathematics student


<p>To the credit of the OP, the magazine features quoted above probably rely on a better model, and show more integrity in the process, than what Morse and his staff use for the Peer Assessment index. If there were ever a contest for meaningless and manipulated processes, the PA might be the perennial favorite. </p>

<p>Unfortunately, the rest of the OP’s diatribe is pure drivel.</p>


<p>Of course. </p>

<p>What I’m suggesting is only that deltas of about 20 positions or more tend to signal differences that are likely to matter to many students or parents. There are a lot of qualifiers in that statement.</p>

<p>Yes, Duke v. Tufts (separated by the minimal span I suggested) is a counter-example. That said, selectivity is not captured solely by SAT scores. Duke’s F2011 admit rate was 14%; Tufts’s was 22%. Duke <em>is</em> a more selective school … but admittedly not by too much.</p>

<p>We could trade more examples and counter-examples.
Try #6 Stanford and #24 UVa, or #27 Wake Forest.
Or compare #28 Tufts with #51 Boston U.
In these cases, a roughly 20-point spread does signal differences that I would expect some students and parents to care about. </p>

<p>However, I don’t disagree with bclintonk’s major points. By all means, if you care about certain numbers, look them up. I’m certainly not saying that a 20-position spread in the US News rankings is necessarily a solid basis for choosing between two schools. What I’m saying is that US News isn’t so meaningless that it can’t be useful for building an initial list of colleges within some manageable band (if not 20, maybe a little wider).</p>

<p>USNWR 1-10
Caltech Chicago Columbia Duke Harvard MIT Penn Princeton Stanford Yale</p>

<p>USNWR 21-30
Berkeley CMU Georgetown Michigan Tufts UCLA UNC USC UVa Wake</p>

<p>USNWR 41-50
Miami PSU RPI Texas UCSB UIUC Washington Wisconsin Yeshiva</p>

<p>No meaningful differences?
(with respect to, let’s say, an academically very strong first-gen, high-need student in a Florida public school with an uninformed/disengaged/overworked GC)</p>


<p>This info cannot be posted enough; it should be a sticky. Far too many students use this garbage as gospel and are making a huge life decision (and cranking up their stress) based on smoke and mirrors. I’d add to the mix Malcolm Gladwell’s take from a few years ago.</p>

<p><a href="http://www.newyorker.com/reporting/2011/02/14/110214fa_fact_gladwell">The Trouble with College Rankings | The New Yorker</a></p>

<p>^ snarletron, what alternative approach would you recommend? To build an initial list of 20 or so possibilities, what do you consider a better source of information than USNWR (and the Common Data Set files that drive the rankings)? </p>

<p>Suppose a poster says, “I’d like to go to a LAC like Williams, but I don’t think I can get in there. How do I find good but less selective alternatives?”
Absent much info about the poster’s preferences, my typical suggestion would be to start with the USNWR “national LACs” in approximately the #21-40 range (+/- depending on stats). I’d recommend ignoring the ranking differences within that range (and not to take that range too literally). Or, I’d point to the “Colleges That Change Lives” list. If the poster had a specific list of criteria, I’d try to suggest a specific set of schools to fit those criteria.</p>

<p>An alternative (without referencing US News) is to start plugging preferences into an online college matcher. For example, one could plug these preferences into the College Confidential search tool:
Location = Midwest (very important)
Scores = 2000 (must have)
Tuition & Fees = up to $40K (very important)
School Size = very small, <= 2K (very important)</p>
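
<p>For the curious, the mechanics behind a matcher like that are simple. Here is a minimal sketch in Python, assuming a plain list of school records; the field names and figures are invented for illustration and are not the College Confidential tool’s actual data or API:</p>

```python
# Minimal sketch of a preference filter like an online college matcher.
# The school records, field names, and figures below are hypothetical.

schools = [
    {"name": "Grinnell", "region": "Midwest", "sat_75th": 2180,
     "tuition": 39600, "enrollment": 1650},
    {"name": "Hillsdale College", "region": "Midwest", "sat_75th": 2080,
     "tuition": 21000, "enrollment": 1400},
    {"name": "UIUC", "region": "Midwest", "sat_75th": 2120,
     "tuition": 28000, "enrollment": 32000},
]

def matches(s, region="Midwest", min_sat=2000, max_tuition=40000, max_size=2000):
    """True only if a school meets every 'must have' / 'very important' criterion."""
    return (s["region"] == region
            and s["sat_75th"] >= min_sat
            and s["tuition"] <= max_tuition
            and s["enrollment"] <= max_size)

for s in schools:
    if matches(s):
        print(s["name"])  # UIUC fails the size cutoff; the other two pass
```

<p>A real matcher presumably scores partial matches (hence results like 99% and 98%) rather than hard-filtering, but the gist is the same.</p>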

<p>Results?<br>
Two come back with 100% match: Hillsdale College and Grinnell.
Other strong matches include:
Kenyon, Rose Hulman & Macalester (99%);
Bryn Mawr & Denison (98%);
Reed, Case Western, Ill Wesleyan, Colby, UIUC, Lafayette, Whitman, Gettysburg, Minnesota-TC, St. John’s-Annapolis, Wheaton, Scripps, Oberlin. </p>

<p>Actually, there is a lot of overlap between these results and my “USNWR #21-40 LACs” list.
Grinnell, Macalester, Bryn Mawr, Oberlin, and Kenyon are on both. However, in my opinion, the USNWR is a somewhat better list of “less selective alternatives to Williams” than the College Matcher search results. Moreover, from the listed schools on the USNWR site, one can drill down to much more information (mostly from CDS sources) on costs, demographics, etc. From the College Matcher, you can’t.</p>

<p>Then again, one could simply post the question to College Confidential and wait for all the suggestions to roll in (“Holy Cross is great for pre-med” from par72, “Go Buckeyes!” from Sparkeye7, my ramblings on class size and PhD production … :))</p>

<p>I’d like to ask: do you think there is a meaningful difference between the #1 schools, Harvard and Princeton, and the #101 school, Iowa State, or the #199 schools, South Dakota State University, Louisiana Tech, and East Carolina?</p>


<p>How many students really use it to make these decisions? When 90% of high school graduates either (a) don’t go to college, (b) go to community college, or (c) go to a local state school, it really can’t be all that many people.</p>

<p>I had not come near this thread since it was new because I didn’t think this particular dead horse needed any more flogging, but I do want to give coureur a shout out for post #20. Very funny!</p>

<p>^ Hey, people do need help with these big life decisions. Maybe there is a real market for something like the wife-picking algorithm in Along Came Polly.</p>


<p>Haha. Why would such “information” be a sticky? And what should replace it? The moronic and irrelevant graduate school rankings? The Forbes/Vedder CRAP? The Mother Teresa rankings? Which is the one that placed the almighty UT of El Paso a step above Harvard? Yes, a school that accepts everyone with a pulse, ranked above the second most selective school in the country! </p>

<p>In the meantime, the critics of the USNWR might learn HOW to use the report. The pure ranking is only what the naive consider and what the idiots focus on to build an attack. Save and except the cronyistic and manipulated Peer Assessment, the tables offer a wealth of easy-to-compare data at a cheap price.</p>

<p>Think where we would be without the efforts made by the USNWR to collate the data and force the schools to adopt the Common Data Set and understand the value and need to release information on admissions. If left to the schools only, we’d have nothing but advertising and falsehoods.</p>

<p>As far as Malcolm goes, he should stick to subjects he knows. Not rankings.</p>

<p>Xiggi wrote:</p>

<p>LOL. The irony is pretty rich. I should think after 15,000 posts (and counting) the answer would be obvious: no one does a better job of delivering more nuanced information to the general public than College Confidential itself. And for die-hards who really cannot do without a “ranking” of sorts, the College and University sections do a credible job of identifying the top 50 or so heavy-hitters.</p>


<p>The problem is trying to rank complex cultural institutions with precision. The Princeton Review student surveys are helpful, since they rank specific aspects of colleges. USNWR is a good source book, but the rankings turn it into a pseudo Consumer Reports. Then many students and parents fixate on getting admitted to the highest-ranked school instead of finding the right school for the kid. You hear it on CC all the time: “Here are my stats. What’s the best school I can get into?” When I applied to college a few decades ago, one word I didn’t hear once from anyone was “prestige.” Now it is a word that I wish would disappear from the dictionary. The appearance of the USNWR rankings in 1983 was the start of this kind of thinking. One can’t really rank colleges any more than one could rank countries. You might say that Italian food is better than British food, or that German cars are better than French cars, but it would be hard to justify putting the entire country of Italy at #4 and Japan at #6.</p>


<p>Even the “cronyistic and manipulated” Peer Assessment offers value to some of us simpletons. :D</p>


<p>We can (and do) assess whole countries according to specific characteristics such as crime rates, infant mortality, per capita income, literacy rates, etc. It isn’t entirely far-fetched to rank countries according to a set of criteria that suggest “quality of life” or “standard of living”. Can we say which North American and Northern European countries (or any other countries) are the “best” countries? Of course not. Can we say flatly, precisely which one has the #1 quality of life? Not really (because that is a matter of definition). However, we can state with high confidence without even measuring anything that Sweden has a better quality of life right now than Syria does. We also can say after measuring a bunch of things that Sweden, Finland, and Denmark (for example) appear to have even better qualities of life (in certain respects at least) than some other countries that also are pretty good places to live.</p>


<p>Whether you realize it or not, that’s part of the rap against USNews: the lingering feeling - which Malcolm Gladwell mentions in his article - that it trims its data in order to mimic conventional wisdom. In that sense, it’s not much more than a glorified Excel™ sheet.</p>


<p>Yes, absolutely, just as we can state with high confidence, without even measuring anything, that Duke is a better university than an unknown directional state U in the middle of nowhere, established in 1962 with open admissions. The problem is making precise gradations between Duke and Northwestern, or between Swarthmore and Carleton. Or even implying that one cannot have as good an educational experience at Muhlenberg as at Amherst.</p>


<p>Over the years, Morse has solicited input from educators and others regarding his weightings. (Alumni giving didn’t just appear out of thin air; a bunch of folks sat around a room and thought about ways to measure satisfaction. Of course, alumni giving also favors colleges with wealthy student bodies, as do most of the weighted items in the ranking.) </p>

<p>Perhaps it mimics conventional wisdom since that same wisdom had input on the design?</p>
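
<p>Mechanically, a composite ranking of this kind is just a weighted sum of normalized metrics, which is why the choice of weights matters so much. Here is a toy sketch; the weights, metric choices, and figures are invented for illustration and are not Morse’s actual formula:</p>

```python
# Toy composite ranking: a weighted sum of metrics rescaled to 0..1.
# Weights and data are invented; this is not USNWR's actual formula.

metrics = {  # school -> (grad rate %, peer assessment /5, alumni giving %)
    "School A": (95, 4.8, 40),
    "School B": (90, 4.1, 25),
    "School C": (85, 4.5, 15),
}
weights = {"grad_rate": 0.5, "peer": 0.3, "giving": 0.2}  # hypothetical

def normalize(values):
    """Rescale a metric so the worst school scores 0 and the best scores 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

names = list(metrics)
norm = [normalize(col) for col in zip(*metrics.values())]  # one list per metric
scores = {
    name: (weights["grad_rate"] * norm[0][i]
           + weights["peer"] * norm[1][i]
           + weights["giving"] * norm[2][i])
    for i, name in enumerate(names)
}

for name in sorted(scores, key=scores.get, reverse=True):
    print(name, round(scores[name], 2))
```

<p>With these made-up weights, School B edges out School C; shift enough weight onto peer assessment and they swap places. That is the sense in which the weighting choices can bake in somebody’s prior notion of which schools should come out on top.</p>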

<p>The correlation between US News rank and graduation rate among the top universities and LACs is -.90 (a very strong inverse relationship: the lower the rank number, the higher the graduation rate), so the rankings are definitely not mathematically meaningless. Prospective freshmen who don’t look at these rankings are making a mistake. US News is on the side of consumers of higher education; they want to help people make more rational decisions. The colleges and universities, especially the lower tiers, want to use smoke and mirrors.</p>

<p>Yes, it is important to think critically and put the ratings in perspective but it would be foolish to disregard them. College choices are more complex than what the rankings can capture, so you might have to weigh other considerations when making your decision.</p>
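
<p>For anyone who wants to sanity-check a figure like that -.90, the computation is an ordinary Pearson correlation between rank and graduation rate. A quick sketch; the five data points are illustrative, not the actual top-university dataset:</p>

```python
# Pearson correlation between US News rank and graduation rate.
# The five (rank, grad rate) pairs are illustrative, not the real dataset.
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ranks = [1, 10, 25, 50, 100]
grad_rates = [97, 94, 90, 83, 70]
print(round(pearson(ranks, grad_rates), 2))  # strongly negative, close to -1
```

<p>The correlation comes out negative simply because a “better” rank is a smaller number.</p>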


<p>Wait, I don’t get it. Your argument seems to be that US News rankings have value because they correlate with graduation rates. OK, I’ll bite, graduation rates are important. But then why not look directly at graduation rates? After all, nothing correlates better with graduation rates than . . . graduation rates.</p>

<p>Here’s how the top 50 research universities would look, ranked by 6-year graduation rates (a short sketch of the tie-numbering convention follows the list).</p>

<p>1. [tie] Harvard, Yale 97
3. [tie] Princeton, Stanford, Columbia, Penn, Notre Dame 96
8. [tie] Brown, Dartmouth 95
10. [tie] Duke, Northwestern, Georgetown, UVA 94
14. [tie] MIT, Wash U, Cornell 93
17. [tie] Chicago, Johns Hopkins, Rice, Vanderbilt 92
21. [tie] Boston College, Brandeis, William & Mary 91
24. [tie] Emory, UC Berkeley, UCLA, USC, Michigan, UNC Chapel Hill 90
30. Wake Forest 88
31. [tie] Caltech, Carnegie Mellon, Lehigh, Penn State 87
35. [tie] NYU, UC Davis, UC Santa Barbara 86
38. [tie] UCSD, UC Irvine, Yeshiva, Boston U 85
43. [tie] RPI, Florida 84
45. [tie] U Rochester, Wisconsin, UConn, Georgia 83
49. [tie] Illinois, U Maryland, Virginia Tech 82</p>

<p>Falling out of top 50: Georgia Tech 79, Case Western 78, U Miami 78, U Texas 81, U Washington 80. </p>
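
<p>For the curious, the tie numbering above is standard “competition ranking” (1, 1, 3, …): tied schools share a rank, and the next rank skips past the whole tied block. A short sketch using a small illustrative subset of the data:</p>

```python
# "Competition ranking" (1224-style), the tie convention used in the list
# above. The school/rate pairs are a small illustrative subset.
from itertools import groupby

schools = [("Harvard", 97), ("Yale", 97), ("Princeton", 96),
           ("Stanford", 96), ("Brown", 95), ("Duke", 94)]

schools.sort(key=lambda s: -s[1])  # highest graduation rate first

rank = 1
for rate, group in groupby(schools, key=lambda s: s[1]):
    names = [name for name, _ in group]
    tie = "[tie] " if len(names) > 1 else ""
    print(f"{rank}. {tie}{', '.join(names)} {rate}")
    rank += len(names)  # the next rank skips past the tied block
```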

<p>Pretty bunched up at the top, really, from #1 to #24; fine gradations of difference from one step to the next, but a definite difference between #1 and #24. OK, I can see some value in that. But to me it suggests there’s much less difference among the top couple dozen (or so) schools than all the agonizing over Bob Morse’s ordinal rankings is worth.</p>

<p>Now I would never suggest that anyone rely solely on graduation rates to choose a college, but it is an interesting and relevant criterion, and I would be somewhat skeptical of schools that don’t perform well on this measure (adjusting, perhaps, for things like engineering programs and co-op programs which tend to depress graduation rates). And it’s pretty clear most schools–the vast majority of them–perform quite poorly on this criterion. Most public flagships fall well below 80%, for example, but the top 5–UC Berkeley, UCLA, UVA, Michigan, and UNC Chapel Hill–stand head and shoulders above the rest.</p>

<p>But I don’t need to rely on the overall US News ranking as a flawed proxy for graduation rates. I can just look at the graduation rates themselves, which are much more revealing, not to mention more accurate than some half-a**ed proxy ranking.</p>


<p>Graduation rates are correlated with entrance selectivity (likely with plenty of causation – and entrance selectivity affects US News rankings). Academically stronger frosh are more likely to be able to handle full loads of college level work, and less likely to flunk out, or take extended time due to repeating failed courses or needing remedial courses.</p>

<p>It is likely that, for a given individual student considering schools with different graduation rates (and correspondingly different levels of selectivity), that student’s own risk of not graduating, or of graduating later than planned, varies across those schools by less than the difference in the schools’ overall graduation rates.</p>
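
<p>To put toy numbers on that point (all of them invented for illustration): suppose graduation odds depend mostly on the student’s own preparation, with only a modest school effect.</p>

```python
# Toy illustration: an individual strong student's graduation probability
# may differ across schools far less than the schools' overall rates do.
# Every number here is invented for illustration.

school_grad_rate = {"Selective U": 0.95, "Less Selective U": 0.78}

# Hypothetical: a well-prepared student sits near the top of each school's
# distribution, so the gap shrinks for that particular student.
strong_student_prob = {"Selective U": 0.96, "Less Selective U": 0.92}

gap_schools = school_grad_rate["Selective U"] - school_grad_rate["Less Selective U"]
gap_student = strong_student_prob["Selective U"] - strong_student_prob["Less Selective U"]
print(f"school-level gap: {gap_schools:.0%}; this student's gap: {gap_student:.0%}")
# school-level gap: 17%; this student's gap: 4%
```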