Ten Reasons to Ignore the U.S. News Rankings

Nope.

Now, do you have a substantive point to make that is not an ad hominem argument or one based on anecdotal evidence? :) Let’s consider one made above:

Instead of arguing from a few local observations, why not look at a more substantial set of data? For example, let’s compare Stanford in a match-up against UCSD on Parchment (Compare Colleges: Side-by-side college comparisons). It turns out that 92% of students offered the choice between Stanford and UCSD choose Stanford. Similarly:

90% of students offered the choice between Stanford and UCLA choose Stanford.
86% of students offered the choice between Stanford and Berkeley choose Stanford.
77% of students offered the choice between Stanford and UCSB choose Stanford.

(Stanford, by the way, has a 98% freshman retention rate.)

I agree with those who have said the rankings can’t possibly tell you the real quality of education or the knowledge gained by students; they are mostly about the prestige of the school. And once students graduate and are looking for jobs, it is nice for them to have a “prestigious” school on their record. Because, let’s be honest, the person hiring you won’t know that some really great professor works at such-and-such college that has a really great biochemistry department; the average HR person doesn’t scour every college website and journal like we all evidently do. So it is good to take the USNWR rankings into consideration, and especially to look at the more specific ranking lists, not as a be-all-end-all but just as a factor in your decision.

If prestige were all that important, kids from small LACs would rarely succeed in the job market, simply because most LACs are unheard of outside their home regions. But that is not the case. Also, in my experience, the name on one’s resume really only matters (if it does at all) when the applicant is finding his or her first job. After that, work experience and accomplishments are far more important.

I agree that in a few fields (investment banking being the most extreme example) prestige is a big factor. If my child were looking to start a career on Wall Street, I would likely steer him or her toward one of the Ivies (especially H/Y/P/D) that serve as feeder schools for the finance industry.

@tk: I am not surprised that a lot of students who are able choose Stanford over one of the California publics. My guess is that those who do come primarily from the socioeconomic extremes–kids whose families can easily pay the full tuition and those with virtually no resources. The “is it worth it?” question is much more of a factor for the large swath of families in the middle who have not demonstrated enough need to get financial aid but for whom attending an expensive private would force significant changes to their households (e.g., a parent taking a second job, borrowing against the house, tapping into retirement, and so on). If my family lived in California and we could afford to pay more than in-state tuition for a UC system school, we would probably pick Stanford or another reputable private university or college in a heartbeat over one of the state schools, unless there were a particular program at one of the UCs that my child was interested in.

It may be nice, but it’s largely moot for most U.S. college students, since somewhere around 0% of them attend a prestigious school (compare the few hundred thousand at state flagships alone).


It may be convenient, but it’s also not very smart. Why use an arbitrary amalgam of a bunch of weak proxy measures of school quality to target similar admission standards, when one could look directly at a sensible ordering of schools by the one metric actually of interest? It seems at best lazy and inaccurate.

I notice you completely skipped the question about school quality for a couple of example schools more than the 20 magical places apart in the rankings. Any particular reason?


This is a very bad analogy from multiple perspectives. The “product” of college education has been around at most of these schools for 100 years or more, so there is no need to evaluate it as if it’s not available. More importantly, colleges are not factories that turn out a product, and human interaction is the number one resource that goes into the education, so it’s not quantifiable in the same way as a factory operation. Maybe the fact that you view colleges in these very different ways from me explains why you are perfectly satisfied with such a rotten means of evaluating colleges.


I don’t know where you get your information, but the common data sets for colleges have nothing to do with USNWR. Furthermore, there are much better means of understanding the information in the CDS than USNWR, and without paying ridiculous subscription fees. Collegedata.com, for instance, summarizes the CDS for each school in a free, very accessible, non-opinionated form. There are also other free and open sites that allow searches by many of these parameters. But now we come to the heart of the matter: the true purpose of the USNWR rankings is to make money, and one of the means for that is to draw attention to these for-pay services and sucker people into paying for things they don’t have to.

All of this talk of “mediocre” colleges is silly. Different colleges work for different people with different needs. Harvard may be great for the social climber who wants to work on Wall Street, while someone wanting to teach college might prefer Reed or UChicago.


I don’t think these comparisons are nearly as meaningful as some people think. That’s because, in choosing between a highly selective school (e.g., Stanford) and a somewhat less selective school (e.g., any of the UCs), those who prefer the less selective school are, by and large, not going to bother to apply to the highly selective school. If UCLA is your dream school, why would you apply to Stanford? But if Stanford is your dream school, you might very well apply to UCLA as a back-up, thinking that if you don’t get into Stanford, UCLA might make a reasonable second- (or third-, or fourth-, or tenth-) best fallback. So it stands to reason that the cross-admit pool is going to be heavily skewed toward those who prefer the more selective school from the outset. And that’s why, in virtually every case, the cross-admit “revealed preferences” come out in favor of the more selective school.

Bottom line, then, the cross-admit comparison only tells you which school is more selective. Try it a few times, and tell me about any counter-examples you find.

I know in my own case, my absolute first choice was the University of Michigan. My HS GC encouraged me to think about applying to some highly selective private schools, and I knew I had the stats to be competitive. But I just wasn’t interested. So I applied to only one school, Michigan; I was accepted, I attended, and I never looked back. Most of the Michigan residents I met in college also had the University of Michigan as their number one choice. Many, but not all, of the OOS students had applied to more selective private schools and had not gotten in; if they had ended up in the cross-admit pool with one or more of those more selective alternatives, most would have chosen the more selective private school–but that’s because the more selective school was their preferred choice all along. People like me who preferred Michigan don’t choose more selective schools as their back-ups. That would just be irrational.

And this is not just a public-private thing. My D1 had very good stats and could have been competitive almost anywhere. She really wanted a small, intimate LAC, and after looking at a bunch of them, she fell in love with Haverford. Now Haverford’s pretty selective, but there are LACs that are more selective. Since Haverford was her first choice, it made no sense to her to include more-selective LACs as back-ups. So her list of schools consisted almost entirely of schools that were either about as selective as Haverford or less selective. As it turns out, she applied to Haverford ED and was accepted, so she never even applied anywhere else. But someone whose number one choice is Swarthmore might easily decide to also apply to Haverford, reasoning that it’s slightly easier to get into Haverford than into Swarthmore, so it might make sense to apply to Haverford as a back-up; meanwhile, students like my D1 who prefer Haverford are by and large not going to apply to Swarthmore as a back-up. So the Swarthmore-Haverford cross-admit pool will consist mainly of people who prefer Swarthmore from the outset.

It’s just an obvious selection bias problem, if you think about it. So the cross-admit data mean virtually nothing, except which school is more selective.
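To make the selection-bias mechanism concrete, here is a minimal simulation sketch in Python. Every number in it is an assumption (the preference split, the back-up application rates, the admit rates), not real data; the point is only the mechanism: even with a 50/50 true preference split in the population, the cross-admit pool comes out about 90% in favor of the more selective school, simply because of who bothers to apply to both.

```python
import random

# A toy model of cross-admit selection bias. All numbers are hypothetical.
random.seed(0)

N = 100_000                  # simulated students
P_PREFER_SELECTIVE = 0.5     # assumed true preference split in the population

# Assumed application behavior: everyone applies to their first choice;
# applying to the *other* school as a back-up is common only for those
# whose first choice is the more selective school.
P_BACKUP_IF_PREFER_SELECTIVE = 0.9   # apply to the less selective school too
P_BACKUP_IF_PREFER_OTHER = 0.1       # apply to the more selective school anyway

ADMIT_SELECTIVE = 0.10       # hypothetical admit rates
ADMIT_OTHER = 0.40

cross_admits = 0
prefer_selective_among_cross = 0
for _ in range(N):
    prefers_selective = random.random() < P_PREFER_SELECTIVE
    if prefers_selective:
        applies_selective = True
        applies_other = random.random() < P_BACKUP_IF_PREFER_SELECTIVE
    else:
        applies_other = True
        applies_selective = random.random() < P_BACKUP_IF_PREFER_OTHER
    admitted_both = (applies_selective and random.random() < ADMIT_SELECTIVE
                     and applies_other and random.random() < ADMIT_OTHER)
    if admitted_both:
        cross_admits += 1
        prefer_selective_among_cross += prefers_selective

print(f"share of cross-admits who preferred the selective school all along: "
      f"{prefer_selective_among_cross / cross_admits:.0%}")
# Prints roughly 90%, even though true preferences are split 50/50.
```

Note that because both groups face the same admit rates, the admit rates cancel out of the pool’s composition; only the application behavior drives the skew. That’s the sense in which cross-admit numbers mostly measure who applied, not which school people in general prefer.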

You’ve said this many times before, but I’m not sure I buy it. What you say may be true for students who are choosing between their in-state public and a more prestigious private school; however, it holds no water for students deciding between two private schools located in different states, or even an OOS public and a private school.

I’m not sure how qualified you were as a HS student, but most Michigan, Illinois, or Wisconsin residents who went on to get a degree at their state flagships would have stood no shot at admission to a Northwestern or a Penn. They were above-average HS students who were hardworking and dedicated in the classroom, but by no means were they geniuses. There are exceptions, obviously, but there’s a reason the most selective schools have much stronger students.

For a New York resident that is trying to decide between Wisconsin and Duke U. or U of Michigan and Stanford, the Parchment results more or less mirror their actual preferences–which lean heavily towards the more prestigious school.

You need to disabuse yourself of the notion that US News is “established” at ranking colleges or anything else. US News & World Report was a print magazine, one so bad that it ceased publication many years ago.

A researcher for that magazine, a person with absolutely no qualifications in the field of education, one year threw together a ranking of colleges that happened to become popular because people like to take easy, nonsensical shortcuts for the answers to difficult, complex questions.

That researcher has continued to develop more complex and amalgamated formulae to try to develop a list of colleges that satisfies his personal notion of a hierarchy of colleges and universities. His name is Robert Morse, and he attended U of Cincinnati and Michigan State. Why should his subjective formula be used to tell someone whether to attend Boston College or University of Virginia?

Rayshok, you may have heard the expression “consider the source.” USNWR offers a variety of information in its rankings compilations, much of which you might find useful. But remember that it’s a for-profit publication (notice the little padlock symbol that keeps you from seeing more details about each school unless you pay for a subscription?) and that it benefits from generating buzz about the lists (page views = advertising dollars).

Here are some not-for-profit sites my son used last year when he was researching colleges:

collegenavigator.gov

■■■■■■■■■■■■■■■■■■

We all really liked how ■■■■■■■■■■■■■■■■■■ allows the user to compare colleges side by side according to a wide range of factors (test scores, financial aid, etc.). It’s easy to use and sort by various categories. You should give it a try. The important thing is that you find ways to compare the colleges that fit YOUR criteria.

Rankings should be one source of information, without obsessing over the exact rank number.


USNews ceased regular print publication at the end of 2010, not “many years ago.”


[How U.S. News Collects Its College Rankings Data: Common Data Set - US News and World Report](http://www.usnews.com/education/best-colleges/articles/2012/09/11/how-us-news-collects-its-college-rankings-data-common-data-set)

I don’t have time just now to address all of BW’s other points (I have a day job), but I think it is important to understand that the USNWR rankings are based on data that isn’t just pulled out of thin air. You can disagree with the appropriateness of individual metrics, or point out that the data is liable to corruption by colleges that report it inaccurately. You can dismiss individual metrics as “weak proxies.” I think we should be alert to all these issues. However, it seems to me that the individual metrics tend to be mutually corroborating. If you mix them up and play with the weights, you still come out with a similar set of top colleges (albeit in a somewhat different order). Forbes uses a very different set of metrics than USNWR, yet arrives at almost exactly the same set of top 10 universities (again, in somewhat different order, and interleaved with LACs).
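For what it’s worth, the “mutually corroborating” effect is easy to demonstrate in a toy model. Here’s a short Python sketch using entirely synthetic data (no real school metrics; the weights and noise levels are arbitrary assumptions): when several noisy metrics all track a single underlying factor, almost any positive weighting surfaces roughly the same top 10.

```python
import random

# Synthetic illustration: correlated metrics make the top of a ranking
# insensitive to the exact weights. All data here is made up.
random.seed(1)

# Each fake school has one latent "strength" that drives four noisy
# metrics (stand-ins for things like graduation rate or selectivity).
schools = []
for i in range(50):
    strength = random.gauss(0, 1)
    metrics = [strength + random.gauss(0, 0.5) for _ in range(4)]
    schools.append((f"School {i:02d}", metrics))

def top10(weights):
    """Rank schools by a weighted sum of metrics; return the top-10 set."""
    ranked = sorted(schools,
                    key=lambda s: sum(w * m for w, m in zip(weights, s[1])),
                    reverse=True)
    return {name for name, _ in ranked[:10]}

base = top10([0.4, 0.3, 0.2, 0.1])          # one arbitrary weighting
for trial in range(5):
    alt = top10([random.random() for _ in range(4)])  # random positive weights
    print(f"trial {trial}: overlap with base top 10 = {len(base & alt)}/10")
```

Of course, as others in this thread would point out, agreement among correlated metrics shows consistency, not validity; it can’t tell you whether any of the metrics actually measures educational quality.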


BW seems to think I am under some obligation to address each of his points in turn, and I missed this one above. The answer is: I really don’t know. I’m not too familiar with PS and not at all familiar with Clemson. They are two public universities in different states; I don’t imagine two such schools causing much cross-admit anguish. If I wanted to dig into this issue, I’d probably start with the USNWR site :) or Google the individual CDS files for each one.


Well, they use the CDS as inputs, but you made it sound like you thought they generated and owned the CDS. Anyone can generate a crappy formula based on input information. The fact that a method uses factual information as an input in no way makes it a valid or reasonable way to evaluate educational quality. BTW, can you point me to the “reputation survey” portion of the CDS?


There’s no other way to characterize the factors US News uses. Not one of them tells us anything directly about whether a kid learned anything or not. If you want them to be anything other than weak proxies, you have to dispute that, and of course you can’t.

The only one that remotely approaches telling us anything about learning is graduation rate, since we presume someone who graduates must have learned something. But graduation rate is a lousy measure of educational quality, because the primary reason students fail to graduate is economic, not educational. So rich schools with lots of rich students score well on this weak proxy measure because their students aren’t forced to drop out of school by finances.

http://www.prweb.com/releases/2012/2/prweb9165792.htm


Ranking colleges is itself an asinine activity. The fact that Forbes and USNWR agree at the tippy top and not elsewhere should be alarming to you rather than comforting.


Of course you don’t know. You stated categorically that there are “significant differences in quality” of education at the two schools, but you don’t actually know anything about them. All you know is that a formula created by an unqualified researcher from a failed magazine told you what to think, and so you do it.

Sheesh, bobwallace, then don’t use either the rankings or the data behind them. Who cares? Put down the magazine and walk away. No one is getting “hurt” by these rankings other than naive people who are too stupid to consider these rankings as general bands, not The Final Word. So let ’em. Someone who secretly prefers 17 but picks 9 because he is convinced that 9 is Supremely Better Because The Rankings Said So? Well, let ’em. Their problem. No one else’s.


Right. So let’s all agree that the top 10 are “too big to fail” because of many factors–in many cases, their centuries-old history, notable graduates and faculty, mind-blowing endowments, and long-standing reputation. None of them is likely to EVER drop significantly in the rankings, because their prestige begets prestige-seeking (and highly qualified) applicants and will continue to do so. The fact is, these institutions were well known to those of us who applied to college before the rankings madness began. Whether any of them is a good fit for an individual student is debatable, but I think we can agree that they are beyond reproach in a general sense (except perhaps for a humanities student considering Caltech).

Which brings me to BW’s point: what value, then, are these rankings for the 99% of colleges that high school students have in their consideration set? And how should they try to evaluate the various ranking systems themselves before trying to evaluate colleges? We are trying to teach our kids critical thinking and analysis, right?


Huh? What I wrote was hardly a categorical statement.

Much of the discussion on CC, and much of my personal experience (or family experience), is with schools closer to the top of the rankings. In my opinion, there were fairly big quality differences between the two top-5 universities I attended, on the one hand, and a 20-something university I also attended. There were very big quality differences between a top-5 LAC and a 20-something LAC-like “regional university” where I’ve also had personal experience. By quality differences, I’m talking about class sizes, faculty quality, the level of engagement including the quality of class discussion, etc. The features measured by USNWR and other rankings seem to reflect and capture those differences fairly well. Although I would say, too, that each of these schools met our needs when we needed them, and what we got out of them had a lot to do with what we put into them.

I’m open to persuasion that USNWR and other rankings are not very helpful in distinguishing the quality of fair-to-middling state universities ranked 20 (or more) places apart as you go deep into the rankings. I can find evidence, independent of the USNWR criteria, that 20-something universities like Berkeley, UCLA, and Michigan may be academically stronger than 40-something or 60-something universities like PS and Clemson. I don’t know how to independently evaluate lower-ranked schools that don’t generate much discussion on CC, that don’t have a strong record of research production in many fields, or that don’t attract very many top students from out of state. For someone shopping in that space, I would say the in-state flagship usually is the one to beat; any quality differences reflected in a 20-place (or bigger) spread may not be enough to override the cost difference.

Recently, a poster asked for safety and match recommendations to add to a list that already included UMCP, UVa, and UNC. He had very high stats and wanted to study biomedical engineering. I pointed out to him that the in-state UMCP (even though ranked lower overall than UVa or UNC) seemed to have stronger engineering programs than UNC or UVa, so it might not make sense to apply to those more expensive, more selective OOS public schools when he has a more affordable IS option, perhaps with better engineering. I also suggested (based on the USNWR biomedical engineering rankings :)) that he have a look at Michigan if he’s really interested in an OOS public school with apparently strong programs in his intended field. And I suggested Duke, Johns Hopkins, or WUSTL (instead of the Ivies) if he wanted a “reach” school with well-regarded biomedical engineering.

That’s an example of how I use the rankings. Can you suggest a better approach to help a kid in a scenario like that identify safety-match-reach alternatives? What’s your method?

But Sally, I think that attention to these rankings is very much concentrated in a certain circle of people who care passionately about impressing the neighbors, and that the vast majority of HS seniors don’t give them a second thought as they pick based on cost, distance from home, and perhaps a specific major or program. Most students aren’t looking for “the best.” They are looking for “good (enough).” A heck of a lot more kids are choosing between (say) their state flagships and directionals than are agonizing over Harvard vs. Dartmouth.