<p>Thanks. That's a good article summarizing some of the pitfalls of the USNEWS ranking system.</p>
<p>Good article, well-written, objective. CayugaRed2005 happens to know Ehrenberg, the author.</p>
<p>^^ I agree that it is well-written, but it certainly is not "objective." Indeed, many of the author's ideas are intuitive and have been discussed ad nauseam on cc, but the author provides little support for his speculations. For example:</p>
<p>
[quote]
Hard data on the cost of such PR actions does not exist, but one must wonder whether the resources...
[/quote]
</p>
<p>Wondering is nice, but it doesn't pass a null hypothesis test for a conclusion. Or,</p>
<p>
[quote]
use of the top 10% criteria may influence who institutions admit at the margin
[/quote]
</p>
<p>Yeah it may, and it may not...where's the proof, particularly since, as the author notes, more and more high schools are no longer ranking kids.</p>
<p>
[quote]
real problem is USNWR’s arbitrary assignment of weights..
[/quote]
</p>
<p>The author even admits that USNews has a panel of educator advisors that tweaks the weights in order to improve them, which, IMO, is not arbitrary at all.</p>
<p>
[quote]
Cut backs in state appropriations have led to tuitions to rise at many of these institutions.....At the same time, the institutions are increasingly pouring money into merit scholarships to attract high test-score students, leaving fewer funds available for institutional need-based financial aid.
[/quote]
</p>
<p>Perhaps true in some states (but not in Calif), but I would like to see the data behind "pouring money..." Yes, certain public colleges do chase NMSFs, but they have been chasing them for years. To really make this point, one would need to see whether a correlation exists between the merit aid policies at those publics that chase NMSFs and their USNews rankings.</p>
<p>
[quote]
The author even admits that USNews has a panel of educator advisors that tweaks the weights in order to improve them, which, IMO, is not arbitrary at all.
[/quote]
</p>
<p>Really, so is there any empirical basis for the claim that a school's peer assessment (to use but one example) accounts for 25 percent of the quality of the undergraduate education a student receives?</p>
<p>I'm not going to argue every single point with you, as I'm a bit close to the item at hand. But Ehrenberg's piece is really a discussion paper, and will obviously come with a bit of bias and subjectivity. That said, it's still a high-water mark for academics empirically addressing the issue of the U.S. News rankings.</p>
<p>
[quote]
Indeed, many of the author's ideas are intuitive and have been discussed ad nauseam on cc, but the author provides little support for his speculations.
[/quote]
</p>
<p>I was about to write the same thing. I agree with many of the writer's viewpoints but he shows little statistical evidence that US News is forcing colleges to game the system, so to speak. I think one of the reasons he neglects to use real numbers is that he doesn't want to attack specific institutions (<em>cough</em> WashU <em>cough</em>).</p>
<p>Isn't it like using only SAT scores to judge a student? There are so many factors involved in making a college decision. The USNWR ranking should just be one of them. More important is a good "fit" for the student in terms of what the student is looking for and what the overall cost will be.</p>
<p>It would be very difficult for USNWR to practice "holistic" evaluations...</p>
<p>Also, instead of using incoming scores, why not try to rank on the basis of outgoing scores?</p>
<p>I know Ehrenberg well myself. He is a very interesting man.</p>
<p>It might not work because they've already slowed their on-line edition to a crawl, but...</p>
<p>What would be really great would be for USNEWS to add a few more columns (diversity, percent frat, binge drinking rate, PhD production, etc.) and then let you customize the percentage contribution to the rank. </p>
<p>Value diversity more than estimated grad rate? No problem. Give diversity a 10% weight and zero out the estimated grad rate. Let the on-line database calculate a new ranking.</p>
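<p>To make the idea concrete, here is a minimal sketch (in Python, with invented column names, weights, and scores -- none of this is USNWR's actual data or methodology) of how such a reader-weighted ranking could be computed:</p>
[code]
# Hypothetical reader-weighted ranking. All names and numbers are invented
# for illustration; they are not USNWR's actual columns or methodology.

# Each school's metrics, already normalized to a 0-100 scale.
schools = {
    "College A": {"peer_assessment": 90, "grad_rate_estimate": 85, "diversity": 60},
    "College B": {"peer_assessment": 75, "grad_rate_estimate": 80, "diversity": 95},
    "College C": {"peer_assessment": 80, "grad_rate_estimate": 70, "diversity": 85},
}

def rank(schools, weights):
    """Sort schools by a weighted sum of their metrics, highest first."""
    total = sum(weights.values())
    scores = {
        name: sum(metrics[col] * w for col, w in weights.items()) / total
        for name, metrics in schools.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A "default"-style weighting vs. a reader who zeroes out the estimated
# grad rate and gives diversity more weight, as suggested above.
print(rank(schools, {"peer_assessment": 0.5, "grad_rate_estimate": 0.4, "diversity": 0.1}))
print(rank(schools, {"peer_assessment": 0.5, "grad_rate_estimate": 0.0, "diversity": 0.5}))
[/code]
<p>With the second set of weights the order flips, which is really the point: the ranking is largely an artifact of whichever weights the editors (or the reader) happen to choose.</p>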
<p>The twist that caught me most in the article was the effect of transfers for a place like SUNY. Transfers are one of the large gaping holes in how USNWR does its evaluations (except for freshman retention, which I think is a biased number). The numbers for transfers are not included in the calculations of student strength and graduation statistics, and there is potential for conflict here between how an institution treats its transfers and how it treats its full-time, first-time freshmen.</p>
<p>He makes mostly solid arguments, like a newer version of Gerhard Casper's criticism of US News:</p>
<p>Criticism of College Rankings - September 23, 1996</p>
<p>The errors bug me though ("who institutions admit," "Why American's Have...", etc.).</p>
<p>The USNWR rankings are fine as long as people realize that not every school is going to be a fit for everyone. For instance, U of Chicago is ranked really high, but I think most college applicants would be unhappy there.</p>
<p>Regarding objectivity, consider that he works for a university that looks pretty good in the US News rankings and is making statements in defense of competitor public universities.</p>
<p>^ That surprised me too...he's a faculty member of a top private university discussing concern for public universities losing their "public mission" if they fall into the trap of trying to play by USNWR ranking criteria...</p>
<p>^^ even more strange was when the president of Stanford praised Berkeley (and Michigan) as much as he did. =p</p>
<p>Maybe the time has come to have separate rankings, with different criteria, for private universities and public universities.</p>
<p>There is no way you can ever have a ranking system that is accurate for the majority of the population. Never can happen.</p>
<p>It is pretty amazing to me that some people value incoming freshmen's test scores over value added to students and society.</p>
<p>There is no need to separate the public and private schools. Using the changes the writer suggests, and choosing the correct percentages or formulas, my alma mater would surge to the top of the rankings. It's all in the percentages. :)</p>
<p>From the paper...</p>
<p>"The problem with the USNWR rankings lies not in its presentation of the information
on individual data elements, but in its effort to aggregate these elements into a single
index. If it stopped doing this, many of the objections that people have about its ratings
would go away. Of course, so too would the rankings;"</p>
<p>LOL</p>
<p>The author continues...</p>
<p>"the annual USNWR college issue
would begin to look more and more like other college guides.
The rankings exacerbate, but are not the major cause of the increased competition in
American higher education that has taken place over the last few decades. The real shame
is that this competition has focused institutions on improving the selectivity of their
entering first-year classes. Institutions appear to be increasingly valued for the test scores
of the students they attract, not for their value added to their students and to society.
This problem appears to be particularly acute for our public higher education
institutions at which the vast majority of American college students are educated. Cut
backs in state appropriations have led to tuitions to rise at many of these institutions.</p>
<p>As far back as 1986, I expressed the concern that the use of average faculty salaries in the faculty resource category penalized institutions located in low cost-of-living areas that did not have to offer high salaries to attract high quality faculty. USNWR quickly responded to my concern by deflating an institution’s average faculty salaries by an area cost-of-living index and using this measure in its ratings formula.</p>
<p>At the same time, the institutions are increasingly pouring money into merit scholarships to attract high test-score students, leaving fewer funds available for institutional need-based financial aid. More and more students from low-income families find that attendance at two-year public institutions is the only way that they can begin their higher education careers.</p>
<p>The public 4-year institutions need to remember their responsibilities to provide access to a broad range of citizens of their states. They and their private counterparts also need to do a better job of facilitating the transfer of students from 2-year institutions and of improving the academic success rates of students who do transfer to them.</p>
<p>USNWR could contribute to helping these things occur by incorporating additional data elements into its rankings methodology. Public institutions (at the least) should be given “credit” for enrolling (and graduating) students from lower-income and disadvantaged backgrounds. Given the large and growing importance of transfer student enrollments at most institutions, institutions should be required to provide information on transfer student success that is analogous to the 6-year graduation rate data for freshman and the two success rates weighted by the proportions of new students that enroll in each category to help judge how well an institution is performing on this dimension."</p>
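<p>For what it's worth, the "two success rates weighted by the proportions of new students" the author proposes is just a weighted average. A minimal sketch, with made-up numbers (not data for any real institution):</p>
[code]
# Hypothetical blended success rate, in the spirit of the author's proposal.
# All figures are invented for illustration only.

freshman_grad_rate = 0.82     # 6-year graduation rate, first-time freshmen
transfer_success_rate = 0.74  # analogous success rate for transfer students
share_freshmen = 0.70         # proportion of new students entering as freshmen
share_transfers = 0.30        # proportion entering as transfers

blended = share_freshmen * freshman_grad_rate + share_transfers * transfer_success_rate
print(f"Blended success rate: {blended:.1%}")  # 79.6%
[/code]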
<p>Note to Hawkette:</p>
<p>"high quality publics--Berkeley, Michigan, North Carolina, and Wisconsin".</p>
<p>
[quote]
my alma mater would surge to the top of the rankings...
[/quote]
</p>
<p>Actually, if one goes back to the first rankings, several publics were highly rated, with some positioned in the top 10. Of course, the editors quickly realized that such a list is not good for magazine sales, particularly in the NE (which has a paucity of high-ranked publics and, of course, the bluest of blue-blood colleges). Presto -- change the criteria, and the publics dropped like a rock. Correlation....causation...hmmmmm.</p>