NYT: "Rank Colleges Right"

<p><a href="http://www.nytimes.com/2006/08/16/business/media/16leonhardt.html?_r=1&oref=slogin">http://www.nytimes.com/2006/08/16/business/media/16leonhardt.html?_r=1&oref=slogin</a></p>

<p>Well-written article. It rehashes a lot of the same ideas we've heard before, but it's still clear and intuitive :)</p>

<p>Even though we have indeed heard it all before, the basic message bears repeating, especially in the midst of all the media hype surrounding college admissions these days. We are constantly reassured by educators, consumer education advocates, and even the very admissions gurus who produce the rankings that college selectivity is not a proxy for educational quality, yet the flurry of media coverage surrounding this year's rankings is full of mixed messages. The truth of the matter is that a great many people do care about these rankings and about whether college X, Y, or Z went up or down however many places. (If in doubt, just take a quick peek at the related CC threads.) Dub a list of popular and highly selective schools "New Ivies" instead of "academic shrines," "buried treasures," or "hot picks" (all terms used in the past by Newsweek-Kaplan for basically the same list of top USNWR-ranked HEIs), and it turns heads.</p>

<p>"Rank Colleges, but Rank Them Right" makes good food for thought.</p>

<p>
[quote]
Measuring how well students learn is incredibly difficult, but there are some worthy efforts being made. Researchers at Indiana University ask students around the country how they spend their time and how engaged they are in their education, while another group is measuring whether students become better writers and problem solvers during their college years.</p>

<p>As Mr. McPherson points out, all the yardsticks for universities have their drawbacks. Yet parents and students are clearly desperate for information. Without it, they turn to U.S. News, causing applications to jump at colleges that move up the ranking, even though some colleges that are highly ranked may not actually excel at making students smarter than they were upon arrival. To take one small example that’s highlighted in the current issue of Washington Monthly, Emory has an unimpressive graduation rate given the affluence and S.A.T. scores of its incoming freshmen.</p>

<p>When U.S. News started its ranking back in the 1980’s, universities released even less information about themselves than they do today. But the attention that the project received forced colleges to become a little more open. Imagine, then, what might happen if a big foundation or another magazine — or U.S. News — announced that it would rank schools based on how well they did on measures like the Indiana survey.</p>

<p>The elite universities would surely skip it, confident that they had nothing to gain, but there is a much larger group of colleges that can’t rest on a brand name. The ones that did well would be rewarded with applications from just the sort of students universities supposedly want — ones who are willing to keep an open mind and be persuaded by evidence.

[/quote]
</p>
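<p>The Emory example in the quote rests on a simple calculation: predict each school's graduation rate from its inputs (incoming SAT scores, affluence), then see which schools beat or trail the prediction. Here is a minimal sketch of that residual analysis in Python; every number in it is invented for illustration, not real data on any school.</p>

<p>
[code]
# Sketch of the "expected vs. actual graduation rate" comparison implied
# by the Washington Monthly example. All figures below are hypothetical.
import numpy as np

# Per-school inputs: median SAT of the incoming class, percent of students
# on Pell grants (a rough affluence proxy), and the actual grad rate.
sat = np.array([1280, 1350, 1410, 1190, 1300], dtype=float)
pell_pct = np.array([22, 15, 12, 35, 20], dtype=float)
actual_grad = np.array([0.82, 0.88, 0.86, 0.71, 0.90])

# Fit a linear model: grad_rate ~ SAT + Pell share.
X = np.column_stack([np.ones_like(sat), sat, pell_pct])
coef, *_ = np.linalg.lstsq(X, actual_grad, rcond=None)

predicted = X @ coef
residual = actual_grad - predicted  # positive = outperforming its inputs

for i, r in enumerate(residual):
    print(f"school {i}: predicted {predicted[i]:.2f}, "
          f"actual {actual_grad[i]:.2f}, residual {r:+.2f}")
[/code]
</p>

<p>A school like the Emory of the article would show up with a negative residual: a lower graduation rate than its inputs predict.</p>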

<p>US News does a good job of ranking colleges and programs. They provide a valuable service to consumers of higher education. The Ivory Tower with its lofty ideals has an unsavory economic underbelly. Colleges are businesses with elaborate marketing machines designed to maximize revenues. They present an idealistic face to the public but the Finance and Budgeting Office really pulls the strings. US News is on the side of consumers who have to navigate their way through the higher ed enterprise. </p>

<p>The article mentions the argument that rankings should be based on performance, not reputation. Reputation IS an indication of performance, maybe the best single indicator of performance available. For the most part, reputations in higher ed are earned and must be maintained through continuous performance.</p>

<p>The article's claim that US News rankings are based mostly on reputation is misleading for another reason: the Peer Assessment factor accounts for only 25% of the ranking.</p>
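<p>For concreteness, here is roughly how such a weighted composite works. Only the 25% peer assessment weight comes from the discussion above; the other factor names and weights are placeholders I made up, not US News's actual formula.</p>

<p>
[code]
# Illustrative weighted-composite ranking. Only the 25% peer assessment
# weight is taken from the thread; the rest are invented placeholders.
WEIGHTS = {
    "peer_assessment": 0.25,
    "graduation_retention": 0.25,
    "faculty_resources": 0.20,
    "selectivity": 0.15,
    "financial_resources": 0.15,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9

def composite_score(factors):
    """Combine 0-100 factor scores into a single weighted score."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

# Even a perfect peer assessment moves the composite by at most 25 points:
print(composite_score({
    "peer_assessment": 100.0,
    "graduation_retention": 60.0,
    "faculty_resources": 60.0,
    "selectivity": 60.0,
    "financial_resources": 60.0,
}))  # 70.0 rather than 60.0
[/code]
</p>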

<p>The Indiana U approach of asking students how engaged they are in their education seems very soft and nebulous. The companion effort to measure whether colleges make students better writers and problem solvers strongly favors colleges with the weakest students, who have the most room for improvement.</p>
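<p>The "room for improvement" objection is easy to see with a toy calculation. In the sketch below, every entry and exit score is invented; the point is only that raw gains reward a low starting point, while a ceiling-aware measure need not.</p>

<p>
[code]
# Toy illustration of why raw gain scores favor schools whose incoming
# students are weakest. All numbers are invented.
cohorts = {
    "selective":   {"entry": 85, "exit": 90},   # little headroom left
    "unselective": {"entry": 55, "exit": 70},   # lots of headroom
}

for name, c in cohorts.items():
    raw_gain = c["exit"] - c["entry"]
    # Share of the remaining headroom actually closed: a crude
    # ceiling-aware alternative to the raw gain.
    headroom_closed = raw_gain / (100 - c["entry"])
    print(f"{name}: raw gain {raw_gain}, headroom closed {headroom_closed:.0%}")
[/code]
</p>

<p>The unselective school posts three times the raw gain (15 points vs. 5), yet both schools closed exactly a third of the headroom their students had left.</p>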

<p>The article mentions a criticism that students don't leave college much smarter than they were when they entered. This shouldn't come as a surprise. Intelligence is roughly half genetic and half a product of how parents raise their children. What colleges do is refine the raw material with improved thinking skills and knowledge.</p>

<p>One of the most important things a college does is attract and retain the best students and faculty it can. Colleges can't manufacture Nobel Prize winners like cars on an assembly line, but they create an environment for growth by assembling the best student body and faculty possible. Many qualities of a college follow from its selectivity, tradition, and culture. Beyond attracting better students and faculty to begin with, the things a college does make only a small difference. If I had to put a number on it, I would say that the people (students and faculty) are 90% of what makes a college tick.</p>

<p>Collegehelp, I think you put your finger on one of the finer points of the NYT article. How exactly does USNWR, or any other ranking system, put a number on what makes a college tick? How does it determine what constitutes academic quality in terms of the "best students and the best faculty," that is, the dynamic interaction of the people who make up these HEIs? It is a recognized trend that colleges and universities now compete for students in a national, rather than local or regional, market. With this change in market structure, ranking systems have not only gained popularity but now bear an increased burden of responsibility, as parents and students look to them to provide a valuable public service. There is really nothing wrong with our so-called "national obsession" with ranking publications as long as the rankings themselves are taken with a barrel of salt. Even USNWR issues statements to the effect that prospective students should not give inordinate weight to the rankings when choosing the HEIs they will apply to or attend.</p>

<p>While these ranking systems do provide an incredible amount of useful information for parents and students assessing HEIs in their college search, they are inherently flawed and inconsistent because the findings rest on largely arbitrary data inputs, including the rubric of "academic reputation" as judged by peers. The data on academic reputation, which carry a heavy 25% weight, can be manipulated (by PR and the media), which raises the question: what is a peer assessment of reputation based on at any given moment in time? Is it correlated with the caliber of the incoming freshman class as measured by SAT scores, with the tenured faculty, with awards won, with published research, with endowed wealth? All of these factors can be viewed as subjective and flawed in one way or another, just as much as any other factor on any ranking list.</p>

<p>
[quote]
Last week, in a report to the Education Department, a group called the Commission on the Future of Higher Education bluntly pointed out the economic dangers of these trends. “What we have learned over the last year makes clear that American higher education has become what, in the business world, would be called a mature enterprise: increasingly risk-averse, at times self-satisfied, and unduly expensive,” it said. “To meet the challenges of the 21st century, higher education must change from a system primarily based on reputation to one based on performance.”

[/quote]
</p>

<p>Wasn't the USNWR methodology severely altered after Caltech came in at #1, to the chagrin of HYPS? That tells you all you need to know.
To be fair, their ranking does give some measure of standardization across variables, which is an improvement over what must have been available twenty years ago.</p>

<p>I would say SAT scores are the best objective piece of evidence of college quality. SATs are part of the US News formula. SAT scores are correlated with many other desirable qualities of a college, including peer assessment. SAT scores are hard data and almost impossible to manipulate. They are standardized, so any college can be compared with any other college on even terms. Of course, other things matter, but the other things are probably captured by SAT scores to some extent.</p>
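<p>The "standardized, so any college can be compared on even terms" point amounts to putting every school on a common scale. A small sketch of that normalization, using median SAT figures I made up for three hypothetical colleges:</p>

<p>
[code]
# Convert hypothetical median SAT scores to z-scores so schools can be
# compared "on even terms" against the group as a whole.
import statistics

median_sat = {"College A": 1210, "College B": 1340, "College C": 1460}

mean = statistics.mean(median_sat.values())
stdev = statistics.stdev(median_sat.values())

for school, score in median_sat.items():
    z = (score - mean) / stdev
    print(f"{school}: SAT {score}, z-score {z:+.2f}")
[/code]
</p>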

<p>As an educator, I strongly disagree with the article's view of graduation rates. The attack on Emory (a school with which I am unconnected) is ridiculous.</p>

<p>Why should a school be judged by graduation rates? How are we to know if schools with near-perfect grad rates are doing a good job or simply lowering the bar for passing?</p>

<p>Having studied and taught at selective universities my whole life, I believe it is too easy to graduate from most schools. Although all good schools have various tough programs, most schools (that do not end in the words Institute of Technology) have majors and courses graded leniently enough that the very weakest students can manage to receive a diploma. Remove some of these courses or raise the bar, and it's likely that the school would improve AND the grad rate would fall.</p>

<p>The real problem is that there's no good measure of outcomes. If there were an advanced equivalent to the SAT that measured what was learned in college (say in writing and math) and which was required of all graduates, then employers would at least know which schools did and did not do a good job of educating their students.</p>

<p>As it stands, at many a top school with easy grading, it pays for bright but relatively (notice the word) less qualified applicants to try their luck at admission. In some cases it's better to graduate in the bottom half of an Elite with grade inflation than to finish at an unknown school with great teachers and tough grading. With good outcome tests, unknown or newer schools with rigorous training could prove that their grads really know their stuff. Sadly, grade inflation is one of the ways top schools give their students an edge over less selective schools.</p>

<p>I know that tons of students at CC will hate me for saying this, but I really wish all the elites were designed so that students on the border of being accepted (the marginal acceptees) who didn't work hard enough faced a high risk of flunking out.</p>

<p>I don't think the SAT is all that foolproof a measure of intellectual achievement. It is designed to gauge potential to learn, and it has shown a strong statistical correlation with learning. However, I remember reading a New Yorker article some years back about the SAT and how ambitious, resourceful Brooklyn parents got their kids to improve their scores and enter prestigious colleges. So while the SAT may indeed measure learning ability, it probably also measures the extent of parents' involvement in their child's education.</p>