Washington Monthly College Rankings Out

<p>The Washington Monthly released its college rankings this week, an event that will go virtually unnoticed amid all the hoopla surrounding U.S. News's.</p>

<p><a href="http://www.washingtonmonthly.com/features/2007/0709.rankings.html">http://www.washingtonmonthly.com/features/2007/0709.rankings.html</a></p>

<p>Their rankings are quirky and sui generis. They attempt to rate colleges' contributions to social mobility, overall research, and commitment to service. The ranking numbers themselves don't seem very useful, and some of the data looks like it wasn't checked very carefully, but the individual data points can be interesting.</p>

<p>Washington Monthly is one of my favorite publications. It has a relatively small circulation compared with USN&WR and appeals to a different segment of the population. These rankings may not be meaningful to many, but they serve their own purpose. Unlike USN&WR, the magazine invites you to "kvetch" about its methodology!</p>

<p>"BACK TO SCHOOL....U.S. News & World Report publishes its university rankings every year, and every year people complain about them. So starting in 2005 we decided to do more than just complain, and instead came out with our own rankings — based not on reputation or endowment size, but rather on how much of a contribution each university actually makes to the country. This year's #1 school? Texas A&M. Washington Monthly editor Paul Glastris explains:</p>

<p>Surely, you might ask, we don't really think that Texas A&M is better than Princeton? Well, yes, in a way. Remember, we aren't trying, as U.S. News does, to rate how selective or academically prestigious a given school is, but rather how much it contributes to the common good. The whole point is to recognize the broader role colleges and universities play in our national life and to reward those institutions that best fulfill that role. After all, almost every major challenge America now faces — from stagnant wages to the lack of fluent Arabic speakers in the federal government — could be met in part by better harnessing the power of our colleges and universities.</p>

<p>So instead of measuring, say, the average SAT scores of incoming freshmen, or the percentage of alumni who donate money, we rank colleges based on three criteria: social mobility, research, and service. In other words, is the school recruiting and graduating low-income students? Is it producing PhDs and cutting-edge research? And is it encouraging in its students an ethic of service? By this yardstick, Texas A&M really does outperform every other university in America (a nose ahead of UCLA and UC Berkeley)."</p>

<p>Well, I guess it's nice to see my alma mater, UC Davis, make the top 10 in some ranking besides vet schools. But somehow I don't think this annual list is ever going to generate quite the buzz that USNWR does.</p>

<p>I'm sure that there is much to pick apart in this ranking list, just as there is in US News. One thing that jumped out at me is the claim that WM is trying to determine if the college is "encouraging in its students an ethic of service." How does WM determine that? By Peace Corps enrollment! It's just silly to assume this is the only (or best) measure of an ethic of service.</p>

<p>From the Editors--A Note on Methodology</p>

<p>"There are two primary goals to our methodology. First, we considered no single category to be more important than any other. Second, the final rankings needed to reflect excellence across the full breadth of our measures, rather than reward an exceptionally high focus on, say, research. All categories were weighted equally when calculating the final score. In order to ensure that each measurement contributed equally to a school’s score in any given category, we standardized each data set so that each had a mean of zero and a standard deviation of one. The data were also adjusted to account for statistical outliers. No school’s performance in any single area was allowed to exceed three standard deviations from the mean of the data set. Thanks to rounding, the top few schools in each category have a final score that is displayed as 100. We have ranked them according to their pre-rounding results. </p>

<p>Each of our three categories includes several components. We have determined the community service score by measuring each school’s performance in three different areas: the size of its Army and Navy Reserve Officer Training Corps programs relative to the size of the school; the percentage of its alumni currently serving in the Peace Corps; and the percentage of its federal work-study grant money spent on community service projects. </p>

<p>A school’s research score is also based on three measurements: the total amount of an institution’s research spending (according to the National Science Foundation); the number of science and engineering PhDs awarded by the university; and the number of undergraduate alumni who have gone on to receive a PhD in any subject. For national universities, we weighted each of these components equally to determine a school’s final score in the category. For liberal arts colleges, which do not grant doctorates, baccalaureate PhDs were given double weight. As some readers pointed out last year, our research score rewards large schools for their size. This is intentional. It is the huge numbers of scientists, engineers, and PhDs that larger universities produce, combined with their enormous amounts of research spending, that will help keep America competitive in an increasingly global economy. </p>

<p>The social mobility score is more complicated. We have data that tells us the percentage of a school’s students on Pell Grants, which is a good measure of a school’s commitment to educating lower-income kids. But, while we’d also like to know how many of these students graduate, schools aren’t required to track those figures. Still, because lower-income students at any school are less likely to graduate than wealthier ones, the percentage of Pell Grant recipients is a meaningful indicator in and of itself. If a campus has a large percentage of Pell Grant students—that is to say, if its student body is disproportionately poor—it will tend to diminish the school’s overall graduation rate. </p>

<p>We have a formula that predicts the graduation rate of the average school given its percentage of Pell students and its average SAT score. (Since most schools only provide the 25th percentile and the 75th percentile of scores, we took the mean of the two.) Schools with graduation rates that are higher than the “average” school with similar stats score better than schools that match or, worse, undershoot the mark. Four schools had comparatively low Pell Grant rates and comparatively high SAT scores, and had predicted graduation rates of over 100 percent. We left these results alone to keep the metric consistent. In addition, we used a second metric that predicted the percentage of students on Pell Grants based on SAT scores. This indicated which selective universities (since selectivity is highly correlated with SAT scores) are making the effort to enroll low-income students. The two formulas were weighted equally."</p>
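<p>To make the mechanics above concrete, here is a minimal Python sketch of the standardization, the three-standard-deviation cap, and the equal (or, for LACs, doubled) weighting the editors describe. The function names, weight layout, and sample figures are my own assumptions; WM does not publish its code:</p>

[code]
import numpy as np

def standardize(values, cap=3.0):
    # Give the data set a mean of zero and a standard deviation of
    # one, then clip so no school exceeds `cap` SDs from the mean.
    z = (values - values.mean()) / values.std()
    return np.clip(z, -cap, cap)

def research_score(spending, phds, bacc_phds, liberal_arts=False):
    # Combine the three (already standardized) research components.
    # National universities: equal weights. Liberal arts colleges
    # grant no doctorates, so that component drops out and
    # baccalaureate-origin PhDs count double, per the editors' note.
    w = np.array([1.0, 0.0, 2.0]) if liberal_arts else np.ones(3)
    return (w @ np.array([spending, phds, bacc_phds])) / w.sum()

# Hypothetical research spending for five schools, in $ millions
# (illustrative numbers only, not Washington Monthly's data).
spending_z = standardize(np.array([850.0, 420.0, 390.0, 120.0, 45.0]))
print(spending_z)
print(research_score(spending_z[0], 1.1, 0.7))  # a national university
[/code]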

<p>Kvetch away...</p>

<p>So, are they using the same methodology to generate the lists of universities and LACs? Because I don't think LACs are "producing PhDs" and they tend to be low on the "cutting-edge research" also. (Yes, I know LACs produce FUTURE PhDs at a fantastic rate, but they aren't granting the degrees.) </p>

<p>And StickerShock, I agree that the Peace Corps enrollment thing is a pretty weak measurement... and it also has to do with the kinds of students the school attracts more than it has to do with the school "encouraging in its students an ethic of service."</p>

<p>I'm not sure who this list is intended for.... any one of the three criteria might be of interest, but all three together don't make much sense to me.</p>

<p>Oh I see. BAfromBC's post answers some of my questions, though not in a manner that makes me think these rankings are of much use.</p>

<p>i think its extremely important to have alternative rankings challenging the 'bible' that is the usnews</p>

<p>What I found particularly entertaining about this ranking was that its predicted graduation rates for some of the schools on the list were over 100%. Creating "predicted" graduation rates has always been one of the less convincing parts of US News, but at least they realized that graduation rates couldn't be higher than 100%.</p>

<p>The other thing to note here is that these rankings are biased by size (particularly in the research category). There's nothing wrong with that if you are measuring total contribution to the nation (after all, a larger school contributes more), but when looking at some of the specific measures (say, the bachelor's-to-PhD rank) it's important to realize that these are total numbers rather than numbers adjusted by class size.</p>
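<p>For instance, a simple per-capita adjustment, which is not part of WM's method, would look like this (illustrative numbers only):</p>

[code]
# Totals reward size; dividing by enrollment gives a per-capita view.
bacc_phds  = {"Big State U": 900, "Small College": 90}
enrollment = {"Big State U": 30000, "Small College": 1500}

per_1000 = {school: 1000 * bacc_phds[school] / enrollment[school]
            for school in bacc_phds}
print(per_1000)  # {'Big State U': 30.0, 'Small College': 60.0}
[/code]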

<p>I actually think that the predictor of graduation rate FOR PELL GRANT STUDENTS is a superb indicator of actual quality, because it specifically looks at the "value added" offered by the college, rather than the quality of the class going in. (They would do even better by cross-calculating PhD production against Pell recipients, which would also be a measure of "value added." Comparatively fewer Pell students go on to PhDs mainly because their cultural and economic horizons are lowered - they often either feel like they have to go earn money immediately or see education as a means to an end rather than an end in itself. A quality college or university can change that.)</p>
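<p>Concretely, the suggested cross-calculation would be something like the sketch below; the figures are hypothetical and WM does not compute this:</p>

[code]
# PhD production specifically among Pell recipients, as a rough
# "value added" measure (made-up numbers for one school).
pell_graduates = 400   # Pell Grant recipients who earned a bachelor's
pell_phds      = 28    # of those, alumni who went on to a PhD
print(f"{pell_phds / pell_graduates:.1%}")  # 7.0%
[/code]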

<p>mini, perhaps I misunderstood your point, but it seems that they are predicting the overall graduation rate at each university based solely on the number of Pell Grantees there, and that whatever algorithm they use produces a predicted graduation rate of over 100% for some schools with low numbers of Pell Grantees. (Incidentally, I think simply measuring the percentage of students on Pell Grants is very interesting, and, in some cases, including that of my own school, embarrassing.) They aren't measuring the rate at which Pell Grant students specifically graduate at each school (though that might be interesting).</p>

<p>They used the best available surrogate:</p>

<p>"We have a formula that predicts the graduation rate of the average school given its percentage of Pell students and its average SAT score. (Since most schools only provide the 25th percentile and the 75th percentile of scores, we took the mean of the two.) Schools with graduation rates that are higher than the “average” school with similar stats score better than schools that match or, worse, undershoot the mark. Four schools had comparatively low Pell Grant rates and comparatively high SAT scores, and had predicted graduation rates of over 100 percent. We left these results alone to keep the metric consistent. In addition, we used a second metric that predicted the percentage of students on Pell Grants based on SAT scores. This indicated which selective universities (since selectivity is highly correlated with SAT scores) are making the effort to enroll low-income students. The two formulas were weighted equally."</p>

<p>Actually, what they did was even stronger, as they didn't confine the numbers to Pell Grantees, but looked at the actual relationship between entering SAT scores (a partial surrogate for income) and graduation rates.</p>

<p>It's nice to see that Wesleyan receives more research grant money than any research university, and more than 10 times the amount of Harvard. ;-)</p>

<p>I hope WM has some erasers on their pencils.</p>

<p>And even more suspect than the apparent millions-for-thousands typo is the fact that, supposedly, no one at Amherst brought in any research money. It does make you wonder--at least U.S. News is pretty decent at fact-checking.</p>

<p>Those rankings fall in the same category as The Onion's advice on selecting a college.</p>

<p>[quote]Choosing A College</p>

<p>April 18, 2001 | Issue 37•14 </p>

<p>The college years are a pivotal time in a person's life, not to mention a major financial investment. Here are some tips to help you choose the right school: </p>

<p>You can never go wrong choosing a college you saw advertised on public transportation. </p>

<p>There are many fine single-sex colleges where the emphasis is squarely on academics. Attend one of these only if you are a homosexual. </p>

<p>Examine the school's official crest. If it has a big pot leaf in the center, you are on the right track. </p>

<p>Find a college that will nurture your talents. For example, if you have an aptitude for dressing up in drag, penning witty quatrains, and awarding celebrities prizes as a way to draw attention to yourself, you may want to consider Harvard. </p>

<p>If you fail to get accepted at a good school, you have brought shame upon not only yourself, but also your entire family. Committing ritual seppuku is the only way to save face. </p>

<p>Schools that boast about their outstanding academic reputation are probably insecure about their inadequacies in other areas. </p>

<p>The Armed Forces Scholarship Program is a great way to earn a considerable amount of money toward college, but it has a small "joining the goddamn army" downside. </p>

<p>When consulting Playboy's annual party-school rankings, be sure to look closely at the students-per-hot-tub ratio. </p>

<p>Be wary of colleges where the chair of the history department keeps using the phrase "olden times." </p>

<p>If you are having a hard time deciding between Princeton and Yale, cry me a ****ing river, Fauntleroy. </p>

<p>Avoid colleges where the previous year's commencement speaker was Burt Ward. </p>

<p>College? Aw, man, what are you thinking about college for? You're the best metal guitarist in Winneshiek County.

[/quote]
</p>

<p>I am not sure what is more entertaining ... reading the report, reading the editors' attempts to pretend there was a scientific method behind this "stuff", or seeing that some people actually believe parts of it. </p>

<p>Last year, the title of this "ranking" was: Is Our Students Learning? The measurements elite colleges don't want you to see. </p>

<p>Maybe, by next year, those sharp editors might figure out that their "ranking" should be titled:</p>

<p>The measurements college applicants don't want you to see!</p>

<p>The Onion article is a hoot!</p>

<p>Wow, had no idea Wes was so rich! We shoulda been paying zero tuition, with that kind of money to throw around. Four times as much as Michigan. Who'da thunk? I also question them being high on the ROTC ranking. </p>

<p>Meanwhile, five of the top LACs apparently get 0 research grants.</p>

<p>And again, the predicted grad rates of over 100 percent definitely have a Lake Wobegon quality to them.</p>

<p>I don't mind seeing other rankings, if they would 1) acknowledge that their methodology is totally arbitrary (not what they consider is important, but the randomness of how they measure it) and 2) not be so sloppy.</p>

<p>I take this the same way as I take the US News rankings - additional data with variable relevance.</p>

<p>Marathonman wrote:</p>

<p>[quote]And even more suspect than the apparent millions-for-thousands typo is the fact that, supposedly, no one at Amherst brought in any research money. It does make you wonder--at least U.S. News is pretty decent at fact-checking.[/quote]</p>
<p>Well, if they were only measuring NSF grants, it would not be all that surprising. It doesn't mean that Amherst doesn't attract its share of research grants (Hughes Foundation, Alfred P. Sloan, etc.) just not fed money and just not that particular year.</p>

<p>Speaking of rankings, did anyone else watch Gwen Ifill's interview last night, on PBS, with Brian Kelly (USNWR) and Lloyd Thacker (xiggi's favorite person)?</p>

<p>I was disappointed that both seemed to talk around the questions being asked of them. Kelly just kept saying his publication was great; Thacker kept saying it should be different. But when asked, neither stated how these publications should change, what they should measure, or how they should be organized.</p>