<p>I like using the WM rankings over USNWR because of the methodology. They use information such as social mobility, research, and service data from the Peace Corps and ROTC to compile their rankings.</p>
<p>Take a look at Washington Monthly. They offer a lot of extra information that other ranking sources such as QS, THE, and USNWR do not.</p>
<p>Always note the methodology when you're looking at rankings, guys. Rank isn't everything, but it's a good indicator of where colleges and universities stand.</p>
<p>Pell grant students just measures how many poor students go there; Net Price doesn’t matter depending on the amount of aid given; research expenditures depend on endowment; Science and Tech PhDs awarded depend on size; Peace Corps participation depends on the type of students a school attracts, as does ROTC. The rest are fairly decent measurements, but none of those reflect on the school itself. I mean, the cost of the school? How many kids with Pell grants go there? Peace Corps and ROTC? What do those have to do with the school? The only “rankings” that really matter come from looking through the CDS for the traits you want, i.e. high graduation rate, people like you, graduate school placement in your desired field, etc. I’m sorry, but I think Washington Monthly is trying too hard to make what is essentially a “hip” schools list. I’m just surprised the number of LEED-certified buildings isn’t on there. I can understand that this ranking would be good for some people (for example, a poor student who wants to be in the military after getting a science degree), but it’s not a very broad ranking.</p>
<p>The Washington Monthly publication is widely regarded as one of the best rankings out there because it branches out in terms of methodology. USNWR’s survey system is very, very subjective and holds little value for students. The response rate for US News surveys is also very low.</p>
<p>Washington Monthly is an excellent way for students to learn more about certain measures that appeal to them.</p>
<p>It’s regarded because it’s unique; that doesn’t mean it’s useful. Unless you fit the brand of student they’re catering to (the Peace Corps, ROTC, and Pell grant rankings cater to very specific students), it holds no weight, along with the rest of the rankings. Rankings are pointless no matter how they’re done. The only way rankings would matter is if they applied to nearly everyone (i.e. graduation rate, retention rate, etc.) and were objective. Even then, the ranking would still be fairly pointless.</p>
<p>They aren’t rankings in the traditional sense. WM offers a unique look in detail that the other ranking publications do not. Obviously MSU is not a better school than Columbia University.</p>
<p>Unique and pure junk describe the Mother Teresa rankings. Take a look at the school ranked just below Harvard: UTEP is one of the worst state schools. WM rewards it because it exceeds its predicted 30 percent graduation rate and is known for its ability to milk Pell grants.</p>
<p>Yep, that's a really helpful ranking. /sarcasm</p>
<p>Hey, this is America. Pick and choose from any rankings, parts of rankings - or NO rankings - that matter to YOU. </p>
<p>One feature I like about the WM web site is the power to click-sort on individual columns. For me, the “research” column is generally of greatest interest. I’m mildly interested in the Peace Corps stack-up, but not very interested at all in the other “service” columns. </p>
<p>The “predicted grad rate” v. “actual grad rate” comparison does result in some pretty weird effects (penalizing schools that start out with the highest “predicted” rates). Other than that, I don’t see how this ranking is much junkier than any other ranking. What it offers is a lens on somebody’s idea of social benefit, though it’s not necessarily my idea. I think college professors should create and spread knowledge. That is their primary social benefit, in my opinion. The “research” columns provide a perspective on that, but one that fails to assess the knowledge-spreading that happens in individual college classrooms (in other words, teaching quality, which may be affected not only by faculty credentials but also by class size and student preparedness). USNWR seems to be based on better criteria to assess which schools do the best job of bringing high-stats students together with well-paid professors in small classes (which may or may not correlate very well with which schools actually do the best job of bringing highly-motivated, hard-working students together with good professors in small classes, to create the best learning experiences). </p>
<p>(“Mother Teresa rankings” is pretty apt, though!)</p>
<p>The Washington Monthly rankings are useful for looking at what they measure, primarily indicators of social mobility and service, but are not particularly useful beyond that, and like the USNWR rankings they have some glaring flaws. To take three schools with similar student bodies (and with which I’m familiar), Colby, Bowdoin and Bates: would anyone really argue that Bates is a better college than Bowdoin because the average SAT at Bowdoin is higher and the admit rate is lower than at Bates? Despite the fact that Bowdoin’s FA is at least as good as Bates’ and Bowdoin is need blind, and Bowdoin has a higher percentage of minority students, the WM ranking uses SAT scores to predict that Bates has a higher percentage of Pell grant recipients, and thus should have a lower predicted graduation rate.</p>
<p>“We first predicted the percentage of students on Pell Grants based on the average SAT score and the percentage of students admitted. This indicated which selective universities (since selectivity is highly correlated with SAT scores and admit rates) are making the effort to enroll low-income students.”</p>
<p>While it does look at COA, this ranking does not look at average indebtedness upon graduation, meaning that a college that is expensive but has great FA for low and middle income students is hurt in the rankings, while a college that is a couple of thousand dollars less expensive but provides little or no FA can skate on by with a high ranking.</p>
<p>Similarly, is Colby a better college than Bowdoin and Bates because it allows participation in UMaine’s ROTC program, a program of which very few students take advantage? And why is participation in the Peace Corps the measure of community service beyond college, rather than Teach for America, work in the nonprofit sector, or any of a dozen other measures? Since less than a quarter of one percent of college graduates end up in the Peace Corps (approx. 1,700,000 graduates, 8655 Peace Corps volunteers, committed for at least 2 years, so divide in half for approx. annual #s), any tiny differences in the data are bound to be blown out of proportion.</p>
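<p>The back-of-the-envelope math above can be sanity-checked with a quick sketch (the figures are the ones quoted in the post, not official Peace Corps statistics):</p>

```python
# Sanity check of the Peace Corps participation figure quoted in the post.
graduates_per_year = 1_700_000       # approx. annual college graduates (post's figure)
total_volunteers = 8655              # Peace Corps volunteers serving at a time (post's figure)
annual_volunteers = total_volunteers / 2  # two-year commitment, so halve for annual intake
share = annual_volunteers / graduates_per_year
print(f"{share:.4%} of graduates")   # comes out around a quarter of one percent
```

<p>Running this gives roughly 0.25%, which matches the "less than a quarter of one percent" claim to within rounding.</p>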
<p>Washington Monthly rankings are useless at face value; however, they are useful for looking at what type of students different schools attract (by sorting through the different criteria).</p>