<p>The ranking methodology must be seriously flawed if social life, location, and other non-academic aspects are weighted so heavily that UC Davis and Texas A&amp;M are ranked ahead of UCLA, Northwestern, Columbia, Georgetown, Chicago, and WUSTL.</p>
<p>Also, apparently University of Phoenix (for-profit, like ITT Tech for those who aren't familiar) is a "Featured College"</p>
<p>They are garbage. But people typically like any ranking that puts their university in a good light. It’s biased against universities in large cities (UCLA: 41; Columbia: 46) since those cities are expensive to live in. Sure, you can skew data to make Columbia come out at 46/50. That doesn’t mean that anyone will take your ranking seriously.</p>
<p>The fact that a college didn’t come out in the rankings where you thought it should have doesn’t make the methodology garbage a priori. Odd sample choices, unreliable sources of data, lack of a theoretical base, and poor presentation are all reasonable criticisms of methodology. That you don’t like the outcome isn’t.</p>
<p>^ Agreed. It’s a ranking system like many others, each of which faces its own questions of what data to use and how to weight each attribute. While you may not like their rating system, it’s not necessarily garbage. You just disagree with their criteria or weighting.</p>
<p>I’ve seen more ranking lists than I can shake a stick at, and none of them have much overlap save for, say, the top 10. Just out of curiosity, what’s the best college ranking list, in your opinion?</p>
<p>That said, both of those were literally just a reshuffling of schools ranked by US News. Were I to do it over again, I would not start with US News as my source of colleges to rank. I’d go straight to IPEDS using Carnegie characteristics.</p>
<p>lynxinsider, I’m trying to understand the rankings in the papers you cite. Is this a good summary?</p>
<p>You start with a set of inputs such as cost, 25th percentile SAT scores, percentage of full-time faculty, etc. For this set of inputs, you figure out what the optimal graduation rate should be. The universities are then scored by the ratio between the actual and optimal graduation rates.</p>
<p>To calculate the optimal graduation rate for the given inputs, you consider all possible ways to combine existing universities so that the weighted average of their inputs matches the given inputs. For each of those combinations, you calculate the weighted average graduation rate. The maximum over all of these combinations is the optimal graduation rate.</p>
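<p>The procedure described above (an output-oriented efficiency frontier, in the spirit of data envelopment analysis) can be sketched in a few lines of Python. Everything here is hypothetical: the school names and numbers are made up, and the pairwise brute-force search over convex combinations stands in for the linear program a real implementation would solve. The blend’s inputs are required not to exceed the target’s inputs, a common relaxation of the “same inputs” condition.</p>

```python
# Hypothetical sketch of the "optimal graduation rate" idea.
# Each school: (cost in $k, 25th-percentile SAT, graduation rate).
schools = {
    "A": (30, 1100, 0.60),
    "B": (50, 1300, 0.90),
    "C": (30, 1150, 0.75),
    "D": (40, 1200, 0.70),
}

def best_feasible_grad_rate(target, peers, steps=100):
    """Best blended grad rate over pairwise convex combinations of peers
    whose blended inputs do not exceed the target's inputs."""
    cost_t, sat_t, _ = peers[target]
    best = 0.0
    names = list(peers)
    for a in names:
        for b in names:
            c1, s1, g1 = peers[a]
            c2, s2, g2 = peers[b]
            for k in range(steps + 1):
                w = k / steps  # weight on school a
                cost = w * c1 + (1 - w) * c2
                sat = w * s1 + (1 - w) * s2
                # feasible: the blend uses no more resources than the target
                if cost <= cost_t and sat <= sat_t:
                    best = max(best, w * g1 + (1 - w) * g2)
    return best

def efficiency(target, peers):
    """Score = actual grad rate / best grad rate shown possible by peers."""
    actual = peers[target][2]
    optimal = best_feasible_grad_rate(target, peers)
    return actual / optimal if optimal else 1.0
```

<p>With these made-up numbers, school A sits on its own frontier (no feasible blend beats it, so its score is 1.0), while school D scores below 1.0 because a blend of B and C achieves a higher graduation rate with no more resources than D has. A real analysis would solve this as a linear program over all schools rather than a grid search over pairs.</p>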
<p>So most rankings will ask “What does this university have, in terms of students, faculty, etc, and how does that compare to what other universities have?” Your rankings instead ask “What does a university have, and how successful is it at producing college graduates given its resources?”</p>
<p>But why use graduation rates? Why not job placement, or earnings?</p>
<p>You do start with an output (in this case grad rates) and inputs (those you mentioned). </p>
<p>You don’t calculate a single optimal graduation rate - you calculate which colleges have maximized their graduation rates given the mix of inputs that they have to work with. It’s a mathematical problem with many solutions. Those colleges form a sort of boundary. Everyone else’s score is based on how far their actual graduation rate is from that boundary of what has been shown to be possible by other colleges with similar inputs.</p>
<p>You are correct about the questions the rankings ask. The idea is that Amherst or Princeton will always be at the top of conventional rankings because they have every possible resource at their disposal. But colleges with less than that sometimes do extremely well with what they have. Others do poorly with what they have. This approach separates the two groups.</p>
<p>What I like about Archibald and Feldman’s ratings is that if you’re interested in a particular school, you can see how well it does with what it has. You can also see what other schools are similar in terms of resource availability but do a superior job. It also tells you on what dimension those alternatives really shine.</p>
<p>I learned, for instance, that St. Michael’s College gets Rhodes-level graduation rates with students whose test scores are lower than what Rhodes accepts. If I were a student with lower SAT scores, that might make St. Michael’s worth looking at.</p>
<p>I must say, rjk, that I like it too. That being said, I generally don’t trust rankings. There is no way to weigh class offerings, rigor of instruction, teaching prowess of professors/TAs, social life, location, student support services, school spirit, cost, employment statistics for graduates, how many Rhodes/Marshall/Fulbright scholars are produced (I must have left something out) in an exact formula that would be satisfactory to every single student. The “best” college is one that is the best for a particular student.</p>
<p>The methodology is just incredibly arbitrary. To be considered for this top 50 list, you must have appeared in the top 50 of one of our competitors. LOL really?</p>