<ol>
<li>Harvard University</li>
<li>Yale University</li>
<li>Princeton University</li>
<li>University of Chicago</li>
<li>Brown University</li>
<li>Columbia University</li>
<li>California Institute of Technology</li>
<li>Stanford University</li>
<li>Northwestern University</li>
<li>Dartmouth College</li>
<li>Boston College</li>
<li>University of Pennsylvania</li>
<li>Southern Methodist University</li>
<li>Cornell University</li>
<li>Duke University</li>
<li>University of Notre Dame</li>
<li>Massachusetts Institute of Technology</li>
<li>Johns Hopkins University</li>
<li>Wake Forest University</li>
<li>Emory University</li>
</ol>
<p>^ It's based on their nutty methodology (from the same guy--sorta--who brought us the flat tax :) ):</p>
<p>
[quote]
Our measures begin with student evaluations posted on Ratemyprofessors.com, a nine-year-old site with 6.8 million student-generated evaluations. We look at college graduation rates (as does U.S. News). We also calculate the percent of students winning awards like Rhodes Scholarships and undergraduate Fulbright travel grants. For vocational success we turn to Who's Who in America. Though imperfect, it is the only comprehensive listing of professional achievement that includes undergraduate affiliations.
[/quote]
</p>
<p>After saying, "I think the U.S. News rankings ought to get a D", Vedder proceeds to use the USNWR rankings to determine the list of schools he will rank. Notice that he only ranks USNWR's top two tiers of schools in each category and leaves out third-tier schools. If a school such as Samford on the Nat Univs. list can move from 118 (USN) to 27 (CCAP), then it stands to reason that some third-tier schools might well move up into the top 100, or top 50. The same is true for the LAC list where Bennington jumped from 106 (USN) to 33 (CCAP).</p>
<p>While imperfect, maybe the CCAP can be improved upon by Vedder or others. At least it's a start. Unfortunately, Forbes has the same motive that USNWR does in putting together these rankings: selling magazines.</p>
<p>I have to say, it is good to finally see some of the nation's most underrated but excellent schools, namely Brown, U of C, Northwestern, ranked on par with the big names.</p>
<p>Check here for the complete article and links to LAC and public rankings as well.</p>
<p>Forbes isn't always the most reliable source in the world. That said, this list was actually compiled by CCAP, a completely separate entity. </p>
<p>Anyway, I like it. I think it is kind of refreshing to finally see rankings based on the result in students' lives (their happiness, their success, etc.), rather than $$$-and-power-based rankings.</p>
<p>Except for SMU, the rest of the rankings look very plausible. They shouldn't be discounted just because of one outlier. Maybe the students at SMU are told to only post good ratings on ratemyprofessors, or maybe a graduate there invented the Who's Who list so lots of students try to get on it. There could be a whole host of reasons for SMU to be up that high in the rankings that don't discount the rankings as a whole.</p>
<p>Um... there is no reason to cram for the ACT. With a 21-27 score, you can get into the Great University of Alabama, which is just below Michigan but a notch above UDub, UT-Austin, Georgia and BU... and way ahead of Wisconsin, USC and UIUC.</p>
<p>Good to see the junior UCs being put in the rightful places, rounding out the top 100 ... refreshing indeed.</p>
<p>Why are people getting so offended by these rankings? They're not the final word. They're just a glimpse at how rankings would be if they looked at alternative factors. </p>
<p>At Alabama, students may really love their professors, graduate in 4 years, enjoy the benefit of many scholarships and awards, and have an easy time finding jobs after graduation. Does that make it a better school than UCSD? No! It just means that Alabama performs better in these few categories.</p>
<p>UCSD likely enjoys higher funding, more selective entrance, and more famous faculty. Does that make UCSD the better school? No, not necessarily. It just means that UCSD performs better in these few categories. </p>
<p>This new rating isn't claiming to be perfect - it's just there to serve as an alternative for those who care more about the former factors than the latter.</p>
<p>Well put. This is just a rating based on externally verifiable quantitative factors. If there's an outlier or two in the list, so what? Is this any worse than USNWR's mailed-out peer assessment surveys? Or USNWR's arbitrary weights? I think not.</p>
<p>Think of it this way: To get rankings that "look right", the ranker (e.g.) USNWR must manipulate the factor weightings in an arbitrary way to get to the list wanted. I'm sure USNWR does this now, which is one reason to take their lists with a grain of salt. To ask others to insert more arbitrariness in the process is not reasonable.</p>
<p>It looks like a useless ranking to me, like any other ranking based on "student satisfaction". Besides, a ranking that uses "Who's Who" to measure alumni success cannot possibly be serious!</p>