<p>Fairly interesting rankings from the Center for College Affordability & Productivity (CCAP), a two-year-old research organization in Washington, D.C. with a free-market bent. ("We evaluate colleges on results. Do students like their courses? How successful are they once they graduate? In short, we review the meal.")</p>
<p>Rankings are based on:
1. Student evaluations posted on Ratemyprofessors.com (a nine-year-old site with 6.8 million student-generated evaluations)
2. College graduation rates
3. Percent of students winning awards like Rhodes Scholarships and undergraduate Fulbright travel grants
4. Vocational success using Who's Who in America ("Though imperfect, it is the only comprehensive listing of professional achievement that includes undergraduate affiliations")</p>
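<p>CCAP doesn't publish its weightings, so purely as illustration, here is a minimal sketch of how the four measures above might combine into a composite score. The equal weights and the toy inputs are my own placeholders, not theirs:<br>
[code]
# Hypothetical composite: CCAP does not disclose its weights, so the
# equal weights below are placeholders for illustration only.
WEIGHTS = {
    "rmp_rating":    0.25,  # avg. Ratemyprofessors.com score, rescaled to 0-1
    "grad_rate":     0.25,  # graduation rate, 0-1
    "award_rate":    0.25,  # share of students winning Rhodes/Fulbright awards
    "whos_who_rate": 0.25,  # share of alumni listed in Who's Who in America
}

def composite_score(school):
    """Weighted sum of normalized (0-1) components; higher is better."""
    return sum(WEIGHTS[k] * school[k] for k in WEIGHTS)

# Toy inputs (invented):
example = {"rmp_rating": 0.82, "grad_rate": 0.97,
           "award_rate": 0.04, "whos_who_rate": 0.11}
print(round(composite_score(example), 3))  # 0.485
[/code]
</p>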
<p>Their rankings:
1. Harvard
2. Yale
3. Princeton
4. University of Chicago
5. Brown
6. Columbia
7. Caltech
8. Stanford
9. Northwestern
10. Dartmouth</p>
<p>They also rank National Public Universities and Liberal Arts Colleges.</p>
<p>I wish they'd give a bit more detail on methodology in the online edition. I suspect there's a lot of year-to-year variability in the award measure, so I'd like to know, for example, whether they average over a number of years.</p>
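<p>To show why averaging matters for a rare-award measure, here's a quick simulation. The Poisson model and the rate are my assumptions, not anything CCAP has published:<br>
[code]
# Why a multi-year average matters for rare awards. Assumption (mine,
# not CCAP's): a school's annual award count is roughly Poisson noise
# around its true rate.
import numpy as np

rng = np.random.default_rng(0)
true_rate = 2.0  # hypothetical school that "deserves" ~2 awards/year

single_years = rng.poisson(true_rate, size=10)
five_year_avgs = rng.poisson(true_rate, size=(10, 5)).mean(axis=1)

print(single_years)    # single years swing widely (zeros next to fives)
print(five_year_avgs)  # 5-year averages cluster much closer to 2.0
[/code]
</p>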
<p>Forbes.com recently posted an article by Richard Vedder of the Center for College Affordability & Productivity, a two-year-old research organization, that ranks the top US colleges using different criteria than USNWR does. This different sort of ranking, based on the quality of the faculty and the success of graduates, might help students and parents when considering where to apply.</p>
<p>
[quote]
Our measures begin with student evaluations posted on Ratemyprofessors.com, a nine-year-old site with 6.8 million student-generated evaluations. We look at college graduation rates (as does U.S. News). We also calculate the percent of students winning awards like Rhodes Scholarships and undergraduate Fulbright travel grants. For vocational success we turn to Who's Who in America . . . The top CCAP schools rank near the top of the U.S. News list, as the accompanying table shows. But just below the top there are some surprises.
[/quote]
</p>
<p>One would have thought that nothing could be more moronic and less scientific than the Princeton Review surveys. Then came Washington Monthly with its Mother Teresa-inspired rankings ... but this one takes the prize.</p>
<p>SMU in Dallas among the best 15 schools in the nation? What were those rankers smoking? Waco's Baylor ahead of UCLA? And, it is better not to mention the LAC list at all! </p>
<p>When do the rankings of the National Enquirer and Jerry Springer come out?</p>
<p>The consensus is that ratemyprofessors.com is unreliable. I can see how using graduation rate is going to hurt the publics pretty badly, more so than in USNWR now that it's one of only four criteria. What about the other two mentioned?<br>
[quote]
Our measures begin with student evaluations posted on Ratemyprofessors.com, a nine-year-old site with 6.8 million student-generated evaluations. We look at college graduation rates (as does U.S. News). We also calculate the percent of students winning awards like Rhodes Scholarships and undergraduate Fulbright travel grants. For vocational success we turn to Who's Who in America. Though imperfect, it is the only comprehensive listing of professional achievement that includes undergraduate affiliations. (Our complete listing of more than 200 schools can be viewed at Forbes.com.)
[/quote]
Is using "Who's Who" worse than, say, number of people working in Goldman Sachs, which certain people seem to like using over and over on CC?</p>
<p>Ranking colleges is NOT a scientific process, even when it appears that it is. There are too many intangibles. The more information a student has, the more likely he is to make the right decision when evaluating both academics and fit.</p>
<p>The problem with USNWR, as "scientific" as it may seem, is that it places too much emphasis on incoming students and not on the outgoing ones, which is the true measure of a quality education. I agree that using ratemyprofessors.com is pretty lame (okay, VERY lame), particularly since only a small percentage of students actually enter their ratings. However, the Fulbright and Rhodes scholar counts are a valid measure. Too bad they can't also measure where graduating students attend graduate school.</p>
<p>As a Texan, I thought you'd love the elevated scoring of some schools down there.</p>
<p>When you talk about strange rankings, is SMU in this any worse than Notre Dame in USNWR? Loyal alumni give ND dough in high percentages, so that makes a great school?</p>
<p>Rate My Professors, Sam Lee? Maybe any one rating is not so hot, but in aggregate you can get useful information from a site like that, as these researchers did.</p>
<p>But, without seeing their methodology - weightings, scoring and such - it is hard to know if this is garbage as Xiggi thinks or just another interesting perspective.</p>
<p>
[quote]
Now, now, Xiggi. Take a deep breath. Ranking colleges is NOT a scientific process, even when it appears that it is.
[/quote]
</p>
<p>It is not a scientific process, but when the methodology shows clear signs of using elements that are pure garbage (and one misleading item that "looks" good) as input, and when the output confirms the old GIGO saying, I'd say the validity of such an exercise is nothing but a joke.</p>
<p>Who's Who? Rate My Professors? How does a school** that stands for "Pay your fees/Collect your B's" compare to Caltech or Harvey Mudd in graduation rates? The easiest school to graduate from ought to be the best?</p>
<p>One can be different, but when you end up placing schools such as Wabash and Whitman well above Grinnell, you know something is wrong. Deeply wrong!</p>
<p>**That is SMU!</p>
<p>
[quote]
However, the Fulbright and Rhodes scholar counts are a valid measure.
[/quote]
</p>
<p>There is a world of difference between an award being incredibly prestigious and its being a valid measurement of a university's quality of education. What is the PERCENTAGE and ANNUAL representation of the Rhodes for ALL universities in the country?</p>
<p>Are all Fulbrights equal, or do you distinguish between the teaching Fulbrights and the others? Do we need to, for example, compare the performance of Brown's German Department with ... Smith's or Pitzer's?</p>
<p>Prestigious ... Absolutely! Used as a valid measure of anything but the performance, resources, and REASONS of a school to increase its Fulbright numbers ... Absolutely NOT!</p>
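<p>Run the numbers yourself. The 32-per-year figure for US Rhodes Scholars is public; the school count and enrollment below are stand-ins:<br>
[code]
# Back-of-the-envelope: how much signal can an annual Rhodes count carry?
us_rhodes_per_year = 32     # 32 US Rhodes Scholars are elected each year
four_year_colleges = 2000   # rough order of magnitude, not an exact count
undergrads = 6000           # hypothetical mid-sized school

print(us_rhodes_per_year / four_year_colleges)  # 0.016 winners/school/year
print(1 / undergrads * 100)                     # one win = ~0.017% of students
# A single winner flips a school's annual "rate" from 0.000% to ~0.017%:
# that is year-to-year noise, not a measurement of educational quality.
[/code]
</p>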
<p>"i've been getting tired of the "wall street placement" talk (as if schools' mission is all about putting people into ibanking"</p>
<p>I agree with this, Sam Lee. I see nothing wrong with selecting other criteria, if only to see how the rankings change. I find it interesting that the topmost schools are fairly consistent from list to list, with minor shuffling, but that once you get below the top five, all bets are off.</p>
<p>I'd like to see the methodology too. But the way I see it, this one is as good as any other. If nothing else, I like the attempt to use measures based on the "meal" rather than just a compilation of the ingredients, lol. Works for me just fine.</p>
<p>My point is only that while individual professor ratings may be crap, collectively the "crap" can amount to something. </p>
<p>To start, I think there is no question that overall the ratings say something about the student body. Do the overall ratings reflect more on the students or faculty members? Who knows. But does it matter? If overall the students are pounding on the faculty, then either the faculty are lousy or the students are overly demanding. Either way the school has a problem, at least with expectation matching. So just based on this I can see value.</p>
<p>But this only works where there are enough ratings that no one person or a small group can skew the results. Is that the case here? Don't know.</p>
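<p>To make that concrete, here's a toy simulation; every number in it is invented:<br>
[code]
# Toy model: individual ratings are noisy, but the mean of many
# stabilizes -- unless a small group brigades. All numbers invented.
import numpy as np

rng = np.random.default_rng(42)
TRUE_QUALITY = 3.5  # hypothetical "true" faculty rating on a 1-5 scale

def observed_mean(n_honest, n_brigade=0):
    """Mean of honest ratings (noisy, clipped to 1-5) plus brigade 1s."""
    honest = np.clip(rng.normal(TRUE_QUALITY, 1.0, n_honest), 1, 5)
    brigade = np.full(n_brigade, 1.0)  # coordinated 1-star reviews
    return np.concatenate([honest, brigade]).mean()

print(observed_mean(10))        # a handful of ratings can land far from 3.5
print(observed_mean(1000))      # a thousand ratings sit very close to 3.5
print(observed_mean(1000, 50))  # a 5% brigade drags the mean down ~0.12
[/code]
</p>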
<p>Agree it is nice to see some top schools outside the Northeast. I was also pleased to see some Southern schools do well. And I'm a Yankee...</p>
<p>I agree that it's nice to see a different mix of schools there.</p>
<p>Anytime a new ranking comes out, you'll see people whose favorite colleges got moved down dismiss it and people whose favorite colleges got moved up applaud it. People should realize that if a school is a personal favorite, it doesn't matter where study X or study Y puts it.</p>
<p>Those who are searching for a school should look at multiple rankings (if any), but they should mostly concentrate on the school itself: academic programs, student life, and general atmosphere. On CC, I see students ask questions such as, "Which should I choose? Yale, Princeton, or Amherst?" Those three schools couldn't be more different. Yeah, in the rankings, they are close, but that's one of the few things they share.</p>
<p>Getting back to the ranking done by the CCAP: they seem to be trying to find a way to measure the actual education the students receive. Are smart students made more ambitious and capable, or are they burned out or allowed to coast? Given a highly intelligent student body, which schools really challenge? I don't really know how one can measure that on a large scale, which is probably why this group went to ratemyprofessors and Who's Who. (Someone in a different forum -- I can't remember which one -- linked to a Chronicle of Higher Education article that said there was a high correlation between RMP.com ratings and actual student evaluations, and that it was a more valid assessment than people had previously believed.)</p>
<p>Agree with looking beyond rankings. What puzzles me are all the posts regarding what is the "best" for any of dozens of measures. I don't even know what best means for any one measure.</p>
<p>And worse is when you see posters ask for the "best" graduate school institutions. Not departments, but overall. Yikes. Then on top of that are those who argue that going to the "best" really matters. Of course these folks usually inhabit Ivy land, and a number are in Cambridge, MA, for whatever that means.</p>
<p>I've seen students who have said, "You gotta go to School X because it's so much better than School Y" based on the USNWR rankings -- and these two schools might be only five slots apart. Or one. Sometimes it seems on CC that the only valid reason for going to a lower USNWR ranked school is finances.</p>
<p>It seems to me that if you took the type of students who are accepted to the above-mentioned schools and put them anywhere, you would find an exceptionally high rate of achievement from them, especially if you are dropping all of the low performers from your statistical pool, which these schools do. The question becomes: which is more responsible for the high rankings -- this very, very select group of high-performing students who just happen to go to School X, or is School X responsible for the success of its attendees? I would think that if you went to any number of colleges throughout the country and looked at just the very highest performers, dropping the data from all of the other students, you would come up with equally impressive stats.</p>
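<p>A crude way to see that selection effect in numbers (all of them made up):<br>
[code]
# Skim the top of the same national talent pool and the skimmed group
# looks stellar before any teaching has happened. Numbers are made up.
import numpy as np

rng = np.random.default_rng(7)
pool = rng.normal(100, 15, 1_000_000)  # "talent" across a national cohort

cutoff = np.percentile(pool, 99)       # what hyper-selective admissions do
elite = pool[pool >= cutoff]

print(pool.mean())   # ~100: everyone
print(elite.mean())  # ~140: the admitted group, on day one of freshman year
[/code]
</p>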
<p>An intelligent, motivated individual can succeed with or without a top-ranked school; however, not all educations are created equal. The question is not "Can a top student succeed at a lesser university/college?" but "Which universities/colleges offer the best education for a top student?"</p>
<p>Interesting ranking, although I'm pretty suspicious of their use of ratemyprofessors.com. I don't know how it's used at the other schools on that list, but at Chicago almost nobody uses ratemyprofessors because there's an internal rating website that's linked to your actual course schedule and allows for a longer and more detailed response. From what I've seen, the few reviews of Chicago professors on RMP have actually been more negative (and often sound like a grudge rather than a fair review) than the ones on the evaluations website. In any case, it's hard to trust a site that doesn't verify that you've actually taken classes from the professor in question or screen reviews in any meaningful way (as far as I know).</p>
<p>Ratings in general are pretty stupid anyway, but I can't help but feel a little undeserved ego boost from seeing Chicago edge into the top five.</p>