<p>I saw this as a thread on the Whitman forum. Forbes has come out with its own system for ranking colleges. While no system is perfect, at least this one tries to base its rankings on what students do after graduation.</p>
<p>
[quote]
This time of year, as they make the momentous decision of where to go to college, high school seniors are turning to popular rankings compiled by magazines like U.S. News & World Report. There are competing scorecards from the Princeton Review and Kiplinger's, but U.S. News' product is way out in front in visibility; in addition to its usual circulation of 2 million, it sells 9,000 newsstand copies and some 20,000 of its college guide book.</p>
<p>U.S. News evaluates educational quality by looking inside colleges at measures like faculty-student ratios, admissions selectivity, financial resources and alumni giving.</p>
<p>I think the U.S. News rankings ought to get a D. They're roughly equivalent to evaluating a chef based on the ingredients he or she uses. At the Center for College Affordability & Productivity, a two-year-old research organization in Washington, D.C. with a free-market bent, we evaluate colleges on results. Do students like their courses? How successful are they once they graduate? In short, we review the meal.
[/quote]
</p>
<p>Forbes cannot be serious. It uses Ratemyprofessors.com as the basis for determining student satisfaction with professors? Could Forbes possibly have found a less scientifically sound sample anywhere?</p>
<p>All existing college rankings publications are flawed in one way or another. While I agree with Forbes' premise of evaluating the meal instead of the ingredients, using "ratemyprofessors.com" is definitely not the right way to go about ranking colleges.</p>
<p>University of Missouri 53rd and Oregon 54th? Southern Methodist 13th?</p>
<p>It looks like selectivity plays a bizarre role... large schools with 80%+ acceptance rates are ranked above other large schools with 20% acceptance rates.</p>
<p>
[quote]
The top CCAP schools rank near the top of the U.S. News list, as the accompanying table shows. But just below the top there are some surprises. Duke, MIT and the University of Pennsylvania make the top 10 list at U.S. News but not at CCAP. Duke students don't rate their professors high enough. At the University of Pennsylvania not enough grads made it into Who's Who. Brown and Northwestern, both ranked 14 by U.S. News, and Dartmouth College, ranked 11 by U.S. News, all make it onto our top 10. The University of Alabama, which got great reviews from students, came in at number 7 on our national public university ranking; it's at position 42 on U.S. News' list.
[/quote]
</p>
<p>Maybe Blue Devils are just a bunch who always think their education can be improved... I mean, we are the entrepreneurs of the US. :D
That's at least what the admissions booklet told me, haha!</p>
<blockquote>
<p>Forbes gives USNWR list a D</p>
</blockquote>
<p>And I give Forbes an F. </p>
<p>I'm no big fan of USNWR, but Forbes' method of ranking has achieved a new low. Using "Who's Who"? Come on! You could get a more accurate college ranking just by starting a thread here on CC, letting it run for a week, and then taking an average or consensus of what all the high school kids posted.</p>
<p>Most of these lists are plainly crap! Why? They don't take the quality of individual programs into account, and they leave out specialty schools. For example, Harvard, Yale, Princeton and Stanford might all be top schools unless you want to major in subjects like art, music, design, etc. In addition, some schools might be rated weak overall but have some amazing programs.
University of Cincinnati comes to mind.</p>
<p>They are generally considered a tier 3 school. Yet they have several nationally ranked programs in music, musical theater, design, architecture and criminology. </p>
<p>In addition, a lot of the factors that go into the rankings are, in my opinion, irrelevant, such as alumni giving and freshman retention. For example, if a school is very hard and holds students to high standards, it might have lower freshman retention than one that doesn't have the same standards or that indulges in a bit of grade inflation.</p>
<p>Some specialty schools, such as Harvey Mudd, Pomona, etc., are as well regarded as Harvard. Thus, you really, really need to check out the top programs you would be interested in, besides looking at these overall rankings.</p>
<p>No one ranking survey or methodology can be perfect and free of flaws. Yes, most of the results are based on the ingredients rather than on what comes out as the meal, but rating by satisfaction level is quite subjective and thus much harder to compare on the same yardstick.</p>
<p>How do you compare, say, a cheeseburger and a delicate steak when both earn the same top satisfaction score of 5?</p>
<p>Nevertheless, I feel that adding somewhat subjective satisfaction scores, at maybe a 10% weighting in the criteria, would be a better component than alumni giving, which naturally favors elites with big endowments.</p>
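<p>To make that weighting idea concrete, here is a minimal sketch of a weighted composite score. The component names, the numbers, and every weight except the roughly 10% satisfaction share suggested above are hypothetical, not any publication's actual formula:</p>
[code]
# Minimal sketch: a weighted composite ranking score.
# All components and weights below are hypothetical; satisfaction gets
# the roughly 10% share suggested in the post above.
WEIGHTS = {
    "graduation_rate": 0.40,
    "post_grad_outcomes": 0.30,
    "selectivity": 0.20,
    "student_satisfaction": 0.10,  # the subjective piece, capped at ~10%
}

def composite_score(school):
    """Combine component scores (each already normalized to 0-100)."""
    return sum(WEIGHTS[key] * school[key] for key in WEIGHTS)

# Made-up numbers for two fictional schools.
schools = {
    "Old Elite U":    {"graduation_rate": 96, "post_grad_outcomes": 90,
                       "selectivity": 92, "student_satisfaction": 70},
    "Rising State U": {"graduation_rate": 85, "post_grad_outcomes": 75,
                       "selectivity": 60, "student_satisfaction": 95},
}

for name in sorted(schools, key=lambda n: composite_score(schools[n]), reverse=True):
    print(f"{name}: {composite_score(schools[name]):.1f}")
[/code]
<p>Swapping a 10% satisfaction component in for alumni giving changes the inputs, not the basic arithmetic; the real argument is over which ingredients belong in the formula at all.</p>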
<p>^^Exactly. A great meal almost always calls for a great cook AND superb ingredients. </p>
<p>There is no "ideal list". Unlike the total wealth wich can be expressed in $$$, college "productivity" can not be measured using a simple numerical parameter. I went over the website of this Center for College Affordability & Productivity, but could not find any information even remotely hinting at how they managed to produce their ranking formula based on this rather odd mix of criteria.</p>
<p>That's actually a pretty funny ranking. USC is down in the 50s, and I'll tell you why: we don't use Rate My Professors. We have our own teacher-ranking system, run by the university's student government, where you can post course and professor reviews. Only people who are really angry at a professor for some reason and want to spam their name will use the Rate My Professors site. Hmm...</p>
<p>So the consensus is that ratemyprofessors.com is unreliable. What about the other two measures mentioned?</p>
<p>
[quote]
Our measures begin with student evaluations posted on Ratemyprofessors.com, a nine-year-old site with 6.8 million student-generated evaluations. We look at college graduation rates (as does U.S. News). We also calculate the percent of students winning awards like Rhodes Scholarships and undergraduate Fulbright travel grants. For vocational success we turn to Who's Who in America. Though imperfect, it is the only comprehensive listing of professional achievement that includes undergraduate affiliations. (Our complete listing of more than 200 schools can be viewed at Forbes.com.)
[/quote]
</p>
<p>Is using "Who's Who" worse than, say, number of people working in Goldman Sachs, which certain people seem to like using <em>over and over</em> on CC? I don't know much about WW list but I thought it covers a much wider array of sectors.</p>
<p>Actually, despite the sketchy methodology of ratemyprofessors.com, some scholars have found that Rate My Professors ratings are highly correlated with far more respectable institutional assessment systems, as an article in last week's Inside Higher Ed reported:</p>
<p>
[quote]
Validation for RateMyProfessors.com?</p>
<p>You’ve heard the reasons why professors don’t trust RateMyProfessors.com, the Web site to which students flock. Students who don’t do the work have equal say with those who do. The best way to get good ratings is to be relatively easy on grades, good looking or both, and so forth. </p>
<p>But what if the much derided Web site’s rankings have a high correlation with markers that are more widely accepted as measures of faculty performance? Last year, a scholarly study found a high correlation between RateMyProfessors.com and a university’s own system of student evaluations. Now, a new study is finding a high correlation between RateMyProfessors and a student evaluation system used nationally.</p>
<p>A new study is about to appear in the journal Assessment & Evaluation in Higher Education and it will argue that there are similarities in the rankings in RateMyProfessors.com and IDEA, a student evaluation system used at about 275 colleges nationally and run by a nonprofit group affiliated with Kansas State University.
[/quote]
</p>
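<p>For what it's worth, the comparison described above boils down to a simple correlation between two sets of instructor ratings. Here's a minimal sketch with made-up numbers; it is not the study's actual data or code:</p>
[code]
# Minimal sketch of the kind of check described above: correlating
# RateMyProfessors-style averages with an institutional evaluation
# system's averages for the same instructors. All numbers are made up.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-instructor averages on a common 1-5 scale.
rmp_scores  = [4.6, 3.2, 4.1, 2.8, 3.9, 4.4]
idea_scores = [4.4, 3.5, 4.0, 3.0, 3.7, 4.5]

print(f"correlation: {pearson(rmp_scores, idea_scores):.2f}")
[/code]
<p>A high coefficient would support the article's point that the much-derided site tracks the "respectable" instruments reasonably well, even if neither is a perfect measure of teaching.</p>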
<p>One could have thought that nothing would be more moronic and less scientific than the Princeton Review surveys. Then came Washington Monthly with its Mother Teresa-inspired rankings... but this one takes the prize.</p>
<p>SMU in Dallas among the best 15 schools in the nation? What were those rankers smoking? Waco's Baylor ahead of UCLA? And it's better not to mention the LAC list at all!</p>
<p>When do the rankings of the National Enquirer and Jerry Springer come out?</p>
<p>Now that I've had a longer look at the rankings, I think part of the problem is that the Forbes method tends to overrate old colleges in long-term decline and underrate new colleges that are on a fast rise.</p>
<p>I don't think that the Forbes ranking is too far out of line. The top 10 schools are legitimate. The use of Who's Who is not perfect, but it is not without merit either. For example, Who's Who in the World included 60,000 people. Though it's something of a directory, a lot of the listees are high-achieving people. If a school produces a disproportionate number of graduates in Who's Who, that says something about the school.</p>
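<p>"Disproportionate" is the key word there: raw Who's Who counts favor big schools, so a fair comparison would normalize by alumni population. A quick sketch with entirely made-up numbers:</p>
[code]
# Made-up counts: normalize Who's Who listings per 1,000 living alumni,
# since raw totals favor schools that are simply larger.
schools = {
    "Big Public U":    {"whos_who_listings": 900, "living_alumni": 350_000},
    "Small Private C": {"whos_who_listings": 400, "living_alumni": 60_000},
}

for name, d in schools.items():
    per_1000 = 1000 * d["whos_who_listings"] / d["living_alumni"]
    print(f"{name}: {per_1000:.1f} listings per 1,000 alumni")
[/code]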
<p>The problem with the Forbes method lies in the other part: Ratemyprofessors.com. It is conceivable that a less well-known school like SMU can produce ratings similar to Harvard's, just as China produces a happiness index similar to that of the US.</p>
<p>I am pretty sure that Duke won the Who's Who while SMU won Ratemyprofessors.com.</p>