Rankings

<p>Why is US News and World Report so popular? What makes it any better than say, the Gourman Report? </p>

<p>What other rankings are there, and how do they differ from each other?</p>

<p>There’s Harvard, Yale, Princeton, and Stanford; MIT and Caltech. Then, the rest.</p>

<p>You need rankings to tell you that?</p>

<p>I mean, it’s the only ranking that pretty much supports conventional wisdom. HYPSM is always first no matter what, then the Ivies, then everyone else…</p>

<p>Response rates of 60% in 2007, 53% in 2008, and 43% in 2009 are each considered very high for a USNews mail-in survey. A high response rate is the key to legitimizing a survey’s results. I’m pretty sure the peer component of USNews is rivaled only by that of the NRC rankings (which come out once every 10-15 years and focus solely on doctoral/graduate education).</p>

<p>When a survey elicits responses from a large percentage of its target population, the findings are seen as more accurate. Low response rates, on the other hand, can damage the credibility of a survey’s results, because the sample is less likely to represent the overall target population.</p>
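<p>To see concretely why a low response rate can bias results, here is a minimal Python sketch of a hypothetical mail-in survey (the population split, response propensities, and all other numbers are invented for illustration; they are not US News data). When willingness to respond correlates with opinion, a shrinking response rate pulls the estimate further from the truth, even though plenty of replies still come back.</p>

```python
import random

random.seed(42)

# Hypothetical population of 10,000 survey targets: exactly half hold a
# favorable opinion (1) and half do not (0). Invented numbers throughout.
population = [1] * 5000 + [0] * 5000

def survey(population, p_respond_favorable, p_respond_unfavorable):
    """Simulate a mail-in survey where response propensity depends on opinion."""
    responses = []
    for opinion in population:
        p = p_respond_favorable if opinion else p_respond_unfavorable
        if random.random() < p:
            responses.append(opinion)
    rate = len(responses) / len(population)
    favorable_share = sum(responses) / len(responses)
    return rate, favorable_share

# High response rate with a mild opinion/response correlation: small bias.
rate, share = survey(population, 0.65, 0.55)
print(f"response rate {rate:.0%} -> estimated favorable share {share:.0%}")  # ~54%

# Low response rate with the same direction of correlation: large bias.
rate, share = survey(population, 0.15, 0.05)
print(f"response rate {rate:.0%} -> estimated favorable share {share:.0%}")  # ~75%
```

<p>The true favorable share is 50% in both runs; only the response behavior changes.</p>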

<p>For what it’s worth, USNews probably has the most comprehensive collection of academic opinion on undergraduate education, far exceeding that of other rankings, which mostly rely on hard objective data points or on “RateMyProfessor.com”/“Who’s Who”-style sources like the Forbes ranking, which totally sucks.</p>

<p>The Gourman Report is CRAP because it’s published by Princeton Review, and Princeton Review reveals nothing about the methodology behind it. It is statistically flawed: programs differ by 0.1 point and there are no ties (are you kidding me, NO TIES?). Jack Gourman has failed to provide any explanation of his rationale and methodology.</p>

<p>Some are absolutely worthless, some are not. It just depends. You can look on Wikipedia for various rankings. Basically, you can find a ranking that tells you what you want to know. At least the USNews one uses tons of data and is open about it.</p>

<p><a href="http://en.wikipedia.org/wiki/Gourman_Report">Gourman Report - Wikipedia, the free encyclopedia</a></p>

<p>The Gourman Report (just like Mini’s undergraduate IR rankings) even reports ranks for nonexistent programs. Neither Columbia nor UC Berkeley has anything remotely related to international affairs/studies/relations, yet somehow both are ranked in Mini’s undergraduate IR ranking. It makes no sense whatsoever.</p>

<p>I only trust the US news.</p>

<p>Basically, rankings vary wildly because they use such arbitrary criteria. In addition, LACs and universities without professional schools (e.g. Princeton) suffer in many of them. If you are a math/science person, the rankings might be more useful, because most weight their criteria (arbitrarily) very heavily toward the natural sciences, related awards, etc.</p>

<p>The Gourman Report was last issued 13 years ago.</p>


<p>“Common sense” is many times flawed. Based on “common sense” for example, people used to believe that the sun revolved around the earth.</p>

<p>Anyway, I find the methodology used by the <a href="http://www.topuniversities.com/worlduniversityrankings/methodology/simple_overview/">London Times World Rankings (THES)</a> better than USN&WR’s, as it also includes employer review, student/faculty ratios and an objective measure of research excellence (citations per faculty). Furthermore, THES also has the advantage of using an international database (perceptions overseas can be very different, as some domestically “highly prestigious” institutions may be actually virtually unknown internationally). </p>

<p>US News, on the other hand, is based solely on a somewhat narrower peer review and, in the case of the general rankings, on factors such as selectivity, completion rates and yields, which, in my opinion, are not the best indicators of quality.</p>

<p>The problem with rankings is that people take them too literally. There are thousands of universities in the US. Very little separates universities within 15-20 spots of each other. </p>

<p>Amherst College
Bowdoin College
Brown University
Bryn Mawr College
California Institute of Technology
Carleton College
Carnegie Mellon University
Claremont McKenna College
Colgate University
College of William & Mary
Columbia University
Cornell University
Dartmouth College
Davidson College
Duke University
Emory University
Georgetown University
Grinnell College
Harvard University
Harvey Mudd College
Haverford College
Johns Hopkins University
Macalester College
Massachusetts Institute of Technology
Middlebury College
New York University
Northwestern University
Oberlin College
Pomona College
Princeton University
Rice University
Smith College
Stanford University
Swarthmore College
Tufts University
United States Military Academy-West Point
United States Naval Academy-Annapolis
University of California-Berkeley
University of California-Los Angeles
University of Chicago
University of Illinois-Urbana Champaign
University of Michigan-Ann Arbor
University of North Carolina-Chapel Hill
University of Notre Dame
University of Pennsylvania
University of Southern California
University of Texas-Austin
University of Virginia
University of Wisconsin-Madison
Vanderbilt University
Vassar College
Washington and Lee University
Washington University-St Louis
Wellesley College
Wesleyan University
Williams College
Yale University</p>

<p>If one were to rank those 50+ colleges and universities listed above (and I am sure I left out a few others that belong) based on any number of weighted criteria, the output could mislead one into believing that there is a serious gap between #15 and #45. There isn’t. In fact, there is probably no gap whatsoever.</p>
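<p>Alexandre’s point is easy to demonstrate with a toy example. Below is a minimal Python sketch using four hypothetical schools and made-up scores on three criteria (none of these numbers come from USNWR or any real dataset): with schools this tightly bunched, simply shifting the weights reverses the “ranking” even though nothing about any school changed.</p>

```python
# Hypothetical, tightly bunched scores on three criteria (0-100 scale).
# All numbers are invented for illustration; none are real USNWR data.
schools = {
    "School A": {"students": 95, "classes": 89, "resources": 90},
    "School B": {"students": 90, "classes": 95, "resources": 91},
    "School C": {"students": 89, "classes": 91, "resources": 95},
    "School D": {"students": 92, "classes": 91, "resources": 93},
}

def rank(schools, weights):
    """Order schools by the weighted sum of their criterion scores."""
    total = lambda name: sum(weights[c] * v for c, v in schools[name].items())
    return sorted(schools, key=total, reverse=True)

# Two equally defensible weightings, two different orderings.
print(rank(schools, {"students": 0.5, "classes": 0.3, "resources": 0.2}))
# ['School A', 'School D', 'School B', 'School C']
print(rank(schools, {"students": 0.2, "classes": 0.3, "resources": 0.5}))
# ['School C', 'School D', 'School B', 'School A']
```

<p>School A goes from first to last without a single score changing; only the (arbitrary) weights moved.</p>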

<p>While USNWR may be the best known of the rankings, I emphatically believe that one should focus less on the absolute rankings that their methodology produces and more on the individual factors that they measure. Consider the datapoints that are important to you and see how the colleges truly compare to one another. Such an approach has the benefit of separating out the hype from the reality. </p>

<p>In looking at the datapoints, I submit that the most important measurements for evaluating an UNDERGRADUATE academic environment are the following:</p>

<ol>
<li> What is the quality of your student body peers? Usually the stronger the student quality, the more you benefit from the in-class discussions and the post-graduate networking.</li>
<li> What is the size of the classroom in which you learn? Usually smaller classes are favored over larger classes so as to benefit from peer interactions and more intimate interaction with the professors.</li>
<li> What is the quality of the teaching that is going on at the college? This is one of the most difficult factors to weigh, but the difficulty in judging it should not minimize its importance. Attending a college whose faculty has a renowned reputation within academia does NOT necessarily mean a good environment for undergraduates. You have to dig deeper on this, as there are real differences in institutional priorities, and these differences can have a huge impact on the undergraduate academic experience.</li>
<li> What are the institution’s financial resources, and does the college use them aggressively to deliver resources that benefit undergraduate students, e.g., lower student/faculty ratios, smaller classes, student advising, career counseling, student support services, etc.? Usually you would prefer a college with deep financial resources, including excellent financial aid (for both IS and OOS students) and the ability to fund all of the aforementioned, which benefit undergraduates.</li>
</ol>

<p>In the last year, I have created a number of threads that evaluated each data point of the USNWR rankings. The objective was to put into perspective how colleges compare on a wide variety of individual datapoints without resorting to rigid and simplistic rankings. Hopefully, you will find some of this of value in your college search & selection.</p>

<p><a href="http://talk.collegeconfidential.com/college-search-selection/614064-data-points-usnwr-2009-full-set-threads.html">http://talk.collegeconfidential.com/college-search-selection/614064-data-points-usnwr-2009-full-set-threads.html</a></p>

<p>“If one were to rank those 50+ colleges and universities listed above (and I am sure I left out a few others that belong) based on any number of weighted criteria, the output could mislead one into believing that there is a serious gap between #15 and #45. There isn’t. In fact, there is probably no gap whatsoever.” (Alexandre of Dubai)</p>

<p>BRAVO! And the same applies to the next 50, and so forth.</p>

<p>Yes, there is a blunt-instrument tier group. Yes, the quality of the students in a classroom can affect the learning processes that occur there, but that is not absolute. I know of anecdotal cases of kids in a required philosophy course who were not left-brained, so to speak, and struggled even though they were brilliant in other areas like accounting, math, or biology; that did not mean that the kids who did get it and engaged the professor had any less opportunity for learning. Indeed, they took off like rockets. But I digress. </p>

<p>Rankings are a muse. An interesting reflection point. And perhaps a point of personal pride. Schools are not one-size-fits-all, nor are they required to follow a narrow model of higher education set by those at the very top. The mission and experience of colleges can vary widely, and that is a good thing. </p>

<p>While one’s college experience becomes a part of who we are as individuals, hopefully imparting more than just raw knowledge: a value construct, a sense of community, a habit of giving back to others, and so on, it does not define us exclusively or label us with a branding iron for life as dumb, smart, brilliant, nerdy, or whatever. Brilliant students exist at almost every college. </p>

<p>The selection of which college you will attend, notwithstanding the quirky and narrow “selectivity ratings of each college”, is a personal decision based upon factors you alone (with your family) determine are most important to you and your long term objectives. </p>

<p>It’s what makes this such a great country of ours.</p>

<p>Phead128-
You are only repeating misconceptions about the Gourman Report that you have heard from others. Ties are possible in the Gourman Report and there is at least one that I have found. The Introduction to the Gourman Report contains considerable detail about his methodology. The Gourman Report rankings include programs under their subject headings that may go by different names at the universities. The Gourman Report rankings still have great validity and there is considerable agreement with other rankings and with the rankings posted by CCers. If the end result is valid, then the method is valid.</p>

<p>The only problem with Gourman is that he didn’t do a separate ranking for LACs. His rankings favor research universities.</p>

<p>I don’t know why some people are so quick to repeat misconceptions without looking into the validity of the rankings.</p>

<p>The Gourman Report is a good starting point for students looking for a research university.</p>

<p>The rankings at stateuniversity.com are interesting, in theory, because they claim to be based on “objective” criteria, not peer review (which is an important component of the USNWR rankings). The criteria they cite do seem reasonable to me.</p>

<p>The reality of their ranking results is problematic. Some of their individual ranks (for Yale and Penn, for example) wildly defy the conventional wisdom. I’ve observed some serious factual errors in the school reports. Total cost at Middlebury College is shown as $3250 (wow, how can I get that deal?).</p>

<p>I think peer review of graduate departments can be reliable. It is reasonable to expect a professor to be aware of who the best people are at other universities in his field. When it comes to peer review of entire undergraduate programs, I’d be a little more skeptical.</p>

<p>Gourman is a fraud. Unlike the Foreign Policy Magazine rankings, with their huge sample, very specific survey question, and wholly transparent methodology, Gourman refuses to reveal his methodology. We don’t know whether he even set foot on the campuses he claims to be ranking, who he spoke with, or whether he spoke to students. In addition, the so-called “research” upon which he based his opinions is almost 15 years old. It is especially amusing to see him rank non-existent departments; apparently, he didn’t even read the catalogs.</p>

<p><a href="http://www.siop.org/tip/backissues/tipjan02/07bedeian.aspx">Caveat Emptor: The Gourman Report (TIP, January 2002)</a></p>

<p>And if you believe Wayne State University is 30 places better than Williams and Swarthmore, well, more power to you.</p>

<p>mini-
I do not dispute the fact that Gourman overlooked LACs. I have posted the methodology elsewhere and am not going to post that long section again.</p>

<p>Did he rank a non-existent program? I have not found one.</p>

<p>Let’s select a specific major; I’ll post the rankings, and then we can discuss the merits.</p>


<p>I feel this is a major difficulty with a lot of these rankings. We all already “perceive” what colleges and universities are best and nearly every ranking system is designed to output results that closely match perception as a proof of their validity. You’ll always get a few anomalies, but the goal here is to produce a dependable list that outputs about what people expect. That’s why these rankings will always fail-- it’s nearly impossible to capture the full experience, impossible to rank schools for individuals who have very different preferences and needs, and whatever metric does come out with the mostly “right” answers will undoubtedly short-change some places and overrank others due to flaws in methodology that seem irreparable.</p>

<p>Rankings have “power” because they’re simple and people like simple. However, they’re also almost entirely useless because they are simple, and the undergraduate experience is quite complex.</p>

<p>That being said, I generally agree with Hawkette: look at the individual data points as a jumping-off point, then do your own research, which will yield far more valuable information.</p>

<p>“The THES ranking places great emphasis on the results of a survey sent out to more than 190,000 researchers who list what they think are the top 30 universities in their field of research. Fair enough, but its flaw is that it is entirely opinion-based and has a response rate below 1% - and this may contain significant bias, the researchers said.”</p>

<p><a href="http://www.guardian.co.uk/education/mortarboard/2007/nov/08/universityrankingsdontmeasu">University rankings don’t measure up | Education | guardian.co.uk</a></p>

<p>modestmelody-
Yes, but the validation process is reciprocal between perceptions and facts. Perceptions are (or should be) based on facts, and the accuracy or validity of “facts” is (or should be) questioned when they disagree with perceptions. I wouldn’t discount common sense.</p>

<p>Rankings tend to be most reliable and uncontroversial at the upper end.
That’s because the very best schools tend to be excellent according to many different criteria. Whether you rank by SAT scores, faculty salaries, number of books in the library, or the age of the bricks in the building, the top of your list is very likely to include Harvard, Yale, and Princeton.</p>

<p>However, most students (including many very good students) do not have a realistic hope of being accepted to the USNWR top 5 or top 10. You have to consult many information sources before you can be confident that you are (a) not overlooking good choices from lower down in the pack, and (b) not being led up a garden path toward choices that are inappropriate for your own specific needs.</p>

<p>You need to be aware, too, that many rankings express more precision (finer distinctions) than the underlying data support. Remember what your math teacher taught you about significant digits. There should be many more n-way ties than we see in most of these rankings.</p>
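<p>A small Python illustration of the significant-digits point, using invented overall scores (not actual USNWR numbers): differences in the third digit are almost certainly noise, and rounding to the two significant figures the data plausibly support collapses the strict order into the n-way tie it probably should have been.</p>

```python
from collections import defaultdict
from math import floor, log10

# Invented overall scores; the third digit is almost certainly noise.
scores = {"School A": 91.4, "School B": 91.2, "School C": 90.8,
          "School D": 90.6, "School E": 88.3}

def sigfig(x, n=2):
    """Round x to n significant figures."""
    return round(x, n - 1 - floor(log10(abs(x))))

# Published style: a strict order, one school per rank, no ties.
print(sorted(scores, key=scores.get, reverse=True))
# ['School A', 'School B', 'School C', 'School D', 'School E']

# Honest style: round to two significant figures, then group the ties.
groups = defaultdict(list)
for school, score in scores.items():
    groups[sigfig(score)].append(school)
for score, tied in sorted(groups.items(), reverse=True):
    print(score, tied)
# 91.0 ['School A', 'School B', 'School C', 'School D']  <- a four-way tie
# 88.0 ['School E']
```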