<p>Can someone post it, please?</p>
<ol>
<li>Harvard University 4.9</li>
<li>Massachusetts Institute of Technology 4.9</li>
<li>Princeton University 4.9</li>
<li>Stanford University 4.9</li>
<li>Yale University 4.9</li>
<li>California Institute of Technology 4.7</li>
<li>University of California-Berkeley 4.7</li>
<li>University of Chicago 4.7</li>
<li>Columbia University 4.6</li>
<li>Cornell University 4.6</li>
<li>Johns Hopkins University 4.6</li>
<li>Duke University 4.5</li>
<li>University of Michigan-Ann Arbor 4.5</li>
<li>University of Pennsylvania 4.5</li>
<li>Brown University 4.4</li>
<li>Dartmouth College 4.4</li>
<li>Northwestern University 4.4</li>
<li>University of California-Los Angeles 4.3</li>
<li>University of Virginia 4.3</li>
<li>Carnegie Mellon University 4.2</li>
<li>University of North Carolina-Chapel Hill 4.2</li>
<li>University of Wisconsin-Madison 4.2</li>
<li>Georgetown University 4.1</li>
<li>Rice University 4.1</li>
<li>University of Texas-Austin 4.1</li>
<li>Vanderbilt University 4.1</li>
<li>Washington University-St. Louis 4.1</li>
</ol>
<p>Hey Alexandre, can you go as far down as Emory? And would you be able to tell me what Miami's (FL) peer assessment score is? Thanks in advance.</p>
<p>Emory has a peer assessment score of 4.0 and Miami (FL) a peer assessment score of 3.2.</p>
<p>How does this peer assessment score work?
How is a peer defined, and how do they come up with the number?</p>
<p>The peer assessment score is the average rating assigned to a university by the deans of admissions and presidents of similar universities. So you won't have the dean of admissions or president of a LAC (like Davidson or Haverford) or a regional university (like Santa Clara or Trinity) rating a national research university (like Northwestern or UCLA). There are roughly 200 LACs, 250 national universities, and between 120 and 160 regional universities, depending on the region.</p>
<p>From the USNWR:</p>
<p>"Peer assessment (weighting: 25 percent). The U.S. News ranking formula gives greatest weight to the opinions of those in a position to judge a school's undergraduate academic excellence. The peer assessment survey allows the top academics we consult—presidents, provosts, and deans of admissions—to account for intangibles such as faculty dedication to teaching. Each individual is asked to rate peer schools' academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who don't know enough about a school to evaluate it fairly are asked to mark "don't know." Synovate, an opinion-research firm based near Chicago, collected the data; of the 4,089 people who were sent questionnaires, 58 percent responded."</p>
<p>Peer assessment is really more reflective of a school's grad prestige than its undergrad. Rice, Georgetown, Tufts, William & Mary, and Wake Forest are all underrated compared to some of the larger state U's that focus mainly on grad students.</p>
<p>It assumes that the deans of 100 colleges are familiar with every other undergrad college... naturally, they aren't that informed.</p>
<p>Note that the Peer Assessment score is the only non-empirical value on the US News survey.</p>
<p>Peer assessment is supposed to be a reliable yardstick. Of course, there is theory and there is practice.</p>
<p>
[quote]
Colin Diver of Reed did not hesitate to share his opinion on the PA:</p>
<p>"I'm asked to rank some 220 liberal arts schools nationwide into five tiers of quality. Contemplating the latter, I wonder how any human being could possess, in the words of the cover letter, "the broad experience and expertise needed to assess the academic quality" of more than a tiny handful of these institutions. Of course, I could check off "don't know" next to any institution, but if I did so honestly, I would end up ranking only the few schools with which Reed directly competes or about which I happen to know from personal experience. Most of what I may think I know about the others is based on badly outdated information, fragmentary impressions, or the relative place of a school in the rankings-validated and rankings-influenced pecking order." Source: <a href="http://www.theatlantic.com/doc/200511/shunning-college-rankings">http://www.theatlantic.com/doc/200511/shunning-college-rankings</a>
[/quote]
</p>
<p>Add the self-reported dishonesty of colleges that give negative scores to their competitors and foes but reward their friends and ... sisters with abandon, and you end up with a completely unreliable and biased testament to cronyism and silliness. </p>
<p>Had USNews an ounce of integrity, they would separate the Peer Assessment into an individual listing. This would allow the fans (read: fans of schools that have a MUCH better PA than their numerical ranking would support) to have a list similar to what Alexandre posted, and the rest of the world could have a list or ranking based on objective criteria.</p>
<p>Will that happen? Hell no, because it would mean another drop in survey completion, even lower than the sinking 58% of today. Eliminating one avenue for manipulation might not be a very good idea for USNews!</p>
<p>There is only a 58% response to Peer Assessment?</p>
<p>
[quote]
There is only a 58% response to Peer Assessment?
[/quote]
</p>
<p>Yes, which is why I've been jumping up and down like a madwoman noting that the decision of Sarah Lawrence et al to no longer participate may not actually be that big of a deal.</p>
<p>I must be on the ignore list of about half this board, because I've been jabbering on about the response rate with little effect. LOL</p>
<p>I'll also add this--it would appear that a lot of people don't read anything about the USNews methodology before diving in. I'm glad people ask questions here so they can find out more. But it raises questions (for me) about the accuracy of people who claim that they use the rankings very judiciously, and that they're certain most other people do the same. How many people really do read the fine print? Maybe fewer than we hope.</p>
<p>Some schools have a much better PA than their ranking would suggest because they have an outstanding faculty by most measurables. If it is important for students to score higher on the SAT (which studies indicate has little to do with actual college performance), or to have high alumni giving rates, it should be equally important to have a faculty respected by its peers. This can be measured in several ways--research funding and publishing, membership in scholarly academies, winning competitive awards/grants, etc.<br>
When you look at the top major universities using these criteria, you get basically the same ranking as with the PA. So the collective "ignorance" and even dishonesty still does a pretty good job of putting the schools in rank order. Now for LACs it gets tougher, as there are fewer common measurables out there, but I believe the top-ranked LACs also have the best faculty using the indicators for research schools.</p>
<p>The thing is, most of the factors included in the ranking aren't directly related to the quality of education at the respective colleges anyway. At least peer assessment, while subjective, is directly related to the purpose of the rankings. I'd argue that student selectivity is a valid predictor of academic quality (although I'd quibble with "acceptance rate"). But "Alumni giving?" "Class sizes?" Just how directly related to academic quality are those things? And are things like graduation rate and retention even remotely valid distinctions between top 20, or even top 50 schools? "Financial resources" includes money spent on research, which may or may not be relevant to undergraduate education, etc. A lot of the factors included are present based on the assumption that they will - indirectly - affect overall academic quality. But peer assessment is a direct opinion of the success of the school in actually achieving academic quality. Imperfect, but valuable in its own right.</p>
<p>Peer assessment is one of the main reasons why schools are now resisting the USNews surveys. To have it carry such heavy weight in their rankings is ridiculous.</p>
<p>Hoedown:</p>
<p>Haha, I can't believe I missed all your comments. Everything you've said makes much more sense now.</p>
<p>lol, OK, so that front-page news article means that the percentage of respondents will go from 58% to... like 56%. Such a non-issue IMO.</p>
<p>"But "Alumni giving?" "Class sizes?" Just how directly related to academic quality are those things?" And are things like graduation rate and retention even remotely valid distinctions between top 20, or even top 50 schools? "Financial resources" includes money spent on research, which may or may not be relevant to undergraduate education, etc. A lot of the factors included are present based on the assumption that they will - indirectly - affect overall academic quality."</p>
<p>:)</p>
<p>"But peer assessment is a direct opinion of the success of the school in actually achieving academic quality. Imperfect, but valuable in its own right."</p>
<p>-Then I ask, who are the people NOT giving places like Harvard and Stanford scores of 5?????? What do places like that have to do to show their 'academic quality'?</p>
<p>And as far as the percent of schools that report back their survey information... I gave the number months ago, but people only see what they want..... :rolleyes:</p>
<p>I think peer assessment is very often messy but more often than not reveals some hard truths. (Kind of like those reality shows where the judges ask the contestants what they think of one another... very often they see things a little more clearly, if not at least differently.)</p>
<p>If a school isn't known to a peer, isn't that a bit telling? If a school is well known but not rated highly by its peers, shouldn't that be of value, even if taken with a grain of salt?</p>
<p>Perhaps the opinion holds value, but if it should be taken with a grain of salt, it shouldn't account for 25% of the overall score.</p>
<p>"If a school isn't known to a peer isn't that a bit telling?"</p>
<p>-Yes, but only if the schools actually are peers. How exactly is 'peer' defined? Are administrators really supposed to know about all the schools on the lists? If not, then all you have is a group of friendly schools rating each other, and odds are to me that if you consider a school your peer, you'd tend to inflate its rating.</p>
<p>"If a school is well known but not rated highly by it's peers shouldn't that be of value, even if taken with a grain of salt?"</p>
<p>-So you're saying that there may be some value in the fact that there are some people out there who believe that MIT, Harvard, and Stanford didn't deserve scores of 5?</p>
<p>"it shouldn't hold 25% of the overall score."</p>
<p>-I like the idea of having a separate list. Sort of like saying: "here are the numbers" (even if many of the numbers are unnecessary) and "here's the PA."</p>
<p>I actually agree with KK on this one. I always maintained that the PA and the statistics measure two very different things. The former measures reputation and pure academic strength; the latter answers questions more pertinent to individual preference.</p>