<p>collegehelp,
Which data? There may be correlations, but I don't believe the changes in the data over time are reflected in the PA scores. The only Top 50 college whose PA has materially changed for the better over the past 10 years is USC, which has gone from 3.7 to 4.0 — still a far cry from Cornell's 4.6 despite the relatively narrow differences in the USNWR data.</p>
<p>UCBChemE,</p>
<p>I don't see how anyone can look at Hawkette's original post and not come away wondering about the non-responsiveness of the PA score to improvements in the rest of the USNews "package". At least, anyone interested in the reputation of schools that are non-Ivy or their close relatives.</p>
<p>PA is a lagging indicator. It's hard to dispel long-held notions about a school's reputation.</p>
<p>^^^If it lags by many years, shouldn't it just be dispensed with?</p>
<p>^ No...it still has merit.</p>
<p>Just so I can understand the expectations/thinking of the PA defenders, what kind of changes would you need to see in ANY institution in order to see a change in the PA (good or bad)?</p>
<p>Well, obviously you need to change the perceptions of the people who take the survey. I doubt they are looking at incoming freshman SAT scores.</p>
<p>The Peer Assessment is subjective, but it is possible to determine how closely those subjective perceptions track the hard data listed in the US News Best Colleges. Statistically, the PA correlates at about .9 with a combination of hard-data factors, which is a very high correlation. This is based on 2008 US News data; had I added more variables to the equation, I probably could have found an even stronger tie between PA and current data.</p>
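A correlation like the one described above is easy to compute. The sketch below uses made-up numbers (the actual 2008 hard-data composite and PA values are not reproduced here) purely to illustrate the calculation:

```python
import math

# Hypothetical composite "hard data" scores and PA scores for seven schools.
# These values are illustrative only -- not actual US News figures.
hard_data = [95, 88, 82, 76, 70, 65, 60]
pa_scores = [4.9, 4.6, 4.3, 4.1, 3.8, 3.6, 3.3]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(hard_data, pa_scores), 3))
```

With any two roughly linearly related series like these, the coefficient lands close to 1; a .9 correlation on real data would indeed indicate that PA tracks the hard numbers closely.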
<p>Perhaps it is hard to see that almost all schools have improved. "A rising tide lifts all boats," as they say. So RELATIVE position has not changed much despite individual advances.</p>
<p>What is the exact nature of the questionnaire used for the peer assessment survey? Probably it is only sensitive to ordinal comparisons among colleges.</p>
<p>Has the actual survey used for the PA ever been posted or published? Exactly what are the characteristics being "measured"? (The quotation marks are my way of indicating that I don't believe sticking a number on something is the same as measurement.)</p>
<p>Hawkette, now this is my "subjective" opinion.</p>
<p>When I look at PA, I see schools that are either thought of as the top in the fields they teach or are strong across the board in more areas than almost any other school. Their professors are regarded as being at the top of their profession. Also, I think schools get credit for how many top students they have and aren't penalized for their lesser students.</p>
<p>As an example, if a person goes to Berkeley, almost every program is strong, and there are more strong programs than almost any other school. Despite not having the highest average SAT scores, Berkeley may have more top students than any other school.</p>
<p>Plus, the PA number is not a fact. The PA number represents opinions. You are free to disagree with it.</p>
<p>USNWR editor on the PA survey:</p>
<p>
[quote]
On 22 June 2007, U.S.News & World Report editor Robert Morse issued a response in which he argued, "in terms of the peer assessment survey, we at U.S.News firmly believe the survey has significant value because it allows us to measure the "intangibles" of a college that we can't measure through statistical data. Plus, the reputation of a school can help get that all-important first job and plays a key part in which grad school someone will be able to get into. The peer survey is by nature subjective, but the technique of asking industry leaders to rate their competitors is a commonly accepted practice. The results from the peer survey also can act to level the playing field between private and public colleges."
[/quote]
</p>
<p>USNWR Methodology Explanation: </p>
<p>
[quote]
Peer assessment (weighting:25 percent). The U.S. News ranking formula gives greatest weight to the opinions of those in a position to judge a school’s undergraduate academic excellence. The peer assessment survey allows the top academics we consult—presidents, provosts, and deans of admissions—to account for intangibles such as faculty dedication to teaching.
Each individual is asked to rate peer schools’ academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who don’t know enough about a school to evaluate it fairly are asked to mark “don’t know.” Synovate, an opinion-research firm based near Chicago, collected the data; of the 4,269 people who were sent questionnaires, 51 percent responded.
[/quote]
</p>
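The figures in that methodology note imply, with simple arithmetic, roughly how many academics actually completed the survey (assuming the 51 percent response rate is rounded):

```python
surveys_sent = 4269      # questionnaires sent, per the USNWR methodology
response_rate = 0.51     # reported response rate

respondents = round(surveys_sent * response_rate)
print(respondents)  # about 2,177 completed surveys
```

So the PA column in the rankings rests on the opinions of roughly two thousand presidents, provosts, and admissions deans.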
<p>UCBChemEGrad,
I have read that many times. What do you think is a reasonable amount of time in which you would expect to see material changes for ANY colleges whose competitive position either improved or declined? </p>
<p>Now apply your answer to what has been documented here for a school like USC or one of the others with strong improvement. How much would you expect to see their scores improve, both absolutely and relatively?</p>
<p>Hawkette, from your kid's high school, are the students that go to Cornell and USC comparable academically?</p>
<p>How in the world can the provost of school A know anything about "faculty dedication to teaching" at schools B, C, D, E... Z? It's so ridiculous.</p>
<p>It is hard to believe that the 51% of those surveyed who responded actually think they could answer questions like that in an objective manner.</p>
<p>I have come to the conclusion that the PA survey is nothing more than a popularity contest. All those schools that wish to rise in the rankings should immediately start a massive PR and advertising campaign directed at those who are likely to get these surveys. Maybe include some of those recordings that include subliminal messages. Have people show up at conferences with assignments to button-hole eminent personages and drop hints about how dedicated the new faculty members are to undergraduate instruction.</p>
<p>Or maybe just go coastal.</p>
<p>PA is my least favorite factor in the USN&WR methodology.
But in the case of Washington University, it adds some sanity.
I have never been able to find a published report of WashU's Common Data Set and am suspicious of its reported SAT ranges. The school is savvy enough to have convinced USN&WR that it spends more money on its students than Princeton, Harvard, or Stanford. That it is more selective than Stanford or Caltech. And that it has a better faculty than Yale, Stanford, or MIT.
Do you think that if these things were even remotely true, they would have escaped the notice of its peers?</p>
<p>I'd also be interested in the response rates among the various types of colleges. Are the Ivies responding at near-100-percent levels? Also, the pedigree of the respondents: does a Tulane professor respondent have a doctorate from Princeton? Academia is one of the most brand-conscious industries out there, and those being surveyed are hugely in awe of the perceived elite. I would like to see a survey of consumers of the product. That would mean Fortune 100 companies, I guess, or perhaps major journals (Science, Nature, JAMA, NEJM). I'm not sure how to do this, but the current approach is terribly insular. I hate to suggest this because of the widespread hatred of the SAT, but maybe what's mostly needed is some sort of graduation exit exam that attempts to measure what the students have actually learned. Very gnarly to implement, I'm sure.</p>
<p>danas,</p>
<p>I think there is an incredible amount of judgment based on nothing more than location in the Midwest or the South rather than on one of the coasts.</p>
<p>I say that as a native Midwesterner who spent most of her adult life in New England and has a child attending a school in the South.</p>
<p>Go coastal. Improve your reputation. It really is that simple.</p>