Is Peer Assessment in USNWR Rankings based on Undergrad or Grad Reputation?

<p>
[quote]
People who make the PA ratings are asked to base their ratings on the quality of undergraduate programs. The statistical analysis shows that 94% of the PA can be explained by the hard data listed in US News Best Colleges such as graduation rate, student-faculty ratio, SAT scores, acceptance rate, alumni giving rate, and so on. When I add ratings of graduate programs from the NRC database, the combination of US News hard data and NRC ratings of graduate programs adds about 6% to the explanatory power of the US News hard data alone.

[/quote]
If you find the U.S. News scores without PA, you essentially have the hard data alone. When you find the R-squared between this hard data and the PA scores, you don't get anything close to 94%.</p>
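To make the disagreement concrete, here is a minimal sketch (Python, with entirely made-up numbers) of how an R-squared between a hard-data composite and PA scores would be computed. The quoted "94%" claim corresponds to an R-squared of 0.94; the school composites and PA values below are hypothetical, not actual US News data.

```python
# Hypothetical illustration: R-squared between a "hard data" composite
# (US News score with PA removed) and PA ratings. All numbers are made up.

def r_squared(x, y):
    """R^2 of a simple linear regression of y on x (equals Pearson r squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

hard_data = [92.0, 88.5, 85.0, 83.5, 80.0, 78.0]  # hypothetical composites
pa_scores = [4.6, 4.5, 4.0, 4.1, 3.9, 3.6]        # hypothetical PA ratings

print(round(r_squared(hard_data, pa_scores), 3))
```

Whatever value this prints for real data is the number in dispute: the quote asserts roughly 0.94 for hard data alone, while the reply says the actual figure is nowhere close.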

<p>Igellar-
Check out this thread. It explains the model that yields a high R-squared.</p>

<p><a href="http://talk.collegeconfidential.com/college-search-selection/412606-how-calculate-universitie-s-peer-assessment-score.html?highlight=calculate+peer+assessment">http://talk.collegeconfidential.com/college-search-selection/412606-how-calculate-universitie-s-peer-assessment-score.html</a></p>

<p>Texas is rated higher than Tufts, Wake Forest, Notre Dame, Rice, William & Mary, and Georgetown for PA. That is a joke.</p>

<p>
[quote]
As is obvious by Hawkette's numbers, PA is more biased toward grad programs. Wake Forest, William & Mary, Rice, Tufts, Georgetown, Notre Dame, Vanderbilt, Emory, are all ridiculously underrated, esp compared to their peer groups. </p>

<p>Its a shame b/c high schoolers and families are the ones buying the mag and the info is set up for prospective grad students.

[/quote]
</p>

<p>I actually have to agree with this statement -- sorta.</p>

<p>I'm not sure exactly how peer assessment is calculated, but my sense is that it's based more on a school's overall reputation (and not really on either undergrad or grad specifically). </p>

<p>A lot of well-regarded state universities (Wisconsin, Illinois, Texas) have fairly high Peer Assessment scores because they have strong faculty research programs and great graduate programs. However, they have their minuses: mainly large classes, a high student-to-faculty ratio, and probably a higher percentage of less ambitious students. Yet they also have real strengths: strong faculty and strong potential for meaningful research.</p>

<p>Schools like Rice, Wake Forest, Emory, and Notre Dame are much better for undergraduates than for graduate students (in my opinion). These schools are smaller, offer better opportunities to interact with faculty (even if the faculty may not be as well known as at the larger schools), and have a tighter-knit community with a higher proportion of ambitious students. </p>

<p>Honestly, if I had a choice between Rice and Texas for undergrad, Rice is a clear winner. But Texas is likely a better school for a PhD.</p>

<p>I don't think they are underrated by peer assessment at all.</p>

<p>Vanderbilt and Emory are probably overrated. They lag in freshman retention and graduation rate, two very important quality factors.</p>

<p>Georgetown lags in faculty resources and financial resources. I don't think Georgetown is strong in sciences and math.</p>

<p>Wm and Mary lags in faculty resources and especially financial resources. </p>

<p>Wake Forest lags in selectivity. They enroll a relatively small percent of freshmen in the top 10% of their high school class.</p>

<p>Tufts and Notre Dame have faculty that are relatively weak in scholarship. </p>

<p>Rice, however, might deserve a higher PA....maybe 4.3 instead of 4.0.</p>

<p>Vanderbilt and Emory are overrated because they lag in their freshman retention and graduation rates? What are you smoking? Let's compare them to a few colleges with much higher PAs.</p>

<p>PA SCORE
4.6 Cornell
4.5 U Michigan
4.0 Emory
4.0 Vanderbilt</p>

<p>FRESHMAN RETENTION
96% Cornell
96% U Michigan
94% Emory
96% Vanderbilt</p>

<p>4-YEAR GRADUATION RATE
84% Cornell
70% U Michigan
82% Emory
85% Vanderbilt</p>

<p>6-YEAR GRADUATION RATE
92% Cornell
87% U Michigan
88% Emory
91% Vanderbilt</p>

<p>The only place where there is a substantial difference is the PA score. There may be a reason for this in the minds of academics, but I highly doubt that it is due to differences in freshman retention and graduation rates. </p>

<p>As for your other comparisons, if you consider Faculty Resources as a disqualifying factor for Georgetown (USN rank of 38th) and W&M (46th), then please explain how this impacts U Michigan (69th). And why doesn't this factor help Emory and Vanderbilt (tied for 10th)!! Again, I highly doubt that this is a factor in the minds of academics as they make their PA judgments. </p>

<p>And if you don't like Wake's level of selectivity due to its Top 10% number of 63%, then how about U Wisconsin and U Illinois, with 58% and 55% of students in this category, yet each has a 20% higher PA score than Wake? Are academics considering this in assigning their PA scores? I doubt it. </p>

<p>Finally, Rice. I think we all agree that Rice gets poor treatment in the land of academia. The interesting question, of course, is why?</p>

<p>The definition and application of class rank varies so much from school to school and state to state that it's hardly useful. The average unweighted GPA for Wisconsin and Michigan is 3.77 and 3.75 respectively, but Wisconsin reports 60% in the top 10% and Michigan 90%. Something seems off.</p>

<p>
[quote]
The definition and application of class rank varies so much from school to school and state to state that it's hardly useful. The average unweighted GPA for Wisconsin and Michigan is 3.77 and 3.75 respectively, but Wisconsin reports 60% in the top 10% and Michigan 90%. Something seems off.

[/quote]
</p>

<p>Barrons is correct here. </p>

<p>The percentage of an entering class in the top 10% cannot be viewed as a standalone metric. One needs to consider the admission agenda and mission of schools, especially public schools, and consider the share of OOS students. </p>

<p>In Texas, for instance, the top 10% rule all but ensures a high number of top 10% students in each class. Actually, they could easily have 100% if such a yardstick were important. However, it's also a fact that the very best students in Texas attend schools that do NOT rank at all. That is why it is interesting to compare the (detailed) admission data of Texas and Rice! </p>

<p>Of course, the admissions policies at the University of California contribute directly to the percentage of top-ten students. With an extremely low percentage of OOS students, this index cannot be compared to schools that recruit a much larger percentage on a national basis; nor do Berkeley's top 10% statistics bear much relation to, say, HYP's. </p>

<p>Fwiw, the same can be said for GPAs, since they come in all shapes and flavors and are absolutely not comparable on a state-by-state basis. One only needs to remember that the (self-reported) GPAs of students taking the SAT show that more than 40% graduate with a 4.0 average. Soon we'll have to believe that students either drop out of high school or graduate with a 4.0! The beauty of misleading statistics.</p>

<p>hawkette-
There are also schools that have graduation and retention rates similar to Vanderbilt and Emory but LOWER PA ratings (e.g. Brandeis). It all depends on which schools you choose for comparison.</p>

<p>The PA is a holistic index of overall quality. You have to consider schools holistically when you discuss PA. PA is related to many things.</p>

<p>What I am saying is that Emory and Vanderbilt (and the other schools) have an "Achilles heel" that might be affecting their PA rating. If schools like Cornell and Michigan have higher PAs but similar graduation rates, it is because there are other offsetting factors.</p>

<p>Once again, we need a refresher on what PA score is supposed to be measuring...</p>

<p>It is measuring "distinguished academic programs" (majors).</p>

<p>Universities that have higher PA scores have greater breadth and depth in the majors they offer and have more distinguished/renowned reputation for those majors.</p>

<p>PA may correlate to other measures on the USNews survey, but it's not intended to measure the same things.</p>

<p>"The definition and application of class rank varies so much from school to school and state to state it's hardly useful. The average UW GPA for Wisconsin and Michigan is 3.77 and 3.75 respectively but UW reports 60% in top 10% and UM 90%. Something seems off."</p>

<ul>
<li>I think a lot of high schools use weighted GPAs to calculate class rank (APs and Honors count more).</li>
</ul>
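<p>For illustration, here is a hypothetical sketch of one common weighting scheme; the 0.5 Honors bonus and 1.0 AP bonus are assumptions for the example, not any particular district's policy.</p>

```python
# Hypothetical weighting scheme (an assumption for illustration only):
# Honors adds 0.5 and AP adds 1.0 to the 4.0-scale grade points.
BONUS = {"regular": 0.0, "honors": 0.5, "ap": 1.0}

def gpa(courses, weighted=True):
    """courses: list of (grade_points_on_4_scale, level) tuples."""
    total = sum(g + (BONUS[level] if weighted else 0.0) for g, level in courses)
    return total / len(courses)

transcript = [(4.0, "ap"), (4.0, "honors"), (3.7, "regular"), (3.3, "ap")]
print(f"unweighted: {gpa(transcript, weighted=False):.2f}")  # 3.75
print(f"weighted:   {gpa(transcript, weighted=True):.2f}")   # 4.38
```

<p>The same transcript produces two different "GPAs," which is exactly why cross-state comparisons of GPA and class rank are so shaky.</p>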

<p>
[quote]
The PA is a holistic index of overall quality. You have to consider schools holistically when you discuss PA. PA is related to many things.</p>

<p>What I am saying is that Emory and Vanderbilt (and the other schools) have an "Achilles heel" that might be affecting their PA rating. If schools like Cornell and Michigan have higher PAs but similar graduation rates, it is because there are other offsetting factors.

[/quote]
</p>

<p>Come on! Can we please drop this entire silly proposal that there is ANYTHING scientific about the PA, starting with that regression analysis that is supposed to "explain" 94% of the data. Of course, since the PA has become a self-fulfilling prophecy for the lowly administrators and long-nailed secretaries who fill out that darn survey on behalf of their bosses by copying last year's numbers, it **has** to show "some" correlation. </p>

<p>In the meantime, here is a very simple point: by looking at the PA of Berkeley or a LAC such as Harvey Mudd, one cannot find any signs of correlation between the objective data and the PA score. The differences are simply to be chalked up to abject manipulation, geographical and historical cronyism, and total disregard for the INSTRUCTIONS that the survey is supposed to be about undergraduate education and cover items such as dedication to TEACHING. </p>

<p>So, let's yap more about Achilles heels at Emory and Vanderbilt but ignore the lack of objectivity of specific examples that are simply not ... representative of the "regression analysis." </p>

<p>Explain why Berkeley is so high. Explain why Harvey Mudd (the most selective LAC) has a much lower PA than Smith College, which has one of the lowest selectivities and trails in about every index.</p>

<p>Outliers matter here!</p>

<p>I had never heard of grade weighting until I came here. I think it is common in some states and uncommon in others, and it even varies district to district within a state.</p>

<p>For my post, I was less concerned with the fortunes of Emory and Vanderbilt than with the statement there was a relationship between graduation/retention rates and PA scores. I highly, highly doubt it. </p>

<p>Same with Faculty Resources in the cases of Georgetown and W&M. I would highly doubt that Faculty Resources are a key metric that academics are looking at when they assign PA scores. </p>

<p>PA can't really be explained except as a highly subjective number assigned based on a historical perception and a pecking order that academia has ordained. The problem, of course, is that there are now more and more excellent colleges in America with superb faculty, but the PA does not recognize this. Yet, does anybody really think that there has not been a closing of the difference from the top to the middle of academia in the last 20-30 years?</p>

<p>If you really want the answer you can find it in the numerical data for faculty that include NAS members, number of major awards won, US News and NRC faculty rankings and the like. It will correlate well with PA score. But then you won't have anything to rail about.</p>
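<p>That claim is testable in principle. As a hedged sketch (with made-up numbers; real inputs would be NAS membership counts, major-award tallies, NRC faculty rankings, and PA scores), a Spearman rank correlation asks whether the award ordering matches the PA ordering:</p>

```python
# Hypothetical sketch: rank-correlating a faculty-awards count with PA.
# Both columns below are invented for demonstration, not real data.

def spearman(x, y):
    """Spearman rank correlation, assuming no ties in either list."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

awards = [620, 310, 150, 90, 40]    # hypothetical major-award counts
pa     = [4.7, 4.5, 4.0, 3.8, 3.5]  # hypothetical PA scores

print(spearman(awards, pa))  # 1.0 here, since the orderings are identical
```

<p>A value near 1 would support barrons's point that PA largely tracks visible faculty honors; a value near 0 would support the skeptics.</p>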

<p>barrons,
If USNWR clearly stated that the PA score should be based on such things and a strict objective approach were adopted, then that would be far more acceptable than the current "method." I might not like it, but it would have a much firmer base of supporting evidence, and would eliminate all of the guessing/conspiracy theories about how these numbers are arrived at. </p>

<p>Do you really think that the difference in faculty quality between the #10-ranked university and the #50-ranked university is the same today, in 2008, as it was in 1980 or 1990? Everything in American life is subject to change, it seems, except for how academia sees itself.</p>

<p>What I don't get is that this dopey magazine came up with the measure, came up with the weighting to give it, and decided how to describe it to its readership.</p>

<p>And yet you regularly heap blame on academia: it's how academia sees itself, it's the pecking order they've ordained, a historical perception they can't get rid of, etc. </p>

<p>It seems to me you have two problems. One is that you don't like how respondents have rated certain colleges. The other is that USNews puts a lot of weight on those responses. These aren't the same thing.</p>

<p>Barrons is right. IMO, PA is more a score for faculty awards/achievements and program reputation. It is these visible awards that academics look for...
How can all colleges have "distinguished" programs? If all had the same credentials, what would make them distinguished? </p>

<p>Here is Berkeley's current faculty list of awards/honors:
American Association for the Advancement of Science Fellows 196
American Philosophical Society 36
American Academy of Arts and Sciences Fellows 227
A. M. Turing (computing) 3
Fields Medal in Mathematics 3
Fulbright Scholars 76
Guggenheim Fellows 358
Howard Hughes Medical Institute Investigators 11
Institute of Medicine members 11
MacArthur Fellows 28
National Academy of Education 8
National Academy of Engineering 85
National Academy of Sciences 132
National Medal of Science 14
National Poet Laureates 1
National Science Foundation Young Investigators Awards 61
Nobel Prize 7
Polk Award in Journalism 2
Pulitzer Prizes 4
Sloan Fellows (young researchers) 87
Wolf Prizes in agriculture, mathematics, chemistry, physics, medicine and the arts 4 </p>

<p>I suspect Berkeley's high PA is tied to these statistics, especially if compared to lower PA scoring universities. These are the highest awards/honors that academics respect, and academics will rate universities higher with faculties that hold these honors...it's part of what makes a university's academic programs "distinguished"...</p>


<p>My guess is we'll see some significant shifts in NRC rankings of faculty quality when the new rankings come out in the fall. The old NRC rankings are 13 years out of date. To my mind this is a better indicator of faculty strength than USN's PA rating because it's based on a fine-grained, discipline-by-discipline assessment. </p>

<p>Faculty quality does tend to be somewhat self-perpetuating, because universities tend to make strategic investments to protect and enhance the quality of their strongest departments, which lift up the entire university's reputation. For their part, star academics tend to prefer to be on the top faculties because they like the prestige and the enhanced professional opportunities that follow from working among the best and brightest in the field: constant exposure to cutting-edge research, opportunities to attract the best graduate students and research assistants, enhanced collaborative research possibilities, enhanced professional networking, and frequently better labs, library collections, and other tools of the trade. That said, faculties do rise and fall over time, and my guess is we'll see some major shake-ups. Also, over time, shifts in NRC rankings will likely filter down into changes in PA ratings, a much more impressionistic metric, and one that will probably lag the NRC rankings by several years.</p>

<p>
[quote]
Igellar-
Check out this thread. It explains the model that yields a high R-squared.</p>

<p>How to calculate a universitie's "Peer Assessment" score

[/quote]
Yes, I'm looking at your regression model now. You used very little of the "U.S. News hard data" that you claimed earlier in the thread you had used to create the regression model.</p>