<p>FWIW,
Below I list research universities roughly in order of how their various departmental faculties ranked in the 1995 NRC faculty quality surveys (41 core disciplines in humanities, social sciences, biosciences, math/physical sciences, and engineering), and compare that to US News PA ratings. As I expected, PA ratings correlate quite closely, although not perfectly, with the number of distinguished (#1, top 5, top 10, top 25) faculties at the school:</p>
<p>School (faculties ranked #1, top 5, top 10, top 25) PA
1. Berkeley (2, 23, 35, 36) 4.8
2. Stanford (5, 16, 31, 40) 4.9
3. Harvard (5, 20, 26, 30) 4.9
4. MIT (6, 18, 20, 22) 4.9
5. Princeton (2, 13, 22, 28) 4.9
6. Yale (5, 12, 18, 25) 4.8
7. Michigan (1, 7, 14, 35) 4.5
8. Cornell (0, 6, 19, 31) 4.6
9. Chicago (1, 7, 17, 29) 4.6
10. UCLA (0, 4, 15, 34) 4.2
11. Columbia (1, 7, 13, 29) 4.6
12. Wisconsin (0, 2, 14, 33) 4.1
13. Penn (0, 3, 15, 31) 4.5
14. Caltech (3, 8, 12, 18) 4.7
15. UIUC (0, 4, 10, 23) 4.0
16. Texas (0, 1, 7, 28) 4.1
17. Duke (0, 5, 7, 19) 4.4
18. Minnesota (1, 2, 5, 22) 3.7
19. JHU (0, 0, 8, 22) 4.6
20. Northwestern (0, 1, 6, 18) 4.3
21. UNC (0, 0, 3, 19) 4.2
22. UVA (0, 1, 5, 16) 4.3
23. NYU (1, 1, 2, 15) 3.8
24. Brown (0, 0, 2, 17) 4.4
25. WUSTL (0, 0, 3, 11) 4.1</p>
<p>Not in top 25:
Emory (0, 1, 1, 5) 4.0
Vanderbilt (0, 0, 2, 4) 4.0
Georgetown (0, 0, 0, 1) 4.0
Notre Dame (0, 0, 0, 4) 3.9
Rice (0, 0, 1, 6) 4.0
CMU (0, 1, 1, 8) 4.2</p>
<p>Some caveats: First, I did this late at night and could have made clerical or transcription errors; I welcome you to check and correct the data, especially at the lower end of my ranking, where it's possible I missed a school, say a Rutgers or a Pitt, that should have made the top 25. Second, the NRC rankings are now quite old; new ones are due out in the fall and are eagerly anticipated. Third, the survey covered only 41 disciplines; schools could have outstanding strengths in other fields that go unrecorded here. Fourth, the sheer counting exercise obviously disadvantages universities that don't have, say, engineering (8 of the disciplines), as well as science-and-engineering schools like Caltech that would not be expected to have strengths in the humanities (10 disciplines) or the social sciences (7 disciplines); note, however, that MIT did extremely well overall, with some strengths in the humanities and social sciences.</p>
<p>Despite these caveats, I do think the data are quite revealing. They show a pretty close but imperfect correlation between PA scores and the more detailed, discipline-by-discipline peer assessments of faculty strength done in the NRC rankings. Both are ultimately subjective measures, though of the two I much prefer the NRC ranking because it asks knowledgeable people to rank the faculties in their own discipline. That the correlation with PA is as close as it is confirms my intuition that PA is at least a rough proxy for how a school's faculties are seen by their peers in the academic community.</p>
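<p>For anyone who wants to put a number on "pretty close but imperfect," here is a minimal sketch of the kind of check I have in mind. This is my own quick illustration, not part of the NRC or US News methodology; it uses the top-25 column from my table above as a single crude proxy for NRC faculty strength (that choice of proxy is my own simplification) and computes a Spearman rank correlation against PA.</p>
<pre>
# Rough check: rank correlation between US News PA and one NRC proxy
# (the count of top-25 faculties, transcribed from the table above).
from scipy.stats import spearmanr

schools = {
    # school: (top-25 faculties, PA)
    "Berkeley": (36, 4.8), "Stanford": (40, 4.9), "Harvard": (30, 4.9),
    "MIT": (22, 4.9), "Princeton": (28, 4.9), "Yale": (25, 4.8),
    "Michigan": (35, 4.5), "Cornell": (31, 4.6), "Chicago": (29, 4.6),
    "UCLA": (34, 4.2), "Columbia": (29, 4.6), "Wisconsin": (33, 4.1),
    "Penn": (31, 4.5), "Caltech": (18, 4.7), "UIUC": (23, 4.0),
    "Texas": (28, 4.1), "Duke": (19, 4.4), "Minnesota": (22, 3.7),
    "JHU": (22, 4.6), "Northwestern": (18, 4.3), "UNC": (19, 4.2),
    "UVA": (16, 4.3), "NYU": (15, 3.8), "Brown": (17, 4.4),
    "WUSTL": (11, 4.1),
}

top25 = [v[0] for v in schools.values()]
pa = [v[1] for v in schools.values()]

rho, p = spearmanr(top25, pa)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
</pre>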
<p>A second observation is that faculty strength as measured in the NRC rankings is highly concentrated at the top: CHYMPS (with "C" here for "Cal") utterly dominate the #1 and top-5-in-their-field positions, followed by a second tier led by Michigan and Cornell and extending down through the Penn-Caltech range, with most of their programs in the top 10 or top 25 but far fewer at #1 or in the top 5. From there it's a pretty steep drop to a third tier consisting of Illinois-Texas-Duke-Minnesota-JHU, with a respectable number of top-10 and top-25 programs, and a quick falloff after that.</p>
<p>Now of course, the anti-PA people will say this only proves what they've been saying all along: the PA rating is a proxy for GRADUATE program quality. After all, the NRC ranking is a ranking of graduate programs. But I've tried to isolate here the NRC measure of "faculty quality," believing as I do that a strong faculty in a field is a strong faculty in that field, whether you're at the undergraduate or the graduate level. Things like student-faculty ratio, class size, and the percentage of classes taught by non-tenured/tenure-track faculty obviously do matter for the quality of undergraduate education, but those things are measured elsewhere in the U.S. News rankings (measured poorly, perhaps, but they're there). But I maintain that if you really want to do philosophy (for example) at a high level as an undergraduate, you need to work with the top philosophers; it's always been that way and always will be. To some extent they're spread around, but by and large they're concentrated on the strongest philosophy faculties. Knowing where they are matters in evaluating an undergraduate institution's strength; it's only one factor among many, but it's an important one.</p>
<p>Notice, however, that the correlation between NRC faculty strength and PA is not perfect. By and large, the big public universities have US News PA ratings slightly below their NRC faculty strength rankings, and many privates, including some of the non-HYP Ivies, have PA ratings a little higher than their NRC faculty strength rankings. I take this to suggest there's some discounting of the publics' PA ratings to account for large class sizes, student-faculty ratios, etc., and some PA "bonus points" awarded to schools like Brown for their alleged high teaching standards, even though on a peer assessment of faculty quality by discipline their faculties do not excel. So I'd conclude the PA rating probably reflects administrators' summary assessments of overall faculty strength (largely research- and scholarship-driven), subjectively adjusted upward or downward a bit based on factors like class size, student-faculty ratio, and reputation for teaching excellence, all of which feed the "teaching" component of PA but are notoriously difficult to measure.</p>
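<p>To see that pattern in the numbers rather than by eyeballing the table, one could continue the snippet above (it reuses the schools dictionary defined there): fit a simple least-squares line predicting PA from the top-25 count and look at which schools sit above or below the line. Again, this is only my illustrative sketch, under the same crude single-column proxy assumption.</p>
<pre>
# Continues the snippet above (uses the `schools` dict defined there).
# Fit PA ~ intercept + slope * top25_count, then inspect the residuals:
# positive = PA higher than NRC strength alone would predict; negative = lower.
import numpy as np

names = list(schools)
x = np.array([schools[n][0] for n in names], dtype=float)
y = np.array([schools[n][1] for n in names], dtype=float)

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

for name, r in sorted(zip(names, residuals), key=lambda t: t[1], reverse=True):
    print(f"{name:14s} {r:+.2f}")
</pre>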
<p>Bottom line, I'd like to see the current PA score replaced by a more straightforward "faculty excellence" score, much like the current NRC rankings, discipline by discipline. To my mind this is valuable information, especially at the level of research universities, and even for undergraduates. Perhaps it's because I'm an academic and my own kids have a distinctly academic bent, but I want my kids studying with top scholars in their fields and being exposed to, and engaging in, the current, cutting-edge research and intellectual debates in those fields; and if they're focused and motivated, they can certainly do that as undergrads, at least by the time they're upperclassmen. Teaching excellence is more difficult to measure; things like student-faculty ratios and class sizes get at it only imperfectly. Perhaps the best we could do is a survey of student satisfaction with the teaching (like the one Hawkette introduced above), which in my judgment is a poor proxy for the actual quality of the teaching because it's often the best entertainers who get the best student evaluations; but at least it would tell us how satisfied the students are, and that's something.</p>
<p>But in the meantime, PA does tell us something.</p>