Are there differences in faculty quality & access to them among the USNWR Top 30?

<p>Hawkette, the erroneous info that I am referring to is that, in your opinion, PA is "bunk." While I am quite sure that there are points of difference when assessing PA, and that it is perhaps not a perfect number, at least it is a number assigned by educational professionals...people who are deemed knowledgeable enough to give a credible opinion. You refuse to see that your opinion is not based on anything credible by comparison. My problem with your constant harping on this issue has nothing to do with individual schools. It has to do with tearing down the trust of the readership in those estimations. Don't you see that those opinions have far more validity than any lay person's? And, believe me, my profession notwithstanding, I do consider myself amongst the lay population when it comes to assessing top colleges and universities. I do trust that the people who are asked to make the peer assessments know far more than I do, and I therefore have reasonable faith in their insights. Do you know that, when ranking all the top schools nationally, the European ratings give 40% weight to PA? It seems that here in the US, some of us are completely losing faith in any sort of authority. </p>

<p>The impression that you give is that you are an authority, when I don't believe that you are. Are you? Is there something you are not revealing to the readership? There are impressionable kids and nervous parents of first-time college-bound kids devouring the info on this site. Should they believe the professionals? Should they trust the professionals? Clearly, you are advising them not to. This is what I refer to as erroneous information. You are obviously quite smart, which comes across in your posts. That is precisely why the sort of bias you are putting out there seems so dangerous to me.
I absolutely do not begrudge you your opinion. What I am concerned about is the constant emphasis you place on influencing the readership not to trust the people who should know.
A curious question. Do you have children in college?</p>

<p>gabriellah,
You place faith in the PA number when even many members of the academic community question its value. You keep saying that academics should know better than the rest of us, but certainly you are aware that many have publicly stated their misgivings about the PA survey and its many problems and shortcomings. The academics themselves don't even think it is a credible number (although the ones who benefit from the system aren't ready yet to throw it overboard). You ask should the high school student or his/her family trust the PA? IMO, they should not with one important exception. They should trust it if someone can accurately explain what it is measuring and who is responding. Then the views of the academics would have value and we would be able to interpret its meaning. But the current PA form falls woefully short of this standard. </p>

<p>As a believer in objective data (which ironically is usually the case with most academics), I put little faith in subjective opinion WHEN NO STANDARD OF MEASUREMENT IS PROVIDED. Subjective opinion for the interpretation of objective data can be useful, eg, interpreting what the numbers mean for Top 10% students at a school, public or private. Or how useful the information on alumni giving is in making a college ranking or a college selection. Neither you nor I know what the academics mean with the PA rankings individually or collectively and we don't even know which academics responded. </p>

<p>Like you, I am a lay person who has a view. Many others agree with my view while others, like you, believe in the delivered wisdom of the PA. I am willing to let the arguments be made and let each person decide for themselves on how valid and useful PA really is.</p>

<p>To get to the title of this thread, which concerns faculty quality: the NRC survey addresses that quite well, at least for the research universities. It is not just a popularity contest.</p>

<p>Access to faculty really requires on-the-ground information. Not all star professors are good at teaching undergrads, or are willing to try. I don't know of any way to derive this information from data such as how much they are paid or the ratio of faculty to students. You really need to talk to people who know what it is like at that particular school.</p>

<p>I do think academics know a lot about what goes on in their competitor colleges, particularly within their fields. So if you know you are interested in economics, then a professor of economics at one college could probably tell you a lot about the undergraduate experience in ec at the places with which her college competes for faculty and students. The president of the university might have a general idea of how they stack up on student yield and cross admits, and a fairly detailed knowledge about faculty recruiting.</p>

<p>Hawkette, I am also a data-driven type. But that is why it puzzles me that you keep declaring that certain universities are academically equivalent to others that are ranked higher in PA, without ever saying what data you are using to come to those conclusions. You don't like PA, fine. Show me the academic quality indicators you are using. Citation frequency? Grant support? Major faculty awards? What?</p>

<p>45 percenter,
One other point on Rice and its high IS student population (46%). Stanford has 44% IS and that hasn't hurt it at all. I just think that Texans get a bum rap from the educational establishment (though truth be told, I don't think that many Texans are losing sleep over what the academics think about them).</p>

<p>afan,
I think you make a lot of good points about the level of knowledge that academics have about their competitor colleges. It is reasonable that they would have pretty decent knowledge of their own departments at other places. Does that place them in a position to judge an entire university? I would conclude not, but at least some aspect of their opinion would be informed. Yet please also realize that the folks being asked are the administrative offices of the school, whose knowledge of individual departments is likely much less. They may be better placed to make an overall judgment, but I would bet that many (most?) would privately concede that their true level of knowledge about competitors is relatively superficial. Some have even made this point publicly as the PA debate goes on in the academic community.</p>

<p>Re your question about methodology, I am expressing an opinion based on my interpretation of the data and information that the colleges produce, personal experiences with many of the colleges and discussions with students and, more prominently, employers about various schools.</p>

<p>Rice has 16 NAS members and won 12 major faculty awards in the last year measured (2004). Princeton had 90 and 25, respectively; PU's higher PA score seems well deserved. Emory is at 17/16 (NAS members/faculty awards), and most of its NAS members are medical. Vandy is at 17/24, Wash U at 44/29, and Cornell at 64/36.</p>

<p>As I believe it's been pointed out by someone else in one of these threads, you have to be careful comparing ACT scores with the schools in the Northeast (e.g., Ivies). Very few applicants to those schools submit ACT scores compared to the number who submit SAT scores, so the ACT score ranges for those small numbers may not have much statistical significance.</p>

<p>I have said my piece on this issue. I just sincerely hope, for the sake of the kids for whom this is so important, that they seek the advice of the people with the knowledge and credibility to advise them. My only advice to those kids is to seek to get a good opinion, and not listen to some position, taken for who knows what reason, by an anonymous contributor whom you really know nothing about. The best advice for anyone applying to any school is to make sure that you are getting the best unbiased opinion about what you are doing from someone you have faith in. Please do not listen to strangers with possible agendas and/or biases. This is all I have to say on the topic.</p>

<p>So Hawkette, you do not perform the detailed analysis that you require of the academics who respond to the PA survey. Instead, you apparently do the same thing they do, basing your impression on unspecified general information. In this, apparently, you are helped by not being an expert on higher education. Yet, somehow you conclude that you are correct, and the consensus of a couple thousand academics is wrong. Such modesty.</p>

<p>Barrons,</p>

<p>Interesting information. NAS membership is simple to interpret. What did you use for "major" faculty awards?</p>

<p>Afan, I think you nailed exactly what bothers me about the anti-PA argument. I've never understood why PA is so scorned while selectivity is held up as absolute proof of a school's worthiness. If one student body is as strong as another's, why should anyone seriously interested in 'rankings' not take into account all the fluff that determines selectivity?</p>

<p>[quote]
although the ones who benefit from the system aren't ready yet to throw it overboard
[/quote]
</p>

<p>You make an ongoing implication (sometimes it's an outright accusation) that the academics that Synovate surveys are working "behind the scenes" to influence the way USNews does the rankings. I'd like to see you back this statement up. USNews is a separate, for-profit entity that makes its own decisions about the rankings processes, what is included, and how each item is weighted. Universities may try to have a say in that, but it is ultimately up to USNews. Why do you give presidents, provosts, and admissions directors credit for its inclusion? What measures are they taking to preserve it?</p>

<p>hoedown,
My reference was to the brouhaha this past year about schools challenging the PA survey aspect of the USNWR rankings. Many top schools have expressed sympathy for the position, but none have signed on to the boycott. That's all I was referring to.</p>

<p>afan,
I appreciate your comments and your arguments perhaps more than you give me credit for, but nonetheless, if I were ever to make my own ranking or PA ranking, at least I would explain what my process would be and I would be more than happy to disclose my participation. </p>

<p>I have actually done something similar to this with a thread a few months back on creating an alternative ranking. In case you missed it, here is the link:</p>

<p><a href="http://talk.collegeconfidential.com/showthread.php?t=352886">http://talk.collegeconfidential.com/showthread.php?t=352886</a></p>

<p>With specific regard to your charge that I think I am right and the academics wrong, that has not been the path of my arguments or thinking (though it likely could be interpreted that way). I have, however, pointed out the various flaws in the PA system, which I (and many, many others) continue to believe are highly legitimate criticisms. Many of the grading patterns appear suspect to me and other consumers, and even to some in the academic community. What I am asking for is greater clarity and also an enlarged universe of those passing judgment on the work of a faculty and on what a student can expect when he/she gets to a college campus (I suggest adding the NSSE survey or something like it and some type of employer survey, perhaps even on a regional basis). My motives are not nefarious, but rather very similar to the well-expressed thoughts of gabriellah above. It is just that she and I come at this issue from different perspectives.</p>

<p>Finally, in response to ramses 2's post about selectivity, this is a very small factor in the USNWR survey (1.5%) and I agree that it has marginal benefit for relatively small differences in acceptance rate or for schools where the applicant pool is thought to be self-selecting, eg, U Chicago.</p>

<p>Hawkette, selectivity may be a small component, but it probably does more to cloud the search for best fit than any other part of the rankings. Every parent and child pays far more attention to that rate than to PA, stats, or anything else. It implies "best" when best fit is really more important. How many times have we seen kids with lists that have only selectivity in common? Those are kids who have given little thought to their education; they just want to be considered 'best'.</p>

<p>[quote]
I suggest adding NSSE survey or something like it and some type of employer survey, perhaps even on a regional basis
[/quote]
</p>

<p>I think NSSE has a lot of potential value, but it bears some of the same limitations that earn your ire when it comes to peer assessment.</p>

<p>For example, I don't think it's very clear to many consumers what the "benchmarks" specifically and exactly measure, and I doubt many people go and look at the individual questions or ask whether factor analysis was used and whether responses were weighted. Therefore NSSE doesn't offer the "transparency" you're demanding from PA, at least not without a lot of digging through technical notes. And yet I believe NSSE scores can still be useful to students and families. It seems inconsistent to me that you herald some kinds of opinion while you dismiss PA for technical reasons that are no different from the shortcomings offered by other subjective opinion instruments.</p>

<p>hoedown,
I agree with your points above about NSSE. Rome wasn't built in a day and no doubt developing any kind of effective survey of students (and alumni and employers if those ideas were ever included) would take some real effort and probably multiple iterations to get it done right. But I don't think we should expect to solve all the problems on day one. If the idea has merit, then begin the process and adjust as you learn more about what it is telling you, what it is not telling you, etc. </p>

<p>joshua007,
We're now on page 5 and over 70 posts and your first contribution is an attack. Welcome to the discussion. </p>

<p>I'm not sure what "reality and actuality" you are referring to, but I hope you will elucidate.</p>

<p>If you'd like to engage on substantive issues, then I am perfectly willing to do so. And if you feel that you are able to make an effective argument without facts, then I look forward to reading it.</p>

<p>Of course we all know that Joshua's opinions are more important than facts and statistics.</p>

<p>"My reference was to the brouhaha this past year about schools challenging the PA survey aspect of the USNWR rankings."</p>

<p>While some schools are objecting to the PA aspect, a huge majority of schools object to the sway the <em>entire</em> USNWR ranking has on admissions. I have heard this objection from faculty/administrators in the Ivies, top 50 national universities, AND the top LACs. The schools lower on the rungs of selectivity don't object as strenuously because they know that their numbers come more from word-of-mouth, guidance counselor recommendations, and regionalism. Really, the rankings, and thus the PA, ONLY affect the top colleges and universities. </p>

<p>I have stated before that the PA is a valid measure of prestige and therefore of the perceived value of a specific degree outside of the college itself. Of course, taken alone, it offers little; however, combined with other factors, it adds a much-needed dimension. Ideally, none of it would matter much, but, with students arriving for campus tours clutching USNWR, the PA gives a much broader view of individual schools.</p>

<p>momwaitingfornew,
I agree with your comment about how academics feel generally about the USNWR survey. No one likes being judged and ranked and potentially embarrassed in front of a national audience. Nor does any college president like to deal with alumni or students or the college's board busting on him/her about the ranking and why it isn't higher. I'd be surprised if they liked it.</p>

<p>Having said that, much of the data that USNWR draws from is already in the public domain via the CDS. Thanks to USNWR, students and families now have a place they can go to get the data that is important to them (and it varies from family to family). Yes, they are a profit-making enterprise and their rankings can cause controversy (which websites like CC could not live without), but they package it all so well and I think we are all better off for having USNWR than not.</p>

<p>As for your comment about PA,</p>

<p>"PA is a valid measure of prestige and therefore the perceived value of a specific degree outside of the college itself"</p>

<p>this may be true, but do we know that is what the graders had in mind when they assigned their marks? More importantly, this is the perspective of one group (anonymous, because we don't know who responded). What about the opinions of the other stakeholders: students, alumni, and employers? If I were a high school student looking at colleges, PA would mean very little to me if it is concerned with things like NAS awards or how many times a certain professor got published in an obscure academic journal. What I care about is what do the other students think of Professor Jones and his/her ability to teach a class? Or how well does ABC company think of our faculty and their ability to effectively prepare me for a postgraduate career? These are legitimate and really important questions that PA does nothing to answer.</p>

<p>"this may be true, but do we know that is what the graders had in mind when they assigned their marks?"</p>

<p>It doesn't matter what they "had in mind" exactly. They are rating their peers. Trust me, every academic knows where the best students and colleagues are, although it would be indeed difficult to assign an exact pecking order. Start throwing in a few others, and tiers of ten or so begin to emerge. It's the exact <em>ranking</em> (as opposed to grouping) that causes problems, but that's what the other stats take care of -- supposedly.</p>

<p>"What about the opinions of the other stakeholders-students, alumni, employers?"</p>

<p>The opinions of students and alumni don't matter a whit when it comes to ranking colleges. They don't have a broad enough experience to compare. Employers would be a different matter, although they are in no position to judge the scholarship and teaching environment of particular schools. Their knowledge lies in the desirability of graduates within their field, not in a more general understanding of higher education. For example, a business might hold graduates of NYU's Stern in high regard, without knowing that the rest of NYU (with the exception of Tisch) is known for being much weaker. (Apologies to NYUers, but this does indeed influence NYU's placement in the rankings.) You forget that universities send a sizeable percentage of graduates to graduate school instead of directly into the workforce. Where they end up often reflects peer assessment -- and the end result, the professional degree, is what most major employers see these days. To say employers are an important part of peer assessment also neglects majors other than the purely professional. Employers might be able to rank engineering and business students, but they most likely are not knowledgeable about French, psychology, and history majors. They probably aren't even aware of the style of teaching at a given school.</p>

<p>Because academics look for jobs at a necessarily wide base of universities, because they collaborate with colleagues at other institutions, because they read papers and talk to professors in departments other than their own, because they interact with colleagues at national conferences, and because they see the quality of graduate students applying to their university from peer institutions, they are in the best position to judge the quality of other schools. It's a misconception that professors and administrators teach/work in a vacuum.</p>

<p>Momwaitingfornew, that's absolutely the best, most reasoned explanation of the PA I've ever read. Well done.</p>

<p>I haven't followed this debate. The USNWR ratings generally seem a bit arbitrary and gameable but I do have to challenge one statement from hawkette:</p>

<p>"IMO, faculty quality does not differ that greatly among the top 30 undergraduate colleges in America. But it’s not that I’m trying to pull the highly ranked schools down. Most of the schools with high PA scores are terrific schools with high quality students and faculty, eg, gabriellah’s Johns Hopkins (4.7 PA score). But so also are schools like USC (3.9) and W&M (3.8) and Wake Forest (3.5) and Notre Dame (3.9) and Boston College (3.6) and Tufts (3.7) which all get shafted by this current methodology. If you’ve ever been to any of these schools and met the students and/or recruited at any of these schools, you know right away the quality of the product coming out of there (and perhaps even to some extent you can see what influence the faculty might have had in developing an individual). IMO the differences in the faculty and the undergraduate learning experience at these schools is close (or equal to or even better than) to that at Johns Hopkins or any of the other much more highly PA-rated schools."</p>

<p>There are two and maybe three separate things that you are confounding. One is faculty quality. This can mean several things, but the key one is research productivity and reputation. The second is the quality of the education. The third is the quality of the students.</p>

<p>It is entirely possible that quality of education and quality of faculty research are either uncorrelated or negatively correlated. For example, professors who end up at LACs go there to teach, frequently because they love teaching, whereas professors go to HYPSM to do research, are judged on research, and are promoted on research. They may be good teachers and they may not, but teaching is not what motivates most of them.</p>

<p>However, that does not mean that faculty quality in terms of research capability and influence on the field does not differ significantly among the schools you describe. Harvard is dramatically superior to Duke in all of the fields I know about. There are good people at Duke, but there are probably quite a number of tenured professors at Duke who wouldn't stand a chance of passing the tenure hurdle at Harvard. I think the same is true with each of the comparisons you raised in an earlier post. There are some I can't evaluate, like USC v. UCLA, where I'd be operating without much data. But, if faculty quality has to do with research and influence, in the areas that I know, your statement that the top 30 faculties are about the same would be markedly incorrect.</p>

<p>Again, that does not mean that ABC company thinks students are better prepared at Harvard than they are at Duke. The Big 4 accounting firms don't try to hire from the top business schools. Accounting firms don't actually want the smartest people, in the useful but narrow sense that colleges select for. Neither do some manufacturing corporations. </p>

<p>However, if I were a high school student who thought I might want to go on for advanced degrees, especially in academic rather than professional areas, I would very much want to consider the faculty quality. And, if I were planning to go on in professional areas, I'd like to consider the impact of perceived student body quality on professional school admissions. My guess is that this is pretty tightly correlated with perceived faculty quality. </p>

<p>While I don't think perceived faculty quality necessarily tells us about student quality or about quality of teaching, I think that it would be a useful measure for high school students planning to go on in academia or planning to go on to professional schools.</p>