Just how ridiculous is Peer Assessment?

<p>Tufts is certainly one of the most perplexing rankings. I think it has to do with the perception that it's THE Ivy safety.</p>

<p>the thing about tufts--well think of it this way--if you took tufts out of boston would it still be "the" ivy safety? its location has a lot to do with its success, imo.</p>

<p>The Tufts faculty seem to lack elite-level profs: only 5 NAS members and only 6 major faculty awards won. Better schools like Chicago have 51 NAS members and 23 award winners. The real elites have over 100 NAS members and 50+ major faculty awards.</p>

<p>I think an emphasis on Peer Assessment is problematic because it emphasizes perception over reality. </p>

<p>For example, look at how majors like undergraduate business are ranked by US News. They are based solely on peer assessments from deans and department heads, and don't take into account real output metrics like: </p>

<p>How many graduates have jobs upon graduation?<br>
What's the graduation rate?<br>
What's the admission rate?<br>
What's the average starting salary for graduates?<br>
How do the students themselves rate their school?<br>
How do company recruiters rate the various schools?<br>
How good are the schools' alumni at getting into top MBA programs?</p>

<p>A school may be perceived as worse than another, but is it truly worse if its graduates are doing better?</p>

<p>ADDENDUM:
Overall, I hate rankings, and for US News to make more sense to me, it needs to include more output metrics to determine if a school is truly doing its job. It already includes graduation and retention rates, but how about the # of undergrads who end up in top graduate programs? Or the number of students who are recruited and hired upon graduation? Or the # of graduates who are leaders in various fields and industries, Fortune 500 companies, or the world's greatest philanthropic organizations?</p>

<p>The # of possible metrics is endless, but showing more outputs can reveal whether a school is actually creating thinkers, successful alumni, and future leaders of America and the world.</p>

<p>i didn't read through all of this thread, but what is stopping someone who is filling out the survey from 'underrating' their peer schools in order to inflate their own institution??</p>

<p>
[quote]
As I have shown, they are extremely flawed and in no way indicative of a college's true value.

[/quote]
</p>

<p>Sorry, but you haven't shown anything of the sort, only your opinion of how the schools should be ranked/assessed. While I do not denigrate your opinion, I also do not denigrate the collective opinion/vote of academic deans and provosts.</p>

<p>a) globalist, that would require work on their part rather than using common data set numbers
b) huskem, nothing. notice the ~50% return rate from the surveys. guess what, the ivies have a 100% return rate. brown will rate hypccdp as a 5.0. regardless of how they fare, if the other ivies remain the dominant "group" of universities in america, it continues to bolster their own rep.</p>

<p>My main issue with peer assessments is that schools can suffer by association.
For example, when ranking Ivies, one subconsciously compares them to other Ivies. Cornell doesn't look nearly as good when it's standing next to Harvard.
But when ranking a school of similar quality, such as Emory, there's no immediate temptation to compare it to Harvard.
Therefore, a reviewer might give Emory a better rating than Cornell, even though, in a head-to-head comparison, they would probably prefer Cornell.</p>

<p>xiggi:</p>

<p>hope you didn't miss your favorite ed guy, Lloyd Thacker, on the Today Show. Man, I hope his PR team gives him better talking points, since he was creamed by the editor from USNews. </p>

<p>Lloyd basically made two points (which in my mind are inconsistent): 1) the USNews ranking is bad because "they measure the wrong things"; 2) he is bringing colleges together at Yale because "colleges don't know what to measure..." How, then, Lloyd, can USNews be measuring the wrong thing if the colleges themselves don't know what the right things are? :rolleyes:</p>

<p>Be a little bit careful in sorting the columns in the online edition. Example: if you sort the SAT column, you are going to get a skewed view of that criterion - it will sort by the first of the two SAT scores given and return misleading results. For instance, based on combined SAT scores, Tufts isn't higher than half the Ivies as stated above. Still, it is certainly high, and Tufts' PA score, which seems much lower than it should be, really affects its rank.</p>
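<p>To illustrate the pitfall (a minimal sketch with made-up scores, assuming the table shows the 25th-75th percentile range as a single string like "1290-1480"): a naive sort compares the strings, which effectively orders schools by the first score alone, while sorting by the midpoint of the range gives a fairer picture.</p>

<p>
[code]
# Minimal sketch of the SAT-column sorting pitfall.
# Scores are made up; assumes the table shows a 25th-75th
# percentile range as a single string like "1290-1480".

schools = [
    ("School A", "1290-1480"),
    ("School B", "1310-1430"),
    ("School C", "1260-1540"),
]

def midpoint(sat_range):
    """Average of the 25th and 75th percentile scores."""
    low, high = (int(s) for s in sat_range.split("-"))
    return (low + high) / 2

# Naive sort: string comparison, i.e. effectively the first score only.
naive = sorted(schools, key=lambda s: s[1], reverse=True)

# Fairer sort: compare midpoints of the full range.
fair = sorted(schools, key=lambda s: midpoint(s[1]), reverse=True)

print([name for name, _ in naive])  # ['School B', 'School A', 'School C']
print([name for name, _ in fair])   # ['School C', 'School A', 'School B']
[/code]
</p>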

<p>And YAY for Richard Cook! Allegheny, and a good number of schools in its range, provide an excellent education - not that USNEWS devotees would ever know it. Each and every senior at that school is required to write and defend a comprehensive senior-year paper, and the quality of the faculty, their devotion to students, and the internship, research, and service opportunities are all excellent. </p>

<p>As is the case for a number of schools which fall below the top 50 or even top 75 level. But the longer a ranking system like this exists, the more it becomes, to a degree, a self-fulfilling prophecy. As long as students, quite naturally, are tempted to measure their own worth by association with the highest-ranked schools, they are going to overlook some excellent schools in their college search. Schools which were once lumped together in peer groups as "most selective" or "very selective" are now given numbered rankings which imply a greater level of distinction than could possibly exist. And as a result, certain excellent schools have fallen from one level to the next because they can no longer attract top students, without any loss in quality of education - absolute or relative.</p>

<p>The most useful way to use the compiled information in the annual USNEWS rankings is on a column-by-column basis, and in association with other guides. That the most heavily weighted element of the rankings is a rating which is purely subjective, subject to manipulation, and hardly, as Richard Cook concluded, one which "peers" are universally qualified to make, has helped create artificial and in many cases unfortunate distinctions which, over the long term, detrimentally affect some fine institutions and disservice students. But it is helpful to look at class size, faculty-student ratio, etc. as part of the selection process.</p>

<p>
[quote]
xiggi:

hope you didn't miss your favorite ed guy, Lloyd Thacker, on the Today Show. Man, I hope his PR team gives him better talking points, since he was creamed by the editor from USNews.
[/quote]
</p>

<p>BB, thanks for mentioning it. I did not see the piece, but it does not surprise me. I have often compared him to a mad dog who barks at every passing car without knowing WHY he does it. I have seen other interviews, and he does poorly; they show how limited his views and understanding of the issues are. </p>

<p>Most of his "ideas" are deeply flawed and hypocritical, even when --by accident-- he ends up attacking the Peer Assessment. </p>

<p>Lloyd Thacker is a hired mercenary who is a mouthpiece for colleges. He does not represent the interests of families and students --despite his claims.</p>

<p>
[quote]
are a pretty good indication that the education establishment is loathe [sic] to permit these colleges to rise in the rankings.

[/quote]
</p>

<p>Using the verb "permit" hearkens back to your repeated insinuation that respondents work in collusion to make sure that their answers collectively accomplish certain goals.</p>

<p>I think that's ridiculous. Maybe their rankings are underinformed, or biased, but it has nothing to do with "permitting" rankings to stay at or reach a certain level, or plotting what someone's ultimate rank will be once the thousands of ratings are compiled by the folks at USNews.</p>

<p>I recall seeing someone post last year's ranking were the PA score not included. I for one think that someone would do the CC community a great service if they created a thread that had the ranking (if only of tier 1 universities and tier 1 LACs) that excluded the PA score. Maybe then we would have a better ranking that is based primarily, if not solely, on data. I wouldn't mind doing such a thing, but I'm not sure how to go about doing it.</p>
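<p>For what it's worth, here's a rough sketch of how such a re-ranking could be computed (the component weights and all subscores below are invented for illustration; US News doesn't publish per-school subscores in a convenient form). The idea is to drop the PA component and renormalize the remaining weights before recomputing each overall score:</p>

<p>
[code]
# Hypothetical sketch: re-rank schools with the PA component removed.
# Weights and subscores are invented for illustration only.

weights = {
    "peer_assessment": 0.25,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_performance": 0.05,
    "alumni_giving": 0.05,
}

# Each school's subscores on a 0-100 scale (made-up numbers).
schools = {
    "School X": {"peer_assessment": 96, "retention": 90,
                 "faculty_resources": 85, "selectivity": 88,
                 "financial_resources": 80, "graduation_performance": 75,
                 "alumni_giving": 60},
    "School Y": {"peer_assessment": 78, "retention": 94,
                 "faculty_resources": 90, "selectivity": 92,
                 "financial_resources": 85, "graduation_performance": 80,
                 "alumni_giving": 70},
}

def overall(scores, drop=()):
    """Weighted average of subscores, renormalized after dropping components."""
    kept = {k: w for k, w in weights.items() if k not in drop}
    total = sum(kept.values())
    return sum(scores[k] * w / total for k, w in kept.items())

for name, scores in schools.items():
    print(name,
          round(overall(scores), 1),                             # with PA
          round(overall(scores, drop=("peer_assessment",)), 1))  # without PA
# School X leads with PA included; School Y overtakes it once PA is dropped.
[/code]
</p>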

<p>brand, that would definitely be interesting to see</p>

<p>I'll summarize how the top 10-15 schools would change if the PA score weren't included: Chicago, Cornell, JHU, and Columbia would all fall.</p>

<p>Brown, Duke, Dartmouth would all rise.</p>

<p>also, peer assessment levels the playing field somewhat. for example, everyone harps on berkeley being the greatest thing since sliced bread--yet if you removed PA from the rankings, I wouldn't be surprised if berkeley dropped to the 40s. Look--the schools that rank directly above berkeley (Notre Dame, Vanderbilt, Emory, Rice, etc.) have PAs of 3.9, 4.0, and 4.1, versus berkeley's 4.8. So in a sense, peer assessment is an X factor some schools and USNews rely on. It's the shotgun from Gears of War for college rankings.</p>

<p>the point of PA is to include everything that isn't exactly numerically measurable in a numerical score, namely faculty quality and respect in the academic community. sure you can choose a metric to measure the quality of a faculty (say nobel prize winners), but only a handful of schools have them. Sure you could say faculty X has won Y number of awards, but how much does each award count? What about percentage of faculty with awards versus absolute numbers? How come some awards are better than others, and who decides which ones are? Are research awards included, or just awards which involve undergraduate teaching? What about smaller schools where the faculty just hasn't really won any awards but still offers a solid education? Do they get 0s in faculty quality? PA attempts to tidily sum up all of that with a questionnaire to people who ARE in the know.</p>
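<p>To make those questions concrete, here is a toy sketch (every count and weight below is invented) showing how two equally defensible weighting schemes flip which faculty looks "better" - which is exactly why any single hand-built faculty metric is arbitrary:</p>

<p>
[code]
# Toy sketch: which faculty looks "better" depends entirely on
# the weights you pick. All counts and weights are invented.

faculty = {
    "Big Research U": {"size": 2000, "nobels": 5,
                       "nas_members": 100, "teaching_awards": 10},
    "Small College":  {"size": 150, "nobels": 0,
                       "nas_members": 2, "teaching_awards": 8},
}

def score(stats, weights, per_capita=False):
    """Weighted sum of award counts, optionally divided by faculty size."""
    raw = sum(stats[key] * w for key, w in weights.items())
    return raw / stats["size"] if per_capita else raw

# Two defensible-looking weighting schemes.
research_heavy = {"nobels": 10, "nas_members": 3, "teaching_awards": 1}
teaching_heavy = {"nobels": 1, "nas_members": 1, "teaching_awards": 10}

for name, stats in faculty.items():
    print(name,
          score(stats, research_heavy),                            # absolute
          round(score(stats, teaching_heavy, per_capita=True), 3)) # per capita
# Big Research U dominates on absolute research-weighted counts (360 vs 14),
# while Small College wins on per-capita teaching-weighted scores.
[/code]
</p>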

<p>if you remove PA from the rankings, i feel like you have a list of interchangeable statistics which basically tell you how hard it is to get into a school, and what percentage of students graduate on time.</p>

<p>Briantheman, I don't think it's too much to ask US News to do more work finding more comprehensive information about schools and their graduates. It's simple research. We know who the leaders in industries, Fortune 500 companies, etc. are. Many graduate schools list where their students went for undergrad. The Wall Street Journal and Business Week kind of do this already with their lists of schools that produce the most CEOs or schools that have the most students in the top graduate programs.</p>

<p>When I get a chance, I'll post the top 50 schools minus the PA.</p>

<p>I completely agree with previous posters who say that if one were to do proper rankings, the rankings should be about how well a school's graduates perform in the workforce and at top grad schools. Isn't that the #1 reason why people go to college?</p>

<p>Pat:</p>

<p>There was a thread about PA created by "Hawkette" in the not too distant past with many pages of debate. But in one post, she provided a look at the rankings with PA taken out. Maybe you could search for it and save yourself some time. </p>

<p>Barrons:</p>

<p>Is the MUP/ASU the source of the NAS info? Could you post a link please?</p>

<p>
[quote]
if you remove PA from the rankings, i feel like you have a list of interchangeable statistics which basically tell you how hard it is to get into a school, and what percentage of students graduate on time.

[/quote]
</p>

<p>Interchangeable statistics without the PA? </p>

<p>Does this mean that the entire ranking then depends on the PA, aka the opinion of a few? An opinion that seems to be based on gamesmanship, lack of direct knowledge, or a blatant disregard for what they are supposed to measure.</p>

<p>Of course, when measuring the quality of education at the undergraduate level, we SHOULD really measure the graduate school, the reputation of teachers who may not have seen the inside of a class in years, and count the Nobel Prize winners who are ghosts to undergraduates; we should overlook the 300-500 person auditoriums and the small army of Teaching Assistants, Fellows, or whatever fancy names a school tags on its indentured servants who are magically anointed with the capability of being teachers. </p>

<p>Sure!</p>