<p>Though I've posted this on previous threads discussing the Peer Assessment, I think it bears repeating. In sum, the problems with the Peer Assessment are at least twofold:</p>
<p>1) An inherent bias with such a survey
2) The impossibility of accurately "grading" every university out there</p>
<p>The closest analogy I've drawn in the past is the NCAA College Football Coaches' Poll, which suffers from similar weaknesses (here are some of my previous posts):</p>
<p>
[quote]
I don't think anyone is questioning the intelligence (or resumes) of those who vote in the peer score survey.</p>
<p>I think a relevant analogy is the NCAA Football Coaches' Poll for the BCS Championship. Each coach casts a vote, and the poll is a critical component of the BCS rankings (it isn't the entire BCS ranking, but it carries real weight, much like the peer score does in the USNWR ranking).</p>
<p>Now, there is an inherent bias in this poll. Coaches have their own agendas when they vote (whether it be to boost their own strength of schedule or to boost members of their own conference). That is why, at the end of this year's football season, the OSU football coach declined to vote in the final Coaches' Poll: he felt there was a direct "conflict of interest" (i.e. voting between Michigan vs. Florida), which obviously had direct National Championship implications (i.e. it's a lose-lose for him; if he votes for Michigan, he gets criticized, and if he votes for Florida, he gets criticized). Further, one of the other criticisms of this poll is that no active D1-A head coach is going to find the time to watch and analyze every Top 25 team in the country --> in point of fact, they are rarely looking at anything but film on the upcoming opponent (e.g. even Michigan's coach declined to comment on Florida's team because he just "hasn't seen them play") --> and yet these coaches are asked to rank the Top 25 every week.</p>
<p>The point? No one will argue that these coaches understand the game inside and out, better than the average person ever will. Hundreds of hours of experience and film. But so what? That doesn't mean these coaches won't be affected by personal / professional bias --> they are rational people and will vote in a manner that best benefits them. Period.</p>
<p>So in much the same way, the folks who vote in the peer score will vote with their own personal bias. There is no escaping the inherent bias embedded in such "polls" (be it the BCS ranking or the peer score ranking) --> each person will vote in a manner that best benefits them. Who cares if the person voting has a resume a mile long? It still doesn't give me any comfort as to why their opinion should matter on the relative merits of a Dartmouth vs. a University of Wisconsin. What makes the PA even worse is that there is ABSOLUTELY NO transparency.
[/quote]
</p>
<p>
[quote]
And this is the fundamental problem I have with the peer assessment. It's not a knock against the intelligence or experience of those participating; simply put:</p>
<p>1) It's not their job to know the differences between hundreds of colleges
2) Even if it was their job, there would be an inherent bias anyway</p>
<p>Furthermore, as mentioned before, the other fundamental problem with the Peer Score is its lack of transparency:</p>
<p>3) Who are the people actually voting? Why don't they disclose who they are, and more importantly,
4) How they voted?
5) Why don't they make these peer rankings public, i.e. who ranked the colleges and how they ranked them? (I have a strong suspicion that if these votes/rankings were made public, with each voter's name attached, they would either decline to be involved or the outcome would be different.)
6) Since there is no transparency, this is the ultimate "X" / "fudge" factor --> adding / subtracting a couple of tenths of a point here and there until you get the list you like (i.e. ensuring not only some variance year-over-year, but that you are effectively in control of that variance).
[/quote]
</p>