USNews Peer Assessment

<p>The other thread is so long that questions about this are buried. This is from the AP story on the 2006 rankings:</p>

<p>
[quote]
But some critics say the formula should be changed, arguing it fails to account for many aspects of educational quality.</p>

<p>More administrators appear to be protesting the rankings by declining to grade other colleges; that accounts for 25 percent of a school's ranking. The response rate has fallen from 67 percent in 2002 to 57 percent this year.</p>

<p>"No one can know for sure what is going on at another institution," said Marty O'Connell, dean of admission at McDaniel College in Maryland, who refuses to grade other schools.</p>

<p>Robert Morse, the magazine's director of data research, acknowledges that the response rate has slipped but said, "there's still a credible number of respondents per school."

[/quote]
</p>

<p>So, does anyone know:</p>

<ol>
<li><p>Who are those "peers" and what are their biases (regional? where they got their degrees? LAC vs. University affiliations? etc.)?</p></li>
<li><p>What are they asked? (quality of faculty? quality of research? quality of educational experience? where they would send their own children?)</p></li>
</ol>

<p>Three people are surveyed at each school: the President, the Provost, and the Dean of Admissions.</p>

<p>They are only asked to rate schools in their category. For example, LACs are only rated by those three officials at other LACs. National universities are rated by those three officials at other national universities.</p>

<p>They are only asked for a simple rating of each school on a 1 to 5 scale.</p>

<p>It is basically just a name-brand recognition survey with some serious geographic biases. For example, it is a joke that Pomona does not have a "peer assessment" in the same range as Swarthmore, Williams, and Amherst. The only plausible explanation is that there are so few liberal arts colleges on the West Coast that there is little direct regional peer recognition -- a geographic penalty.</p>

<p>Thanks for the info. I think Pomona is also hurt because, though it's always been a good school, its rise to excellence is more recent and I'd guess a category like peer assessment would tend to lag.</p>

<p>It seems as if other biases are also introduced by only asking administrators. For example, the President at some institutions comes from professional school faculties or even from outside academia. At some institutions, the President traditionally is an alum. And what would a Dean of Admissions typically know except which colleges his or her institution is mainly competing against for students?</p>

<p>I don't know about a recent rise. Pomona was founded just 20 years after Swarthmore. Any difference in the "rise" had more to do with the economic rise of Los Angeles vis-à-vis the "old-money" northeast corridor. In many ways, it's a pre-WWII versus post-WWII issue.</p>

<p>While the survey return rate is pretty pathetic, the real issue is who actually does the replying and why the person polled is supposed to be an "expert".</p>

<ol>
<li><p>Knowledge
Are we kidding ourselves to believe that the Dean or Provost of Grinnell knows enough about both Swarthmore and Sewanee to fill out the survey with recent knowledge?</p></li>
<li><p>Source of information
Can we not assume that the best source of information for the "experts" is simply to read last year's issue of the report? The peer assessment becomes a self-fulfilling prophecy. Nobody wants to look like a fool, so why not err on the safe side?</p></li>
<li><p>Integrity
It is a known fact that the survey has been manipulated and is marred by the most abject geographical cronyism. Would it surprise anyone that all Seven Sisters schools give one another a full five and make sure to give low grades to competing schools? How else could you explain some of the ridiculously high peer rankings?</p></li>
<li><p>Identity of the person filling out the form
How many deans or provosts delegate this exercise to an obscure secretary or an intern? Considering how valuable the time of academics truly is, one has to wonder how important the survey is.</p></li>
</ol>

<p>There IS an easy solution to all of this: make the survey public on a website. This way the information would be easily verifiable by ALL. After all, schools should not be afraid of having their "opinions" scrutinized and verified for accuracy and integrity ... unless the data is better kept secret for reasons that are not hard to guess.</p>

<p>
[quote]
They are only asked to rate schools in their category. For example, LACs are only rated by those three officials at other LACs. National universities are rated by those three officials at other national universities.
[/quote]
</p>

<p>I could be wrong but I thought it was even more specific than that. They are only asked to rate other schools in the same category within the same region of the country. In other words, there aren't hundreds of peers rating Harvard, just peers in the northeast.</p>

<p>As far as rising in excellence, one has to wonder about the inner workings of the US News formulae:</p>

<p>While I understand that graduation rates are important, let's look at how USN uses them as a double whammy with the expected graduation rate. For instance, the expected graduation rate of Harvey Mudd is 99% while Swarthmore's and Pomona's are at 96%. Does USNews not see how ridiculous it is to expect one of the toughest engineering schools in the country to have a 99% graduation rate? This results in a penalty of MINUS 16! Caltech, the school most comparable to Mudd, gets an expected graduation rate of 90%. Here we may assume that USN paid attention to Gerhard Casper's letter from Stanford, but only changed the demand placed upon Caltech. </p>

<p>The way I see it, USN found the most insidious way to penalize the schools that are most selective and to protect the desired status quo. Lower selectivity at a favored school? No problem: slap on a lower expected rate and it will result in a few bonus points. Wellesley gets an easily earned PLUS 4 in that category, and that mitigates the impact of its lower selectivity. Swarthmore gets a MINUS 5 and Pomona a MINUS 6. If you want to track the reasons behind the drop in the "quality points" of Pomona and Swarthmore, look no further. In this case, it would be to the greatest benefit of the school to underreport its SAT numbers. Come to think of it, Middlebury might be onto something here, as they dropped their reported SATs for 2009 from an average of 1440 to ... 1315. That should earn them some solid bonus points in the future! </p>

<p>USN is moving rapidly from being misleading to being blatantly dishonest.</p>

<p>Can you remind me of the source of this rate?</p>

<p>Graduation rate performance </p>

<p>This indicator of "added value" shows the effect of the college's programs and policies on the graduation rate of students after controlling for spending and student aptitude. We measure the difference between a school's six-year graduation rate and the predicted rate for the class.</p>
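<p>So the indicator itself is just a subtraction: actual six-year rate minus predicted rate. A quick sketch in Python (the predicted rates are the ones quoted earlier in the thread; the actual rates are simply back-solved from the MINUS 16 / MINUS 5 / MINUS 6 figures, so treat them as illustrative):</p>

[code]
# Graduation rate performance: actual six-year rate minus the rate
# US News predicts after controlling for spending and student aptitude.
def grad_rate_performance(actual, predicted):
    """Return the "added value" indicator in percentage points."""
    return actual - predicted

# Predicted rates quoted in this thread; actual rates back-solved from
# the penalties mentioned above, so they are illustrative only.
schools = {
    "Harvey Mudd": (83, 99),   # -16
    "Swarthmore":  (91, 96),   # -5
    "Pomona":      (90, 96),   # -6
}

for name, (actual, predicted) in schools.items():
    print(name, grad_rate_performance(actual, predicted))
[/code]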

<p>Nope, I was wrong: it's only similar schools, not divided locally. Here's the info from the US News methodology explanation:</p>

<p>Peer assessment (weighted by 25 percent). The U.S. News ranking formula gives greatest weight to the opinions of those in a position to judge a school's academic excellence. The peer assessment survey allows the top academics we contact--presidents, provosts, and deans of admission--to account for intangibles such as faculty dedication to teaching. Each individual is asked to rate peer schools' academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who don't know enough about a school to evaluate it fairly are asked to mark "don't know." Synovate, an opinion-research firm based near Chicago, collected the data; 60 percent of the 4,095 people who were sent questionnaires responded.</p>
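<p>In other words, the peer score is just an average of 1-to-5 ballots. Here's a minimal Python sketch of how I assume it works (dropping the "don't know" responses from the average is my guess; US News doesn't spell that out):</p>

[code]
# Peer assessment: average the 1-5 ratings a school receives from
# presidents, provosts, and deans of admission. "Don't know" ballots
# are simply excluded here -- an assumption, not USN's stated rule.
def peer_assessment_score(ballots):
    numeric = [b for b in ballots if b != "don't know"]
    if not numeric:
        return None  # nobody felt able to rate the school
    return round(sum(numeric) / len(numeric), 1)

# Hypothetical ballots for one school.
ballots = [5, 4, "don't know", 4, 5, 3, "don't know", 4]
print(peer_assessment_score(ballots))  # 4.2 -- then weighted 25% in the overall formula
[/code]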

<p>By the way, I posted this link before, but it's an excellent breakdown of the validity of each section of the rankings and worth a look: <a href="http://www.johnlocke.org/acrobat/pope_articles/collegeranking-inquiry17.pdf">http://www.johnlocke.org/acrobat/pope_articles/collegeranking-inquiry17.pdf</a>.
I particularly like the analogy below:</p>

<p>"Suppose that we wanted to know which cars are the safest, and which the least safe. The way to approach the problem would be to perform tests directly on a sample of each vehicle to see how well they stand up to crashes.But what if such direct testing was not allowed or was not feasible. So, we devise an alternate safety evaluation system that rans vehicles based on:</p>

<ol>
<li><p>A questionnaire sent to three executives at each auto manufacturer, asking them to give numerical ratings to each car made in accordance with their views on the car's reputation for safety.</p></li>
<li><p>The selectivity of the manufacturer in its hiring of employees</p></li>
<li><p>Employee satisfaction</p></li>
<li><p>How much it costs to build the vehicle</p></li>
<li><p>The percentage of return customers</p></li>
<li><p>The percentage of management holding MBAs or other advanced degrees. </p></li>
</ol>

<p>Would the calculations from that system enable us to say with any confidence that the top ranked vehicles were really the safest and the lowest ranked were really the least safe? "</p>

<p>Carolyn, I think you have summed up the rankings. I really question the accuracy of the ranking. How can we quantify the whole college experience into one number? I can't possibly imagine how UPenn can be ranked ahead of Stanford.</p>

<p>I'm pretty sure that "expected rate" is based--at least in part--on research done by Astin at UCLA. I don't know if they used his data or copied the methodology, but it is essentially a figure where they throw some inputs into a regression to see their relationship to graduation rates. I don't know if the original prediction used student-level data or institution-level data. I can't even recall if I read the original study! My brain, she is a tired thing.</p>
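<p>If it is that kind of regression, the mechanics would look roughly like this; a minimal sketch assuming institution-level data and ordinary least squares (the inputs and numbers are made up for illustration; I'm not claiming these are the variables USN or Astin actually used):</p>

[code]
import numpy as np

# Hypothetical institution-level data: median SAT, spending per student
# (in $1,000s), and observed six-year graduation rate (percent).
sat      = np.array([1490, 1460, 1450, 1380, 1300, 1210], dtype=float)
spending = np.array([  75,   80,   65,   55,   40,   35], dtype=float)
grad     = np.array([  92,   91,   90,   85,   78,   70], dtype=float)

# Ordinary least squares: grad ~ b0 + b1*SAT + b2*spending
X = np.column_stack([np.ones_like(sat), sat, spending])
coef, *_ = np.linalg.lstsq(X, grad, rcond=None)

# "Expected" graduation rate for a school with SAT 1470 and spending 60.
expected = coef @ np.array([1.0, 1470.0, 60.0])
print(round(float(expected), 1))
[/code]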

<p>Xiggi:</p>

<p>USNEWS should just be satisfied with a "single-whammy" and count the graduation rate in their formula.</p>

<p>To then add a second whammy of performance versus "expected grad rate" is ridiculous. What it really does is penalize schools for being demanding academically. The shame of it is that the schools penalized by this measure still have among the highest grad rates in the country. </p>
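<p>To put the double whammy in plain arithmetic: the same actual graduation rate is counted once directly and then a second time through the over/under-performance term, so two schools with identical actual rates can end up with different point totals purely because of their "expected" rates. A rough sketch in Python (the weights are placeholders, not USN's published ones):</p>

[code]
# Two schools with the SAME actual graduation rate but different
# "expected" rates. The weights are placeholders, not USN's real ones.
W_GRAD_RATE   = 0.16   # hypothetical weight on the raw graduation rate
W_PERFORMANCE = 0.05   # hypothetical weight on (actual - expected)

def grad_points(actual, expected):
    """Points contributed by the graduation-related measures."""
    return W_GRAD_RATE * actual + W_PERFORMANCE * (actual - expected)

print(round(grad_points(actual=91, expected=99), 2))  # demanding school: 14.16
print(round(grad_points(actual=91, expected=85), 2))  # favored school:   14.86
[/code]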

<p>IMO, if you graduate 97% of your incoming freshmen, you need to make the academics harder.</p>

<p>
[quote]
What it really does is penalize schools for being demanding academically.

[/quote]
</p>

<p>That's true of the colleges at the end of the rankings that these boards focus on.</p>

<p>It is not true for colleges which are less selective. Consider a school which recruits a student body comprised of students who, in the general population, tend to graduate at a 50% rate. But this school gets 70% of them out the door with a degree, because it knows how to make students maximize their potential, because it does great advising, because it offers good financial support, and so on. The college is doing something right--maybe even something extraordinary. The differential recognizes that.</p>

<p>One could argue that the kinds of motivated, bright students who get into top colleges would likely succeed (and graduate) anywhere. Expected grad rates certainly suggest that. These institutions simply can't be rewarded much in this formula, because the expected graduation rate is already so high. Their only comfort is that they're all approximately on the same page on that. It's true some of them get punished more than others. Any which have an environment more intense than new students expected are going to get dinged for it. On the other hand, so are colleges which don't care about burning out students or who practice crap policies like bait-and-switch aid. A college like Swarthmore that spends so much money per student (which is apparently in the formula) is especially vulnerable.</p>

<p>Hoedown~</p>

<p>I am not sure if USN borrowed much from Astin's research, but their model for expected graduation rate needs a little tweaking. </p>

<p>If you have access to the online rankings, just click on the columns for selectivity and then on the over/under performance for graduation. Harvey Mudd is ranked number one in the first, and, if it were not for Reed and another school that are blanked out, guess which school is dead last: Harvey Mudd. </p>

<p>Since both elements are supposed to be measurements of quality, you would expect someone at USN to check the model for such a large discrepancy. I understand that no model will ever be perfect, but one has to wonder how this particular criterion was developed and how much thought was invested in its application. From an outsider's point of view, the actual model seems to track the SAT scores pretty well. As we noted before, schools with higher scores seem to earn penalties or bonus points that are a bit "suspect".</p>

<p>"IMO, if you graduate 97% of your incoming freshmen, you need to make the academics harder."</p>

<p>I agree 100% with Interesteddad. At a minimum, USN should introduce a level that represents a theoretical maximum. Just as full employment is not the same as zero unemployment, I believe that a school that is able to graduate 100% of its students ought to be an oddity. I would think that an attrition rate of 10% over four years should be considered a remarkable number for selective schools. </p>

<p>On a side note, I would love to read the comments of Harvey Mudd's students and faculty about being expected to hit a 99% graduation rate.</p>

<p>Xiggi:</p>

<p>Particularly because the published graduation rates do not include students who transfer INTO a college. Let's say, just for kicks, that we have two students -- one a perfect Pomona student and one a perfect Claremont-McKenna fit.</p>

<p>Our two students make a mistake and end up at the wrong schools: Ms. Pomona enrolls at CMC and Mr. CMC enrolls at Pomona. Both realize they chose the wrong school and successfully transfer to their proper schools at the end of freshman year. Both live happily ever after. </p>

<p>But, both schools get "dinged" for a graduation rate "failure".</p>

<p>Since transfers are a fact of life at virtually every school, how could a 100% graduation rate be a rational target? It would actually be kind of weird to have a 100% graduation rate.</p>

<p>BTW, of the 104 LACs in the USNEWS first tier, only 3 of them hit their "predicted graduation rate" on the number and only 22 more came within 1% in either direction.</p>

<p>Seems to me that the "model" needs a little work!</p>

<p>On the lighter side - I suppose the President of Pomona was between a rock and a hard place when filling out his peer assessment form. He sent his daughter to Carleton ('08).</p>