<p>USNews provides a seemingly comprehensive table of numbers. The objective is obvious: pretend that this is a scientific experiment based on a scientific poll. Nothing could be further from the truth. </p>
<p>First, there is little guarantee that the forms are filled out with integrity, or filled out at all. A very generous estimate is that only 60% of the forms ever get returned, and that does not begin to address the partial replies. At the very least, the common data forms have to contain the required information. </p>
<p>Then, one has to wonder about the various percentages that compose the final scores. I do not know about most people who read the rankings, but I find it strange how little weight is given to the acceptance rate (1.5% of the total) or the student-to-faculty ratio (1%). I guess not too many students are interested in ascertaining their chances at admission, or in finding out how many teachers will be teaching at the school! Obviously, graduation rates and performance have to be considered more important, by a 6-to-1 ratio. </p>
<p>I also assume that information such as acceptance trends has little merit in evaluating schools. A school could double (or halve) its acceptance rate and it would not make much difference in the yearly statistics. That may explain why schools with acceptance rates well above 30%, not to mention well above 50%, maintain very high rankings. Simply stated, Wellesley could accept 100% of its applicants, and it would not change its ranking. Actually, by lowering its average SAT scores, it would lower its expected graduation rate, and thereby boost its ranking.</p>
<p>As far as the main "ingredient" that is the peer assessment, we'll have to have faith that the Dean of Juniata or an obscure secretary at Transylvania University really knows anything about the other schools. </p>
<p>PS The list of the weighted criteria:</p>
<ul>
<li>Peer assessment: 25%</li>
<li>Average graduation rate: 16%</li>
<li>Financial resources: 10%</li>
<li>SAT scores: 7.5%</li>
<li>Faculty compensation: 7%</li>
<li>Class size 1-9: 6%</li>
<li>HS top 10%: 6%</li>
<li>Alumni giving: 5%</li>
<li>Graduation rate performance: 5%</li>
<li>Average freshman retention: 4%</li>
<li>Faculty degrees: 3%</li>
<li>Class size 50+: 2%</li>
<li>Acceptance rate: 1.5%</li>
<li>Percent full-time: 1%</li>
<li>Student/faculty ratio: 1%</li>
</ul>
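<p>The scheme above amounts to a simple weighted sum. As a sanity check, here is a minimal Python sketch of how such a composite score would be computed; the weights come from the list above, and the idea that each sub-score is normalized to a 0-100 scale is an assumption for illustration, not the published USNews formula:</p>
<pre>
```python
# Weights from the published criteria list; they sum to 100%.
WEIGHTS = {
    "peer_assessment": 25.0,
    "avg_graduation_rate": 16.0,
    "financial_resources": 10.0,
    "sat_scores": 7.5,
    "faculty_compensation": 7.0,
    "class_size_1_9": 6.0,
    "hs_top_10pct": 6.0,
    "alumni_giving": 5.0,
    "graduation_rate_performance": 5.0,
    "avg_freshman_retention": 4.0,
    "faculty_degrees": 3.0,
    "class_size_50plus": 2.0,
    "acceptance_rate": 1.5,
    "percent_full_time": 1.0,
    "student_faculty_ratio": 1.0,
}

def composite_score(subscores):
    """Weighted average of sub-scores, each assumed normalized to 0-100."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS) / 100.0
```
</pre>
<p>Note the imbalance the weights encode: the two graduation criteria together carry 21 points, while the acceptance rate carries only 1.5.</p>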
<p>For entertainment purposes, pick two schools: Wellesley and Harvey Mudd. Run down the numbers and columns and stop each time one seems out of order. After pausing at most of the Wellesley entries, you do not have to go much farther to see the HUGE penalty given to Mudd for its graduation performance. What did Mudd do that is so wrong? First, USNEWS assigns it an expected graduation rate of 97%. Do you see ANY other school in the country, including HYPS, with such a lofty standard? Why does Mudd have such a high number? Because its entering class has the highest SAT scores. So what does USNEWS do? It PRETENDS to value Mudd's selectivity rank, but then quickly penalizes the school for its "failing" graduation rate. </p>
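<p>The mechanism above is just a gap between actual and predicted rates. A toy illustration of the arithmetic follows; the 97% expectation is the figure cited above, while the actual rates are hypothetical numbers chosen only to show the asymmetry:</p>
<pre>
```python
def graduation_performance(actual_rate, expected_rate):
    """Actual minus predicted graduation rate, in percentage points.
    Negative values read as a penalty in the rankings."""
    return actual_rate - expected_rate

# A Mudd-like school held to a 97% expectation that graduates 90%
# shows up 7 points "worse" ...
high_bar = graduation_performance(90, 97)

# ... than a school held to a 75% expectation that graduates 80%,
# even though the first school graduates a larger share of its class.
low_bar = graduation_performance(80, 75)
```
</pre>
<p>In other words, the lower the expectation, the easier the "overperformance," which is exactly the perverse incentive described below.</p>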
<p>It is much easier to report low SAT scores and a huge acceptance rate, and earn lower expectations. If that does not work, throw in a healthy dose of grade inflation, fluff classes, unsupervised exams, and enough failsafe measures to ensure a high graduation rate. Add that to the highly suspect peer assessment, and, with the complicity and duplicity of USNEWS, you are golden: another inflated ranking.</p>