<p>I read the article by Gerhard Casper. He doesn't know what he is talking about. This is a statement of fact, not opinion. He is just another squirming worm administrator trying to avoid accountability. He didn't do his homework and I give him an "F".</p>
<p>The US News rankings were the best thing to happen to consumers of higher education since universities stopped burning heretics at the stake.</p>
<p>Take the concept of "value added," for example. US News runs a calculation that predicts what a school's graduation rate should be based on SAT scores, expenditures per student, and (I think) high school rank. It is a sophisticated and accurate formula based on a statistical technique called multiple regression, with separate formulas for public and private universities. By comparing a school's actual graduation rate with its predicted graduation rate, you get a measure of how well a university does with the caliber of student it admits. This is an ingenious way to assess "value added". Expenditures per student have a negative weight in the formula, I believe. Expenditures are probably a proxy for how "techy" the school is. Tech schools have higher expenditures per student, and they also have harder curricula (e.g., engineering), so their graduation rates tend to be lower. I also think expenditures increase as the proportion of graduate and professional education increases. So expenditures per student are a way to account for curriculum difficulty and for an emphasis on research rather than on teaching undergraduates. This is the way to think about "value added".</p>
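<p>To make the idea concrete, here is a minimal sketch of a "value added" calculation in the spirit of what is described above. The actual US News regression formula and its coefficients are not public, so the predictor variables, weights, and numbers below are entirely hypothetical; the point is only the mechanics: predict a graduation rate from inputs, then subtract the prediction from the actual rate.</p>
<pre><code># Illustrative "value added" sketch. The coefficients below are made up;
# US News's real regression weights are not published.

def predict_grad_rate(sat, spend, hs_rank):
    """Hypothetical regression prediction of graduation rate (0..1).

    sat     -- average SAT score of admitted students
    spend   -- expenditures per student, in dollars
    hs_rank -- fraction of the class in the top 10% of their high school
    """
    intercept = 0.20
    b_sat = 0.0004        # per SAT point
    b_spend = -0.000001   # NEGATIVE weight on expenditures, as noted above
    b_rank = 0.15
    return intercept + b_sat * sat + b_spend * spend + b_rank * hs_rank

# "Value added" = actual graduation rate minus predicted graduation rate.
actual = 0.85
predicted = predict_grad_rate(sat=1450, spend=60000, hs_rank=0.90)
value_added = actual - predicted

print(round(predicted, 3))     # 0.855 with these made-up weights
print(round(value_added, 3))   # slightly negative: school underperforms its prediction
</code></pre>
<p>A positive result means the school graduates more students than its inputs predict; a negative result means it does worse with the caliber of student it admits.</p>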
<p>Years ago, Caltech had an 85% graduation rate; now it is higher. Caltech has the smartest student body in the world. No way should 15% have dropped out. They dropped out because the quality of undergraduate education at Caltech sucked. US News detected this.</p>
<p>Why do the rankings change from one year to the next? One reason is that US News improves its formulas. Another is that schools change the way they do their calculations to make themselves look better. Perhaps US News should be more specific about how to do the calculations. Sometimes universities make mistakes in reporting data and then make corrections. This is just the way it is when you deal with data. The changes in rankings from year to year do not diminish the overall accuracy and value of the rankings. Gerhard Casper was deliberately trying to undermine confidence in the data and the method, like the good politician he is.</p>
<p>Yes, there is a difference between #1 and #2. Fine distinctions can be made mathematically. The distinctions may be small, but they are incrementally real. The question is: at what point can you "feel" the difference?</p>
<p>The Ivies are superior to non-Ivies in ways that can make a real difference. It is a matter of degree, of course. The Ivies offer uniformly excellent faculty, fellow students, a great culture and climate, exciting intellectual atmosphere, a high level of instruction and discussion, and so on. The prestige factor in the Ivies is based on underlying quality. It is deserved, not imaginary.</p>
<p>The public Ivies belong where they are in the rankings. For one thing, they are not as selective as the Ivies and top non-Ivies. Student quality at the top publics is inconsistent. I also think the emphasis on spectator sports at top publics is not a good thing, although this is not part of the rankings. The climate at publics is not as good as at the Ivies or top non-Ivies. Big difference in culture.</p>