<p>
Alex, myself, bclinton, that provost at Wisconsin…</p>
<p>I say it’s pretty accurate for academic programs at research universities. I understand your frustrations with the PA for LACs, which IMO, are harder to measure.</p>
<p>
Ummm, I think most provosts and academics can identify in less than 10-15 seconds which research universities have distinguished, strong, or marginal academic program reputations, or whether they “don’t know ****” about them. This isn’t rocket science…It’s an opinion on a rating of 1-5!</p>
<p>Quick! How is my post? On a scale of 1-5? :)</p>
<p>The one with the 260 adequate and did not like Arizona? </p>
<p>Look at how different the “justifications” are. According to Clinton, the “good” provost has reams of data and spends countless hours on such an exercise. However, you seem to be happy when someone spends 2-3 seconds per little bubble. Can you even read the name of the school being appraised in 2 seconds?</p>
<p>My secretary looked up how I ranked your post last year. And she marked “strong” again, before returning to the filing of papers and nails!</p>
<p>
No, the new Provost tsdad mentioned in Post #155…the one who said he wasn’t “okay” with the survey responses.</p>
<p>Also, he didn’t like Arizona State…not Arizona. </p>
<p>
Um, where did I ever say 2-3 seconds? I said less than 10-15 seconds.</p>
<p>Still obsessing…LOL. Whatever, people!</p>
<p>^ It’s not obsessing over the rankings per se…it’s like arguing politics with your family over Thanksgiving dinner.</p>
<p>And Xiggi makes everyone’s thanksgiving invite as part of the universal family.</p>
<p>From 2000 to 2008, the President of Wisconsin was John Wiley. I actually knew him personally. He was a well-respected professor in the materials science/physics/nuclear engineering departments when I was there as a graduate student. He might not know all 260 schools, but he could easily pick the top 30 schools in 2 minutes.</p>
<p>Has anyone brought up the proportion scores? Is it all about the ranking? For instance, look at the change in the top 7 publics. </p>
<p>2010 Data<br>
School / Rank / Score<br>
UNC 30 70<br>
UVa 25 73<br>
Michigan 29 71<br>
Berkeley 22 76<br>
UCLA 25 73<br>
W&M 31 67<br>
GTech 35 63</p>
<p>2011 Data<br>
School / Rank / Score<br>
UNC 29 74<br>
UVa 25 76<br>
Michigan 28 75<br>
Berkeley 21 79<br>
UCLA 25 76<br>
W&M 33 69<br>
GTech 36 67</p>
<p>Almost all of these schools’ scores improved relative to the number one ranked school (score =100). UNC improved 4 pts, UVa 3 pts, Michigan 4 pts, Berkeley 3 pts, UCLA 3 pts, WM 2 pts, GTech 4 pts.</p>
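<p>For what it’s worth, those year-over-year gains are easy to double-check with a few lines of code; the only inputs below are the two score tables quoted above (the #1 school is fixed at 100 each year):</p>

```python
# 2010 and 2011 USNWR composite scores for the top publics,
# copied from the tables quoted above (score of 100 = #1 school).
scores_2010 = {"UNC": 70, "UVa": 73, "Michigan": 71, "Berkeley": 76,
               "UCLA": 73, "W&M": 67, "GTech": 63}
scores_2011 = {"UNC": 74, "UVa": 76, "Michigan": 75, "Berkeley": 79,
               "UCLA": 76, "W&M": 69, "GTech": 67}

# Year-over-year change relative to the top-ranked school's fixed 100.
deltas = {s: scores_2011[s] - scores_2010[s] for s in scores_2010}
print(deltas)  # every school gains 2 to 4 points
```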
<p>Also, keep in mind that UCLA and UVa are only 1 pt ahead of Michigan, with UNC just 1 pt behind UM as well. The gap in the USNWR rankings between these schools (judging by the score) is minuscule. </p>
<p>Sure, the ranking is important because it’s the most visible value assigned to each school, but the composite score underneath the rankings seems to suggest 1) the top publics improved in relation to Harvard/Princeton (#1), and 2) there is little difference between them according to the USNWR metric, right?</p>
<p>^ Thanks for posting. The top publics are all very close and great choices for undergrad. Difference is when you start adding factors like graduate/professional schools and faculty reputation that a couple schools seem to break away from the pack.</p>
<p>Oh definitely, I’m not saying that the quality of these schools is being perfectly captured by USNWR. Michigan and Berkeley’s graduate reputation, for instance, isn’t emphasized by the report. USNWR undergraduate ranking ≠ actual quality of the institution in a lot of respects.</p>
<p>I’m just saying that within the confines of the USNWR ranking system – the actual ‘ranking’ doesn’t create a comprehensive picture. </p>
<p>Edit: And large differences (in USNWR quality) suggested by the rankings may not actually exist.</p>
<p>UCB, I do not have more frustrations with the PA of the LACs. The nuttiness of the scores for a number of outliers is simply easier to spot, as is the geographical and historical cronyism behind them. Otherwise, my issues with the PA are similar for both universities and LACs. In the current format, both are pure garbage. </p>
<p>At the risk of repeating the same argument ad nauseam, I have often written that I am NOT opposed to a survey that polls people who are supposed to know. I have spoken in favor of expanding the PA, or even using it as the sole source for a reputational ranking. What I have been opposed to is the misrepresentation of its impact, its scope, and the validity of the CURRENT format. </p>
<p>For a survey to have a modicum of validity, it should be clearly defined and go well beyond the overly simplistic question that is posed today. How hard would it be to create a few columns for each school? Currently, we have an incredibly lengthy and detailed CDS form, but when it comes to the opinion we are supposed to value the most (as the methodology indicates), we are expected to be satisfied with a mere 1-5 system with ONE open-ended question. </p>
<p>Further, the survey should be transparent and public. Is there ANY reason why the people who are trusted with such important positions would hesitate to let the public know how they rank their competition? Why do public officials in the world of education have the right to cast ballots that are anonymous to the public? If such a right exists, it should be punted immediately. </p>
<p>Anyway, it makes no sense to discuss this further. Nothing will ever change. People who support the PA do it ONLY because their favorite schools do “well on it.” If Cal and UCLA were granted the same PA, there would be plenty of “opposition.” It does not get simpler than that. On the other hand, many of the schools I “favor” do exceptionally well on the PA. </p>
<p>My bottom line is that the USNews rankings reward the cheats and the crooks. There used to be good reasons to overlook the minor issues of integrity. Those have slowly but surely eroded. There used to be good reasons to spend 15 to 25 dollars on the annual edition. And as if it were not enough that the geniuses who work for Morse ruined the presentation, they now add insult to injury by giving it away! Shows how much the stuff is worth! </p>
<p>I would be surprised if the rankings survive Obama’s stay in the White House. Although I am clearly hedging my bet here as I wonder how much they will be missed.</p>
<p>PS I did get my refund after three emails and a phone call.</p>
<p>“Difference is when you start adding factors like graduate/professional schools and faculty reputation that a couple schools seem to break away from the pack.”</p>
<p>…and of course we both know which two schools those are. :-)</p>
<p>CMC? and Smith? :D</p>
<p>
ARWU to replace? Nice! Yep, everything’s being manufactured in China these days.</p>
<p>It seems like… the premium version is free for the next two hours because Google has sponsored it?</p>
<p>I’d like an explanation for this arbitrary number (Top 13?) and for the exclusion of schools like Northwestern and Johns Hopkins.</p>
<p>^^^Dunno. You have to go to #16 to include all 8 Ivies.</p>
<p>
Michael Lewis isn’t even a top 25 author, so how would he be qualified to write about this?</p>
<p>It’s called LSP, but yeah, now they are including it with the total #'s. Really wish NYU gave out stats for individual schools though, lol.</p>