<p>^^^^</p>
<p>very well said</p>
<p>I ditto Senior 0991’s earlier post. I am a Stanford student, and I will probably stay 5 years to get my Master’s (i.e., coterm). This seems pretty common, and many people I know here are doing the same thing. I’d guess that’s one of the most influential factors in the 4-year graduation rate, but rather than reflecting poorly on my school I think it highlights the ambitious nature of my fellow students :)</p>
<p>edit: I’m obviously biased and not generally one for rankings/lists/etc, but I too was considerably surprised to see Columbia ranked above Stanford.</p>
<p>xiggi,</p>
<p>You’re right. The index score for Stanford looks like it was miscalculated. Does USNWR have anything in their methodology on how it’s calculated? It looked like the USNWR link to the methodology wasn’t working.</p>
<p>See <a href="http://www.usnews.com/education/best-colleges/articles/2010/08/17/methodology-undergraduate-ranking-criteria-and-weights-2011.html">Methodology: Undergraduate Ranking Criteria and Weights - US News and World Report</a></p>
<p>Undergraduate Academic Reputation
Total: 22.5%, divided into
Peer assessment survey 66.7%<br>
High school counselors’ rating 33.3%</p>
<p>And</p>
<p>Undergraduate academic reputation (weighting: 22.5 percent for National Universities and National Liberal Arts Colleges; 25 percent for Regional Universities and Regional Colleges). The U.S. News ranking formula gives significant weight to the opinions of those in a position to judge a school’s undergraduate academic excellence. The academic peer assessment survey allows top academics (presidents, provosts, and deans of admissions) to account for intangibles at peer institutions such as faculty dedication to teaching. For their views on the National Universities and the National Liberal Arts Colleges, we also surveyed 1,787 counselors at public high schools from nearly every state plus the District of Columbia that appeared in the 2010 U.S. News Best High Schools rankings.</p>
<p>Each academic and high school counselor was asked to rate schools’ academic programs on a 5-point scale from 1 (marginal) to 5 (distinguished). Those who don’t know enough about a school to evaluate it fairly were asked to mark “don’t know.” The score used in the rankings is the average score of those who rated the school on the 5-point scale; “don’t knows” are not counted as part of the average. In the case of the National Universities and National Liberal Arts Colleges, the academic peer assessment accounts for 15 percentage points of the weighting, and 7.5 points go to the counselors’ ratings. Both regional rankings rely on peer assessment alone. In order to reduce the impact of strategic voting by respondents, we eliminated the two highest and two lowest scores each school received before calculating the average score. Synovate, a Chicago-based global opinion research firm, collected the data in spring 2010; of the 4,273 academics who were sent questionnaires, 48 percent responded, unchanged from last year. The counselors’ response rate was 21 percent.</p>
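<p>To make the arithmetic concrete, here’s a rough Python sketch of how I read that passage. This is only my interpretation of the published description, not USNWR’s actual code; the survey responses are invented, and the way the 1-5 averages map into the 15/7.5-point weights is an assumption on my part.</p>
<pre>
def trimmed_average(ratings):
    # Average of 1-5 ratings after dropping the two highest and two lowest;
    # "don't know" responses are assumed to have been excluded already.
    ordered = sorted(ratings)
    kept = ordered[2:-2] if len(ordered) > 4 else ordered
    return sum(kept) / len(kept)

# Hypothetical raw survey responses for one school (made-up numbers)
peer_ratings      = [5, 5, 5, 4, 4, 4, 3, 5, 4, 2]   # presidents, provosts, deans
counselor_ratings = [5, 4, 5, 5, 4, 3, 4, 5]          # HS counselors

peer_score      = trimmed_average(peer_ratings)       # the published "PA" number
counselor_score = trimmed_average(counselor_ratings)

# National Universities: reputation is 22.5% of the index, split 2/3 peer
# (15 points) and 1/3 counselor (7.5 points). I'm assuming the 1-5 average
# is scaled linearly into those points; the article doesn't say exactly how.
reputation_points = 15.0 * (peer_score / 5) + 7.5 * (counselor_score / 5)
print(round(peer_score, 1), round(counselor_score, 1), round(reputation_points, 2))
</pre>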
<p>^ Thanks. So, has USNWR corrected Stanford’s academic number in their on-line edition? If the wrong number was used in the calculation, it would be a pretty big error.</p>
<p>Perhaps USNWR is ****ed off at Stanford’s past public grievances against the rankings. </p>
<p>Interesting that USNWR now says explicitly they throw out the two highest and lowest scores. Before they weren’t as explicit in saying how they dealt with strategic voting.</p>
<p>UCB, I am sure they got over Stanford’s complaints about the poor treatment of a few publics in the ratings. </p>
<p>Regarding throwing out the highest and lowest scores, how does that work if someone places only 5s and 1s on the survey? A null and void result? If they throw out the highest scores, how can we have 5s? How do they decide which one IS the highest score among schools that have 5s?</p>
<p>My interpretation is that USNWR received 2,051 responses to its survey. Synovate plugs in all the raw results and then throws out two of the top scores (likely 5s) and two of the bottom scores for each school. So, IF all 2,051 survey responses rated all colleges, each college’s PA score would be based on 2,047 responses.</p>
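<p>A toy example of that interpretation (again, just my guess at the mechanics, and the distribution of ratings is invented):</p>
<pre>
# Toy illustration of the trim described above (my assumption, not USNWR's code).
responses = [5] * 400 + [4] * 900 + [3] * 600 + [2] * 120 + [1] * 31  # 2,051 made-up ratings
ordered = sorted(responses)
kept = ordered[2:-2]                 # drop the two lowest and the two highest
print(len(kept))                     # 2047 responses actually averaged
print(round(sum(kept) / len(kept), 2))
# If the top of the list is a run of tied 5s, "the two highest" are simply two
# of those 5s -- which two get dropped makes no difference to the average.
</pre>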
<p>Did they? Continuing to “punish” the top publics and the complainer seems like a plausible explanation. Conspiracy theory? ;)</p>
<p>Can someone post Vandy’s, Rice’s, ND’s, and Emory’s Peer Assessment scores?</p>
<p>Am I the only one predicting a response along the lines of “We apologize for the typo in the academic reputation score; it is indeed 98, but the overall ranking is still unaffected by this change”?</p>
<p>And Cornell!</p>
<p>Add me to that list</p>
<p>I would like to see USNews consider the rate at which accounting graduates pass the CPA exam.</p>
<p>US News top 30 PA scores
Harvard 4.9
Princeton 4.9
Yale 4.8
Columbia 4.6
Stanford 4.9
Penn 4.5
Caltech 4.6
MIT 4.9
Dartmouth 4.3
Duke 4.4
Chicago 4.6
Northwestern 4.4
Johns Hopkins 4.5
Wash U 4.1
Brown 4.4
Cornell 4.5
Rice 4.1
Vanderbilt 4.1
Notre Dame 3.9
Emory 4.0
Georgetown 4.1
UC Berkeley 4.7
Carnegie Mellon 4.2
USC 4.0
UCLA 4.2
UVA 4.3
Wake Forest 3.5
Tufts 3.6
Michigan 4.4
UNC Chapel Hill 4.1</p>
<p>Good catch on the Stanford mistake, but it definitely did not affect Stanford’s overall score or its relationship with Columbia et al. The subscale rankings like reputation, graduation rate, and faculty resources are provided for readers’ interest and do not factor into the overall score. They use the raw peer assessment and guidance counselor rankings separately, after scaling them against the individual averages.</p>
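<p>For what it’s worth, here is roughly what I picture “scaling them against the individual averages” to mean: standardizing each component across all schools before applying its weight. Pure guesswork on the exact formula; the sample scores are just a handful of the published PA numbers from the table upthread.</p>
<pre>
import statistics

# Hypothetical standardization of one component across a few schools
# (my guess at what "scaling against the individual averages" could mean).
pa_scores = {"Harvard": 4.9, "Stanford": 4.9, "Columbia": 4.6, "Penn": 4.5, "Chicago": 4.6}

mean = statistics.mean(pa_scores.values())
stdev = statistics.stdev(pa_scores.values())
standardized = {school: (score - mean) / stdev for school, score in pa_scores.items()}

# Each standardized component would presumably then be multiplied by its weight
# (15 points for PA, 7.5 for counselors) and summed into the overall index --
# but that's my speculation, not anything stated in the article.
print(standardized)
</pre>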
<p>But I do wonder if Forbes getting the two Wheatons mixed up affected those schools’ scores.</p>
<p>Is that so?</p>
<p>I am less upset by Columbia’s placement than I am with seeing Penn tied with Stanford and ranked above what most people in my area consider to be better schools. Penn consistently rejects our high school’s best applicants in favor of lesser students, whom it sometimes then has to boost up via its pre-frosh program because they’re considered at academic risk. Yield must be important in the methodology, lol. Is it? My computer is giving me trouble accessing the info.</p>
<p>US News top 30 PA scores/ HS counselor scores</p>
<p>Harvard 4.9 / 4.9
Princeton 4.9 / 4.9
Yale 4.8 / 4.9
Columbia 4.6 / 4.8
Stanford 4.9 / 4.9
Penn 4.5 / 4.6
Caltech 4.6 / 4.6
MIT 4.9 / 4.9
Dartmouth 4.3 / 4.7
Duke 4.4 / 4.7
Chicago 4.6 / 4.5
Northwestern 4.4 / 4.6
Johns Hopkins 4.5 / 4.8
Wash U 4.1 / 4.4
Brown 4.4 / 4.8
Cornell 4.5 / 4.8
Rice 4.1 / 4.4
Vanderbilt 4.1 / 4.5
Notre Dame 3.9 / 4.6
Emory 4.0 / 4.4
Georgetown 4.1 / 4.8
UC Berkeley 4.7 / 4.6
Carnegie Mellon 4.2 / 4.6
USC 4.0 / 4.4
UCLA 4.2 / 4.3
UVA 4.3 / 4.3
Wake Forest 3.5 / 4.3
Tufts 3.6 / 4.5
Michigan 4.4 / 4.4
UNC Chapel Hill 4.1 / 4.4</p>
<p>Schools getting the biggest boost from the new “reputational” methodology (HS counselor score > PA score): Tufts +0.9; Wake Forest +0.8; Georgetown +0.7; Notre Dame +0.7; Brown +0.4; Carnegie Mellon +0.4; Dartmouth +0.4; Emory +0.4; USC +0.4; Vanderbilt +0.4; Cornell +0.3; Duke +0.3; Johns Hopkins +0.3; Rice +0.3; UNC Chapel Hill +0.3; Wash U +0.3; Columbia +0.2; Northwestern +0.2</p>
<p>Taking the biggest hit from the new methodology: Chicago -0.1; UC Berkeley -0.1; and all schools whose HS Counselor score equaled their PA score, because so many schools got much higher scores from the HS Counselors that merely matching last year’s reputational score meant falling behind in a relative sense.</p>
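<p>For anyone who wants to check the arithmetic, a quick sketch over the (PA, counselor) pairs in the table above reproduces those differences:</p>
<pre>
# (PA, HS counselor) scores copied from the table above; a few lines of Python
# sort out who gained and who lost under the new reputational split.
scores = {
    "Harvard": (4.9, 4.9), "Princeton": (4.9, 4.9), "Yale": (4.8, 4.9),
    "Columbia": (4.6, 4.8), "Stanford": (4.9, 4.9), "Penn": (4.5, 4.6),
    "Caltech": (4.6, 4.6), "MIT": (4.9, 4.9), "Dartmouth": (4.3, 4.7),
    "Duke": (4.4, 4.7), "Chicago": (4.6, 4.5), "Northwestern": (4.4, 4.6),
    "Johns Hopkins": (4.5, 4.8), "Wash U": (4.1, 4.4), "Brown": (4.4, 4.8),
    "Cornell": (4.5, 4.8), "Rice": (4.1, 4.4), "Vanderbilt": (4.1, 4.5),
    "Notre Dame": (3.9, 4.6), "Emory": (4.0, 4.4), "Georgetown": (4.1, 4.8),
    "UC Berkeley": (4.7, 4.6), "Carnegie Mellon": (4.2, 4.6), "USC": (4.0, 4.4),
    "UCLA": (4.2, 4.3), "UVA": (4.3, 4.3), "Wake Forest": (3.5, 4.3),
    "Tufts": (3.6, 4.5), "Michigan": (4.4, 4.4), "UNC Chapel Hill": (4.1, 4.4),
}
diffs = sorted(((gc - pa, name) for name, (pa, gc) in scores.items()), reverse=True)
for delta, name in diffs:
    print(f"{name}: {delta:+.1f}")
</pre>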
<p>I’ll leave it to the mathematical wizards to tell us how much effect the substitution of HS counselor rankings for a portion of the PA score had on actual rankings, but for schools like Tufts and Wake Forest it’s got to be considerable. And as much as some people on CC love to rail against PA, these HS Counselor rankings have got to be far more absurd. I mean, Georgetown’s a nice school, but the HS Counselors gave it a 4.8—better than Duke (4.7), Caltech (4.6), Penn (4.6) and Chicago (4.5). Carnegie Mellon is Caltech’s equal at 4.6? Tufts is Chicago’s equal (4.5) and better than Emory, Rice, USC, and Wash U (4.4)? Indiana University-Purdue University-Indianapolis (IUPUI) is on the same plane with Tulane and Wisconsin (4.0) and better than Case Western, Lehigh, and Illinois (3.9)? Ridiculous. Whatever you think of PA, this is no improvement. It’s something considerably worse.</p>
<p>Clinton, are the GC scores really MORE ridiculous than the PA? There will always be scores that surprise, but can we not say the same thing about the PA? One way to look at it is that the GCs like the order of the OLD ranking but do not agree with the way the “experts” are doling out the PA. Perhaps the GCs did not think that the schools ranked above Cal or Michigan deserved a substantially lower PA. </p>
<p>Fwiw, what do you think of the PA and GC scores of … Haverford? How do they compare to, say, Smith’s?</p>
<p>^ Only a few schools can have “distinguished” academic programs (i.e., a PA score of 5). Others should be rated around 4, which means “excellent”.</p>
<p>can you guys expand that to top 31 PA scores? thanks</p>