<p>Was that really the case, UCB?</p>
<p>^ Er, right…sorry…you thought the Berkeley Economics website listed all courses. Which was not the case…only courses that had dedicated homepages were listed. But, if you attended a large, bureaucratic university you would have realized that… ;)</p>
<p>“If there was a way to analyze the correlation of the PA to the five prior years and do this on a rolling basis since it became a feature of USNews”</p>
<p>Of course there’s a way: formulate various combinations of the prior PA scores that seem plausible (such as the 5 prior years you suggest, though there are many possible constructs), try each of them in the regression analysis sequentially, and see whether the overall predictive power of the resulting regression model is better with one of these lagged variables added than without them.</p>
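<p>A minimal sketch of that procedure, on synthetic stand-in data (real use would load the actual predictor columns and a prior-years PA construct, none of which are shown here):</p>

```python
# Compare a regression's R^2 with and without a lagged-PA construct added.
# All data below is synthetic: "base" stands in for the model's predictors
# (SAT, expenditures, etc.) and "lagged" for a construct of prior PA scores.
import numpy as np

def r_squared(X, y):
    """OLS fit with intercept; returns the coefficient of determination."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 120
base = rng.normal(size=(n, 3))                            # stand-in predictors
lagged = base @ [0.5, 0.3, 0.1] + rng.normal(0, 0.2, n)   # prior-PA construct
pa = base @ [0.6, 0.3, 0.2] + 0.4 * lagged + rng.normal(0, 0.3, n)

r2_without = r_squared(base, pa)
r2_with = r_squared(np.column_stack([base, lagged]), pa)
print(f"R^2 without lag: {r2_without:.3f}, with lag: {r2_with:.3f}")
```

<p>If the gain in R^2 from the lagged variable is material (and survives an adjusted-R^2 or F-test check, since adding any regressor can only raise plain R^2), the lag belongs in the model.</p>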
<p>Xiggi, I don’t understand your post #159.</p>
<p>Regarding the LAC analysis, when you add the factors sequentially (stepwise), here is how much each factor adds to the explanatory model of PA. The first factor entered usually explains the lion’s share of the dependent variable, and that is so in this analysis. The whole model explains 91% of PA.</p>
<p>74.2% SAT math 25th natural log
9.1% instructional expenditures percent
2.3% endowment per FTE natural log
2.6% number of bachelors degrees per year
1.2% SAT CR 75th squared
0.6% admit percent natural log
0.3% women or coed
0.3% public or private
0.2% number of freshmen natural log
0.1% admit percent squared
0.2% number of freshmen
0.1% number of freshmen squared</p>
<p>91% total (hopefully)</p>
<p>The first 5 factors account for 89% of the explanation by themselves and constitute the most parsimonious, simplest, and most elegant model.</p>
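<p>The sequential entry collegehelp describes can be sketched as a forward-stepwise loop: at each step, add whichever remaining predictor raises R^2 the most and record the increment it contributes. The predictor names below are placeholders for the factors listed above, and the data is synthetic.</p>

```python
# Forward stepwise selection, reporting each factor's incremental R^2.
import numpy as np

def r2(X, y):
    """R^2 of an OLS fit (with intercept) on the list of columns X."""
    X1 = np.column_stack([np.ones(len(y))] + list(X))
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1 - ((y - X1 @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()

def forward_stepwise(predictors, y):
    """predictors: dict name -> 1-D array. Returns [(name, delta_R2), ...]."""
    chosen, steps, current = [], [], 0.0
    remaining = dict(predictors)
    while remaining:
        name, best = max(remaining.items(),
                         key=lambda kv: r2(chosen + [kv[1]], y))
        gain = r2(chosen + [best], y) - current
        chosen.append(remaining.pop(name))
        steps.append((name, gain))
        current += gain
    return steps

rng = np.random.default_rng(1)
n = 200
x1, x2, x3 = rng.normal(size=(3, n))
y = 3.0 * x1 + 0.5 * x2 + rng.normal(0, 0.5, n)  # x1 dominates, like SAT math
steps = forward_stepwise({"sat_math": x1, "instr_exp": x2, "endowment": x3}, y)
for name, gain in steps:
    print(f"{gain:6.1%}  {name}")
```

<p>As in the table above, the first factor entered grabs the lion’s share, and the increments sum to the full model’s R^2.</p>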
<p>CH,
In thinking more about your analysis, I went back and checked a copy of the 1999 USNWR and compared it with the 2009 results. I was curious to see how PA scores have changed, to measure the changes in a couple of major contributing factors in your model (grad rates and SAT scores), and to see how those changes affected each school’s PA score. </p>
<p>All of the data is posted below, but I see only one school that enjoyed a big boost in its PA score while delivering a big boost in both its grad rates and SAT scores. That school is USC, which saw its PA rise from 3.6 to 3.9. </p>
<p>The school that probably gets treated the worst by current PA is Vanderbilt. That school has achieved enormous gains in both 6-Year Graduation Rates (+10%, 5th best improvement of the 40 schools that I looked at) and in SAT scores (+90 points for mid-point of 25/75 and +110 points for 75th percentile level—both of these are again in the top 5 of those measured). Somehow, despite this performance, Vanderbilt’s PA score went DOWN from 4.1 in 1999 to 4.0 in 2009. Maybe you or someone else can help explain that one.</p>
<p>Here are the results:</p>
<p>Change in PA from 1999 to 2009 (USNWR Top 40 national universities)</p>
<p>Change in PA , 2009 PA , 1999 PA , College</p>
<p>0.3 , 3.9 , 3.6 , USC
0.2 , 3.5 , 3.3 , Wake Forest
0.1 , 4 , 3.9 , Georgetown
0.1 , 4.5 , 4.4 , U Penn
0.1 , 3.8 , 3.7 , NYU
0.0 , 4.8 , 4.8 , Yale
0.0 , 4.1 , 4.1 , Wash U
0.0 , 3.8 , 3.8 , UCSD
0.0 , 4.7 , 4.7 , UC Berkeley
0.0 , 4.3 , 4.3 , U Virginia
0.0 , 3.6 , 3.6 , Tufts
0.0 , 4.9 , 4.9 , Stanford
0.0 , 4.8 , 4.8 , Princeton
0.0 , 3.9 , 3.9 , Notre Dame
0.0 , 4.9 , 4.9 , MIT
0.0 , 4.9 , 4.9 , Harvard
0.0 , 4 , 4 , Georgia Tech
0.0 , 3.5 , 3.5 , Boston Coll
-0.1 , 3.7 , 3.8 , W&M
-0.1 , 4 , 4.1 , Vanderbilt
-0.1 , 4.2 , 4.3 , UCLA
-0.1 , 4.4 , 4.5 , U Michigan
-0.1 , 4 , 4.1 , U Illinois
-0.1 , 3.2 , 3.3 , Lehigh
-0.1 , 4.5 , 4.6 , Columbia
-0.1 , 3.4 , 3.5 , U Rochester
-0.1 , 3.9 , 4 , Emory
-0.1 , 4.1 , 4.2 , U North Carolina
-0.1 , 4.6 , 4.7 , U Chicago
-0.1 , 4.3 , 4.4 , Dartmouth
-0.1 , 4.1 , 4.2 , Carnegie Mellon
-0.1 , 4.6 , 4.7 , Caltech
-0.2 , 4.4 , 4.6 , Duke
-0.2 , 4.1 , 4.3 , U Wisconsin
-0.2 , 4 , 4.2 , Rice
-0.2 , 4.3 , 4.5 , Northwestern
-0.2 , 4.5 , 4.7 , Johns Hopkins
-0.2 , 4.5 , 4.7 , Cornell
-0.2 , 4.3 , 4.5 , Brown
-0.2 , 3.5 , 3.7 , Brandeis</p>
<p>Change in 6-Yr Grad Rate , 2009 6-Yr Grad Rate , 1999 6-Yr Grad Rate , College</p>
<p>16% , 85% , 69% , USC
15% , 87% , 72% , Carnegie Mellon
13% , 84% , 71% , NYU
11% , 90% , 79% , UCLA
10% , 91% , 81% , Vanderbilt
10% , 78% , 68% , Georgia Tech
7% , 90% , 83% , U Chicago
7% , 80% , 73% , U Wisconsin
7% , 88% , 81% , UC Berkeley
6% , 89% , 83% , Caltech
6% , 92% , 86% , Wash U
6% , 88% , 82% , U Michigan
6% , 88% , 82% , Brandeis
6% , 91% , 85% , Boston Coll
5% , 88% , 83% , Emory
5% , 95% , 90% , U Penn
5% , 84% , 79% , UCSD
4% , 93% , 89% , MIT
4% , 93% , 89% , Georgetown
4% , 89% , 85% , Wake Forest
4% , 81% , 77% , U Rochester
4% , 94% , 90% , Columbia
3% , 93% , 90% , Northwestern
3% , 91% , 88% , Rice
3% , 89% , 86% , Tufts
3% , 92% , 89% , W&M
3% , 95% , 92% , Stanford
3% , 95% , 92% , Brown
3% , 82% , 79% , U Illinois
2% , 92% , 90% , Cornell
2% , 94% , 92% , Duke
2% , 95% , 93% , Notre Dame
2% , 83% , 81% , Lehigh
1% , 91% , 90% , Johns Hopkins
1% , 93% , 92% , U Virginia
0% , 97% , 97% , Harvard
0% , 96% , 96% , Yale
-1% , 93% , 94% , Dartmouth
-1% , 95% , 96% , Princeton
-1% , 83% , 84% , U North Carolina</p>
<p>Change in 25/75 Mid-point , 2009 SAT 25th - 75th , 2009 Mid-pt vs 1999 SAT 25th - 75th , 1999 Mid-pt , College</p>
<p>150 , 1370 - 1530 , 1450 vs 1210 - 1390 , 1300 , Wash U
150 , 1270 - 1460 , 1370 vs 1110 - 1330 , 1220 , USC
90 , 1300 - 1480 , 1375 vs 1200 - 1370 , 1285 , Vanderbilt
85 , 1340 - 1490 , 1410 vs 1240 - 1410 , 1325 , Tufts
80 , 1240 - 1390 , 1310 vs 1137 - 1323 , 1230 , Lehigh
75 , 1300 - 1510 , 1395 vs 1230 - 1410 , 1320 , Notre Dame
70 , 1330 - 1530 , 1425 vs 1250 - 1460 , 1355 , U Chicago
70 , 1210 - 1400 , 1295 vs 1120 - 1330 , 1225 , U North Carolina
65 , 1330 - 1530 , 1430 vs 1270 - 1460 , 1365 , U Penn
65 , 1360 - 1540 , 1435 vs 1270 - 1470 , 1370 , Columbia
55 , 1340 - 1540 , 1445 vs 1300 - 1480 , 1390 , Duke
55 , 1330 - 1530 , 1440 vs 1290 - 1480 , 1385 , Brown
55 , 1300 - 1490 , 1390 vs 1230 - 1440 , 1335 , Georgetown
55 , 1240 - 1430 , 1335 vs 1190 - 1370 , 1280 , Boston Coll
50 , 1400 - 1590 , 1485 vs 1340 - 1530 , 1435 , Yale
50 , 1350 - 1520 , 1410 vs 1270 - 1450 , 1360 , Northwestern
50 , 1220 - 1420 , 1320 vs 1160 - 1380 , 1270 , U Michigan
50 , 1250 - 1450 , 1340 vs 1190 - 1390 , 1290 , W&M
45 , 1180 - 1430 , 1295 vs 1140 - 1360 , 1250 , UCLA
45 , 1280 - 1460 , 1360 vs 1230 - 1400 , 1315 , Brandeis
40 , 1390 - 1580 , 1480 vs 1350 - 1530 , 1440 , Princeton
40 , 1290 - 1500 , 1385 vs 1250 - 1440 , 1345 , Cornell
40 , 1290 - 1490 , 1395 vs 1250 - 1460 , 1355 , Carnegie Mellon
40 , 1170 - 1410 , 1290 vs 1150 - 1350 , 1250 , U Illinois
30 , 1300 - 1470 , 1385 vs 1280 - 1430 , 1355 , Emory
30 , 1230 - 1420 , 1325 vs 1210 - 1380 , 1295 , U Rochester
30 , 1170 - 1380 , 1275 vs 1140 - 1350 , 1245 , U Wisconsin
25 , 1470 - 1580 , 1520 vs 1420 - 1570 , 1495 , Caltech
25 , 1330 - 1550 , 1450 vs 1330 - 1520 , 1425 , Dartmouth
20 , 1200 - 1420 , 1325 vs 1200 - 1410 , 1305 , U Virginia
20 , 1240 - 1410 , 1320 vs 1210 - 1390 , 1300 , Wake Forest
20 , 1240 - 1430 , 1310 vs 1190 - 1390 , 1290 , NYU
15 , 1290 - 1500 , 1390 vs 1290 - 1460 , 1375 , Johns Hopkins
10 , 1130 - 1360 , 1250 vs 1140 - 1340 , 1240 , UCSD
5 , 1400 - 1590 , 1490 vs 1390 - 1580 , 1485 , Harvard
5 , 1310 - 1530 , 1435 vs 1330 - 1530 , 1430 , Rice
5 , 1220 - 1470 , 1325 vs 1200 - 1440 , 1320 , UC Berkeley
0 , 1340 - 1550 , 1440 vs 1340 - 1540 , 1440 , Stanford
-5 , 1380 - 1560 , 1470 vs 1390 - 1560 , 1475 , MIT
-5 , 1240 - 1420 , 1315 vs 1230 - 1410 , 1320 , Georgia Tech</p>
<p>Change , 2009 SAT 75th , 1999 SAT 75th , College</p>
<p>140 , 1530 , 1390 , Wash U
130 , 1460 , 1330 , USC
110 , 1480 , 1370 , Vanderbilt
100 , 1510 , 1410 , Notre Dame
80 , 1490 , 1410 , Tufts
70 , 1530 , 1460 , U Penn
70 , 1540 , 1470 , Columbia
70 , 1530 , 1460 , U Chicago
70 , 1520 , 1450 , Northwestern
70 , 1430 , 1360 , UCLA
70 , 1400 , 1330 , U North Carolina
67 , 1390 , 1323 , Lehigh
60 , 1590 , 1530 , Yale
60 , 1540 , 1480 , Duke
60 , 1500 , 1440 , Cornell
60 , 1460 , 1400 , Brandeis
60 , 1450 , 1390 , W&M
60 , 1430 , 1370 , Boston Coll
60 , 1410 , 1350 , U Illinois
50 , 1580 , 1530 , Princeton
50 , 1530 , 1480 , Brown
50 , 1490 , 1440 , Georgetown
40 , 1500 , 1460 , Johns Hopkins
40 , 1470 , 1430 , Emory
40 , 1420 , 1380 , U Michigan
40 , 1430 , 1390 , NYU
40 , 1420 , 1380 , U Rochester
30 , 1550 , 1520 , Dartmouth
30 , 1470 , 1440 , UC Berkeley
30 , 1490 , 1460 , Carnegie Mellon
30 , 1380 , 1350 , U Wisconsin
20 , 1410 , 1390 , Wake Forest
20 , 1360 , 1340 , UCSD
10 , 1590 , 1580 , Harvard
10 , 1550 , 1540 , Stanford
10 , 1580 , 1570 , Caltech
10 , 1420 , 1410 , U Virginia
10 , 1420 , 1410 , Georgia Tech
0 , 1560 , 1560 , MIT
0 , 1530 , 1530 , Rice</p>
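<p>One way to put a number on the comparison above is to correlate the PA change with the grad-rate change. A quick sketch using a handful of rows transcribed from the tables above (a full check would use all 40 schools):</p>

```python
# Correlation of PA change with 6-yr grad-rate change, 1999 -> 2009,
# for a subset of schools taken from the tables posted above.
import numpy as np

data = {  # school: (PA change, grad-rate change in points)
    "USC": (0.3, 16), "Carnegie Mellon": (-0.1, 15), "NYU": (0.1, 13),
    "UCLA": (-0.1, 11), "Vanderbilt": (-0.1, 10), "Georgia Tech": (0.0, 10),
    "Harvard": (0.0, 0), "Yale": (0.0, 0), "Princeton": (0.0, -1),
}
pa = np.array([v[0] for v in data.values()])
grad = np.array([v[1] for v in data.values()])
r = np.corrcoef(pa, grad)[0, 1]
print(f"correlation of PA change with grad-rate change: r = {r:.2f}")
```

<p>With USC pulling one way and Vanderbilt the other, the correlation on this subset is weakly positive at best, which is hawkette’s point: big moves in the inputs barely move PA.</p>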
<p>Collegehelp, I don’t have this list… but if you can get a list of universities with the most academic programs ranked in the top 10, and maybe also of universities with the most programs ranked in the top 3…you might find a relationship between those schools and PA.</p>
<p>Schools like Caltech and the larger universities… make it tough because they are so different. </p>
<p>If you can get that info, play around with it.</p>
<p>It looks obvious to me that schools like Stanford, Berkeley and Michigan are being rewarded for the depth and breadth of their programs.</p>
<p>Obviously, schools like Caltech are being rewarded for something else…excellence in a small number of programs.</p>
<p>I don’t know. Maybe you can award 10 points for the highest ranked program, down to 1 point for the tenth highest and see where you end up.</p>
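<p>One reading of that scheme, sketched as code (the ranks in the example are invented, not real program rankings):</p>

```python
# dstark's suggested scoring: a program ranked #1 nationally earns its school
# 10 points, #2 earns 9, ... #10 earns 1; programs outside the top 10 earn 0.
def program_score(program_ranks):
    """program_ranks: national ranks (1 = best) of one school's programs."""
    return sum(11 - r for r in program_ranks if 1 <= r <= 10)

# hypothetical school with programs ranked 1, 3, 7, and 25:
print(program_score([1, 3, 7, 25]))  # 10 + 8 + 4 = 22
```

<p>Summing that score per school and correlating it against PA would test whether breadth of top-ranked programs is what drives the rating.</p>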
<p>Easy. In the judgment of its peers, Vanderbilt has not kept pace in strengthening its faculty. I find collegehelp’s correlations interesting insofar as they generally confirm the wisdom of the dominant strategy of college and university administrators from time immemorial: strengthen the faculty, and money and top students (and selectivity) will follow. But not all schools follow that strategy. Some try to take shortcuts by luring top students and money without materially improving the faculty. This alternative strategy can help them in the US News rankings in the short term; indeed, it’s often a cheaper and easier way of gaining ground in the US News rankings. But it doesn’t improve their standing in the eyes of their peers, and I can’t help but believe that in the long term, the latter will matter more, as it will determine where their graduates go to grad school and get hired, and whether top faculty candidates from elsewhere are drawn to the school. And sooner or later, that’s got to filter down to students and applicants. </p>
<p>I don’t say the US News PA ratings are perfect, but I do think they’re a pretty good rough approximation of the views of academics generally. Consequently, schools that make big jumps in “objective” US News factors but not in PA are probably the schools that are most heavily invested in manipulating their statistics so as to increase their US News rankings, but are not, in the estimation of their peers, making the real investments in human capital that would be necessary to improve their standing in the academic community. That Vanderbilt should be in this group doesn’t surprise me. USC, Wake Forest, Tufts, Georgetown, and Notre Dame are also heavily invested in this game. The strategy produces gaudy US News stats that in turn produce even more applicants and even greater selectivity. But it’s not at all clear it improves academic quality.</p>
<p>This was the strategy of WUSTL, which was considered a good university but not an elite one. When it appeared in the top ranks of USNWR, suddenly students who would never have considered it (for various reasons) started making it their dream school. I don’t know WUSTL’s PA at this point, but I do know that it has lagged behind the other factors such as SATs and selectivity. In my mind, WUSTL has played the game of USNWR rankings, and won.</p>
<p>bc,
My post above was in response to collegehelp’s theory that the PA can be predicted based upon certain criteria. Two of those that factored prominently in his analysis were graduation rates and SAT scores. He never mentioned anything about faculty and whether that has improved or not. I don’t think that was his point or his claim.</p>
<p>As far as the changes in the last ten years, you see from the data that there is very, very little movement in the PA scores. Almost no one is surprised by this, and lots of folks on CC have posted extensively about the forces that perpetuate the academic industry status quo. Many of us feel that the PA is a fraud and is used only as a “leveling” tool by USNWR to placate defenders of some historically prominent academic industry powers. CH was trying to legitimize its subjective conclusions through his regression analysis of objective data. It’s an interesting attempt, but I think it fails when you actually backtest and discover that, even if institutions change markedly (up or down) in the data, changing their place in the minds of those in entrenched academia is next to impossible. </p>
<p>As for your statement about schools “manipulating their statistics so as to increase their US News rankings,” I trust that you will present some evidence to support your claim. Without substantiation, your statement is speculation at best and slander at worst. </p>
<p>The rankings are rife with schools (including some close to your heart) sharply improving certain important ranking elements over the last 10 years. Should they all be accused of manipulation or is it possible that they actually improved in those areas?</p>
<p>Mom- you are exactly right. WashU has won the game. Sadly, this shouldn’t be made into a game. It has made kids and families dismiss “fit” and now look to “prestige” and “rank” when making a very important decision. CC is filled with these questions, which a generation ago would be laughable. USNWR has exploited parents’ fears and pride and made a ton of money along the way. Colleges aren’t commodities which can be marketed and quantified. Yet sadly here we are.</p>
<p>This is a very interesting discussion.
hawkette -
That’s really interesting data…a lot of work. I am not sure why PA sometimes does not keep up with changes in statistics, but bclintonk may have a point. PA is complex and is related to a lot of interconnected factors. Maybe judgments shouldn’t change reflexively, because overall quality doesn’t change reflexively. Should PA change in the other direction if a college has one bad recruiting year? I repeat…the current PA is predictable based on current data.</p>
<p>dstark-
I’ll try adding the number of different majors when I have a chance. I am not sure where to get the number of “distinguished” majors, however. The idea that comes to mind is using the Gourman Report. Everybody would love that discussion. Maybe there is something in the NRC data?</p>
<p>I have heard administrators and faculty at more than one university question new initiatives based on how it will impact USNWR rankings. At one such school, a faculty member actually questioned the wisdom of trying to get a more diverse campus because then SAT scores would drop some (because of historically lower SAT scores among the economically disadvantaged and URMs) and that would drop the school in the rankings. When administrative decisions must take into account a magazine’s view of them, things have gotten out of hand.</p>
<p>I know that Princeton’s decision to do away with ED had a USNWR element to it because it was understood that “selectivity” would decrease and thus be reflected in USNWR. Princeton is high enough that it doesn’t have to worry whether it is number one or number five, but other universities do have to worry. To go from number 20 to 26, or from number 50 to 51, in this age of quantitative rankings can impact the quality of students who decide to attend. It’s ridiculous.</p>
<p>I like this one:
<a href=“http://www.grad.berkeley.edu/publications/pdf/usnews_rankings_2008.pdf”>http://www.grad.berkeley.edu/publications/pdf/usnews_rankings_2008.pdf</a></p>
<p>:D</p>
<p><em>I know I’ll be flamed for posting grad rankings</em></p>
<p>Hawkette,</p>
<p>What do you make of the fact that most of the PA changes have been in a negative direction?</p>
<p>It lends credence to the idea that administrators are voting down competing institutions. (not Hawkette)</p>
<p>No, it doesn’t. Just because individual schools care about their ranking doesn’t mean that they cheat to get it.</p>
<p>Thanks, lockn.</p>
<p>I don’t get the allegation that there has been “very, very little change” in PA scores. hawkette’s own analysis shows that less than 1/3 of the schools in her list have maintained their rating over the ten-year period shown.</p>
<p>What is curious to me is why nearly all changes have been in a negative direction. I guess you could argue a ceiling effect, but I am not sure how far that would get you because it’s not the highest-rated colleges which have declined. </p>
<p>Whatever the explanation (and I hope it will be discussed more), it pokes a hole in the theory that there is this VAST CABAL of conspiratorial academics worshipping the status quo.</p>
<p>Perhaps I just don’t understand how the cult works.</p>
<p>hoedown,
I noticed that downward trend as well. I don’t know why. A new re-centering?? I also noticed that, even though the PA scores are slightly lower than 10 years ago, the relative position of the vast majority of schools is mostly unchanged. </p>
<p>Re the lower PA scores, I sure as heck hope that it’s not due to folks voting in a fashion to harm others. Stupid. Unethical. Probably happens more than any of us would like. And it’s probably on the increase as more and more educators around the USA complain about this particular aspect of the USNWR rankings methodology.</p>
<p>And possibly dangerous, if the cabal gets wind of it! LOL</p>