US News 2008 Rankings - Predictions

<p>One could make a good case for BU to replace Tulane, Lehigh and Yeshiva in the top 50 on PA alone. BU's 3.4 for PA matches Tulane and beats Lehigh (3.2) and Yeshiva (3.0). But make the thumb-on-the-scale peer (and recruiter) opinion global, and BU definitely wins. </p>

<p>The 2006 THES (Times of London Higher Education Survey) ranks BU at #66 among the world's top 200 universities. Look only at American universities and it ranks #28. Not too shabby for an international survey. Case Western ranks #26 among American universities and Yeshiva is #172. Lehigh, Tulane and RPI don't make the top 200. About 40 percent of the total ranking score comes from "peer review" and another 10 percent from "recruiter review." If USNWR added a global survey such as this, BU would jump.</p>

<p>the THES ranking could hardly have less to do with the academic quality of a school</p>

<p>And why is that?</p>

<p>because its methodology is based on the researchers a school employs, its faculty:student ratio, and how "international" the school is.</p>

<p>Care to argue any of the following?</p>

<p>UT - Austin > Brown
UT - Austin > Dartmouth
UT - Austin > Northwestern</p>

<p>URochester > Brown
URochester > Dartmouth</p>

<p>Penn State > Georgetown
UPitt > Georgetown</p>

<p>NYU > Brown
NYU > Dartmouth</p>

<p>UPitt > Rice
UWashington > Rice</p>

<p>Upitt > UVA
Penn State > UVA</p>

<p>Purdue > Tufts</p>

<p>University of Alabama > Notre Dame
Texas A&M > Notre Dame</p>

<p>and the list just goes on. Any ranking that comes to even one of those conclusions is obviously not a reliable system of ranking academic quality.</p>

<p>^ I wholeheartedly agree.</p>

<p>esl, THES is much more of a grad ranking than an undergrad one</p>

<p>^^ yes, i should have stipulated. i meant for undergraduate academic quality.</p>

<p>I had a feeling that stipulation was coming. </p>

<p>I agree that the THES methodology favors large general universities, but it doesn't focus exclusively on graduate programs. They are assessing the quality of the university, period. It's still a measure of quality, it's just not the one you're used to. (And keep in mind that I wasn't making any other claims for the use of THES other than that it supported an opinion that BU should move up in the better known USNWR ranking of "best colleges.")</p>

<p>From the website: "Forty percent of the score allotted to each university is derived from "peer review" carried out among academics...involving gathering data from 3,703 academics around the world. Each was asked which area of academic life --- science, medicine, technology, the social sciences, or the arts and humanities --- they are expert in, and then asked to name up to 30 universities they regard as the top institutions in their area..." </p>

<p>Now it doesn't say that they're asking for opinions about graduate programs, just as USNWR asks for peer assessment in general and doesn't try to divine whether opinions come from respect for graduate programs or for undergraduate teaching. (I believe there is debate regarding PA and that it too is heavily driven by graduate school reputations.) As stated, the peer review survey is 40 percent of the total score in THES. The only area where "graduate" is mentioned is in regard to the "recruiter review" portion of the survey; that is only 10 percent of the score. </p>

<p>The other HALF of the ranking methodology described on their website indicates that it is ranking the entire university, not graduate programs alone. </p>

<p>From the website: "The other half of the rankings scores are made up of quantitative measures... Teaching and research are the main activities that occur in universities. Measures designed to capture the quality of these activities account for 40 percent of the total score in our rankings. </p>

<p>"We measure teaching by the classic criterion of staff-to-student ratio....We ask universities to count people studying towards degrees or other substantial qualifications, not those taking short courses. We ask universities to submit a figure based on staff with some regular contractual relationship with the institution. A guest lecturer, however distinguished, should not count... The measure of staff-to-student ratio is intended to determine how much attention a student can hope to get at a specific institution, by seeing how well stocked it is with academic brainpower relative to the size of its student body. It accounts for 20 percent of the score. </p>

<p>"Our next measure, relating to research, is intended to examine how much intellectual power a university has relative to its size. It is based on citations of academic papers, since these are regarded as the most reliable measure of a paper's impact...Our analysis uses data covering 2001-2006. To compile our analysis, we divide the number of citations by staff numbers to correct for institution size and to give a measure of how densely packed each university is with the most highly cited and impactful researchers. This accounts for 20 percent of the total score.</p>

<p>"The increasing international nature of higher education is a key reason for the existence of the World University Rankings. The final 10 percent of our score is intended to determine how global universities are...5% of the score is for percentage of international faculty and 5% for percentage of international students. But because the measure counts for only 10 percent of the total score, it is not possible for an institution to do well in the overall table without being excellent in other categories. </p>

<p>"There are many measures we do not attempt to capture in these pages. We gather data on universities that teach undergraduates only...
We have considered a wide range of other criteria, such as graduate employment and entry standards, as possible quality measures, but these have all failed the test of being applicable evenly around the world."</p>
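<p>Just to make the arithmetic concrete, here is a minimal sketch (in Python) of how a composite built from the weights quoted above would add up. Only the weights come from the quoted methodology; the component names, the 0-100 scaling, and the example numbers are assumptions for illustration, not THES's actual computation.</p>

<pre><code># Rough sketch of the THES composite described above. The weights come from
# the quoted methodology; the 0-100 normalization of each component and the
# example numbers are assumptions -- the quoted text doesn't spell out scaling.

WEIGHTS = {
    "peer_review": 0.40,         # academic peer review
    "recruiter_review": 0.10,    # graduate recruiter review
    "staff_student": 0.20,       # staff-to-student ratio (teaching proxy)
    "citations_per_staff": 0.20, # citations divided by staff count (research)
    "intl_faculty": 0.05,        # share of international faculty
    "intl_students": 0.05,       # share of international students
}

def thes_composite(scores):
    """Weighted sum of component scores, each assumed pre-scaled to 0-100."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# hypothetical school, just to show how the 40% peer review share dominates
example = {
    "peer_review": 62, "recruiter_review": 55, "staff_student": 70,
    "citations_per_staff": 48, "intl_faculty": 40, "intl_students": 35,
}
print(round(thes_composite(example), 2))
</code></pre>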

<p>It is much more of a graduate program-influenced ranking, about 50 percent, but that doesn't mean its results can be totally dissed, particularly regarding the minor point I was making. So there.</p>

<p>As a graduate of Biola University (undergrad, philosophy and theology, 04) who is currently attending USC for grad school (MBA, 08), will be attending Columbia for grad school as well (MSRED, 09), and plans to attend Cornell for law (if they will have me), I'd like to say that my graduate school track record backs up what I am about to say: I could have gone to many top 20 universities straight out of high school. I chose to attend Biola because of its extremely well respected theology and philosophy programs. Biola also has an honors program, wherein the average SAT score is well above 1350 (old system), that offers a unique and arguably superior education to the large T.A.-taught classes at other universities. My full ride didn't hurt my decision to attend either. Other people choose "lesser" universities all the time: for scholarships, the opportunity to play sports, location, because their family went there, etc. Maybe a student wanted to attend a traditionally black university. Perhaps their major wasn't offered at the "prestigious" school close to home, but the state school offered it. Or, in my case, Biola offered a Christian perspective on philosophy and theology, and that's what I wanted. </p>

<p>Given my undergraduate fields of study, I wanted to learn from those who shared my a priori assumptions about the world rather than placing myself in a position of constant argument, so I chose to attend Biola because of its unique evangelical perspective. This perspective has narrowed the scope of potential students and made it less competitive in terms of total student draw. Most 19-year-olds don't want to go to a school that asks you not to drink, smoke, do drugs or have sex and requires chapel attendance. </p>

<p>QUITE OBVIOUSLY, a religious education isn't for everyone, and most people are content with one grounded in naturalism/postmodernism. I wouldn't have been, and so I chose Biola. I certainly could have attended NU - or better - so my point is simply this: Be careful about the judgments you make. People have a right to, and should have, pride in their alma mater, even if it is not among the mighty top 25. USC was barely in the top 100 fifteen years ago, and this year it is looking to break into the top 25. Apparently, those who watched USC ascend the rankings over the last decade or so have been foolish by your standards, but what would you say to them now, on the cusp of USC's upcoming achievement? Now you scorn me? Am I not to be happy that Biola is almost certain to break into the 3rd tier and begin to position itself as a top-120 school? Certainly I can be proud that it is the only truly religious Christian university to hold the distinction of being a national university, and now it may break its own ranking record. Certainly it is nothing to a great many people, but it means a great deal to the 60,000+ living alumni of the university. </p>

<p>So it's not a top 20 - so what? It's got a +12 overperform and it's making solid progress, with record-breaking numbers of applications 5 years in a row, rising SAT scores, and an improvement in rankings that suggests it is doing so at a faster pace than its peer institutions. It's certainly no Harvard, or even Stony Brook (a fine university, by the way), but it does offer a unique education for a certain type of person and it does a damn fine job of it. Biola's philosophy program is extremely well respected within its field, as a matter of fact, and it's a shame there isn't a ranking to quantify it, or else you'd probably really have a hard time of things (this is similar to local school Loyola Marymount, not even a national university :: GASP ::, which has a very well respected law school), and the list of exceptions to your silly ideologies goes on. </p>

<p>So balls on you and hooray for Biola!</p>

<p>p.s. Just a little note: In only the last year, Biola was a feature story on Nightline </p>

<p><a href="http://abcnews.go.com/Video/playerIndex?id=2010969%5B/url%5D"&gt;http://abcnews.go.com/Video/playerIndex?id=2010969&lt;/a&gt;&lt;/p>

<p>A feature story in the NY Times:</p>

<p><a href="http://www.biola.edu/news/articles/040904_newyorktimes.cfm%5B/url%5D"&gt;http://www.biola.edu/news/articles/040904_newyorktimes.cfm&lt;/a&gt;&lt;/p>

<p>And recently a feature story in Los Angeles Magazine as well:</p>

<p><a href="http://goliath.ecnext.com/coms2/gi_0199-6305485/In-god-we-trust-by.html%5B/url%5D"&gt;http://goliath.ecnext.com/coms2/gi_0199-6305485/In-god-we-trust-by.html&lt;/a&gt;&lt;/p>

<p>[quote]It's got a +12 overperform and it's making solid progress[/quote]</p>

<p>MIT and Caltech have underperforms. Why USNews chooses to include this in its rankings methodology is beyond me; an overperform likely just means the school is very easy.</p>

<p>elsijfdl,
I don't know if USNWR discloses how they do the differential measurement, but my understanding is that their formula makes allowances for the quality of the student body a school enrolls and then predicts what graduation rate should follow. Clearly, this does not take into account the grading patterns and the institutional commitment of the school to get its students out of there (advantage to the Ivies, disadvantage to MIT, Caltech, etc.). But the bar is pretty high for these schools anyway, because the FT/FY enrolling student at the Ivies is expected to graduate. </p>

<p>As for how this plays out more broadly, some see this as a VERY valuable measure because it shows some relationship between what a school receives in the form of a new student and his/her ability to complete a degree in 6 years (albeit an approach that has no quality controls). Still, this can be a way for schools to show that they do a lot (or not) with what they get. </p>

<p>Here are the rankings for differential among the USNWR Top 20:</p>

<p>Rank, School, Differential between Predicted and Actual 6-Year Grad Rate</p>

<p>1 Notre Dame 5
2 Harvard 4
3 Princeton 3
4 Vanderbilt 3
5 Cornell 2
6 Northwestern 2
7 Brown 2
8 Yale 1
9 Stanford 1
10 Columbia 1
11 Wash U StL 1
12 J Hopkins 1
13 U Penn none
14 Duke none
15 U Chicago none
16 Dartmouth none
17 MIT -1
18 Rice -3
19 Emory -4
20 Cal Tech -6</p>

<p>Personally, I consider this a lot more value-added than the freshman retention rate and roughly equal to the absolute 6-year graduation rate.</p>

<p>^^ yes, hawkette, it looks at SAT/ACT scores and expenditure per student, derives an expected graduation rate from them, and then compares that to the actual graduation rate.</p>
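<p>For what it's worth, here is a minimal sketch of that predicted-vs-actual idea, assuming a plain least-squares fit on entering-class SAT and per-student spending. USNWR's real model, inputs, and coefficients aren't disclosed in this thread, so everything below other than the "actual minus predicted" framing is hypothetical.</p>

<pre><code># Toy illustration of the "graduation rate performance" idea described above:
# fit an expected grad rate from entering-class stats, then score each school
# by actual minus predicted. The regression form and all numbers are made up.
import numpy as np

# hypothetical schools: [median SAT, spending per student in $k], actual 6-year grad rate (%)
X = np.array([[1480, 95], [1450, 80], [1390, 60], [1300, 45], [1250, 38]], dtype=float)
actual = np.array([93, 96, 94, 88, 82], dtype=float)

# ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(A, actual, rcond=None)

predicted = A @ coefs
differential = np.rint(actual - predicted).astype(int)  # +N = overperform, -N = underperform
for row, d in zip(X, differential):
    print(f"SAT {int(row[0])}, spend ${int(row[1])}k: {d:+d}")
</code></pre>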

<p>if you read the Stanford president's letter to USNews (<a href="http://www.stanford.edu/dept/pres-provost/president/speeches/961206gcfallow.html">http://www.stanford.edu/dept/pres-provost/president/speeches/961206gcfallow.html</a>), he attacks this 'graduation performance' system on the basis that it does a terrible job of assessing "value added" by a school. his words are better than mine:</p>

<p>"The California Institute of Technology offers a rigorous and demanding curriculum that undeniably adds great value to its students. Yet, Caltech is crucified for having a "predicted" graduation rate of 99% and an actual graduation rate of 85%. Did it ever occur to the people who created this "measure" that many students do not graduate from Caltech precisely because they find Caltech too rigorous and demanding - that is, adding too much value - for them?"</p>

<p>he then goes on to say (and i agree with him):</p>

<p>"Caltech could easily meet the "predicted" graduation rate of 99% by offering a cream-puff curriculum and automatic A's. Would that be adding value? How can the people who came up with this formula defend graduation rate as a measure of value added"</p>

<p>perhaps i should be defending value added, since my school is very near the top five on that ranking you provided, but even though i do believe my school adds a great deal of value, i believe this particular way of measuring it is wholly ineffective.</p>

<p>In a perfect world, I think that this measure would be of high value. In a world of different grading practices and institutional approaches, the results are potentially very suspect. USNWR was clearly hoping for the former and, as you point out, it may have gotten the latter.</p>

<p>i really hope georgetown makes it into the top 20 this year!</p>

<p>[quote]i really hope georgetown makes it into the top 20 this year![/quote]</p>

<p>i hope so too, i feel it is definitely a contender for a spot with any school in the top 20.</p>

<p>and yes, hawkette, i agree. the best way i could see of measuring this (a plan which is highly improbable) would be for entering college students to take a standardized test focused on their declared major (assuming they don't switch) and on general studies, and then at graduation take that same test and measure improvement. The school whose students see the greatest improvement obviously added the most value. However, this would favor schools whose students knew very little coming in (diminishing marginal returns), so a factor accounting for strength of the final score could also be implemented. Complicated and impractical maybe, but i feel like that is the best possible approach.</p>
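<p>A minimal sketch of what that pre/post idea might look like, assuming scores on a common 0-100 scale. The blend between raw gain and final-score strength (and the 0.25 weight) is an arbitrary illustration of the adjustment mentioned above, not any real methodology.</p>

<pre><code># Sketch of the pre/post "value added" idea from the post above. Blending the
# raw gain (exit minus entry) with the absolute exit score is one way to handle
# the diminishing-returns problem; the 0.25 blend factor is an assumption.

def value_added(entry_score, exit_score, exit_weight=0.25, max_score=100.0):
    """Blend of raw improvement and final mastery, both on a 0-100 scale."""
    gain = exit_score - entry_score
    return (1 - exit_weight) * gain + exit_weight * (exit_score / max_score) * 100

# hypothetical cohorts: one starting low with a big gain, one starting high
print(value_added(entry_score=40, exit_score=75))  # big gain, modest finish -> 45.0
print(value_added(entry_score=80, exit_score=95))  # small gain, strong finish -> 35.0
</code></pre>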

<p>all right, I'll walk the plank. here's my list, largely unfounded, just guesswork :) (these posts are supposed to be usnews 2008 predictions after all)</p>

<p>1) Harvard
1) Princeton
3) Yale
4) Stanford
5) Cal Tech
6) U Penn
7) MIT
8) Columbia
9) Dartmouth
10) Duke
11) Cornell
12) U Chicago
13) Northwestern
14) Washington U in St. Louis
15) Johns Hopkins
16) Brown
17) Rice
18) Emory
19) UC Berkeley
20) Carnegie Mellon
20) Vanderbilt
22) Georgetown
22) Notre Dame</p>
24) UCLA
25) USC
26) Tufts
26) University of Virginia
28) University of Michigan, Ann Arbor
29) UNC
30) NYU</p>

<p>in that case, here's my prediction:</p>

<ol>
<li>Harvard</li>
<li>Princeton</li>
<li>Yale</li>
<li>Caltech</li>
<li>MIT</li>
<li>Stanford</li>
<li>Duke</li>
<li>UPenn</li>
<li>Dartmouth</li>
<li>Columbia</li>
<li>Northwestern</li>
<li>UChicago</li>
<li>Cornell</li>
<li>Brown</li>
<li>WashU</li>
<li>Rice</li>
<li>Johns Hopkins</li>
<li>Vanderbilt</li>
<li>Notre Dame</li>
<li>Georgetown</li>
<li>Carnegie Mellon</li>
<li>Emory</li>
<li>UVA</li>
<li>Berkeley</li>
<li>Michigan</li>
<li>Tufts</li>
</ol>

<p>confidentialcoll,</p>

<p>Any reason why Duke and Chicago dropped in your predictions?</p>

<p>And this is a bit off topic, but why is it that Brown continually ranks lowest among the Ivies?</p>

<p>Because they're a bunch of damn no-curriculum hippies, that's why!!</p>

<p>And we're all secretly jealous ;)</p>

<p>I really don't see why other schools don't steal Brown's thunder by dropping all curriculum requirements</p>

<p>lol, because its name is Brown...jk</p>