USNWR 2008 Undergraduate Rankings

<p>redcrimblue, only 58 percent of those surveyed respond, and many who actually fill it out admit they aren't familiar enough with the undergraduate programs at other colleges to rank them appropriately.</p>

<p>By the way, I just think that smaller schools without strong grad programs get screwed over. Anyway, the PA score has been discussed a lot on this board, and there are two very different opinions on it.</p>

<p>
[quote]
For the record, ex-PA, Duke would be ranked #3 (behind HP) and tied with Yale and U Penn.
[/quote]
</p>

<p>Is there a ranking (or would you compile one, hawkette) that ranks the top universities and top LACs without use of the PA score? That would be grand.</p>

<p>Rank Ex-PA, School, USNWR Rank with PA, Change</p>

<p>1 Princeton, 1, 0
1 Harvard, 2, 1
3 Yale, 3, 0
3 U Penn, 7, 4
3 Duke, 8, 5
6 Stanford, 4, -2
6 MIT, 4, -2
6 Wash U StL, 12, 6
6 Dartmouth, 9, 3
10 Cal Tech, 4, -6
10 Columbia, 9, -1
10 Northwestern, 14, 4
10 Brown, 15, 5
10 Notre Dame, 20, 10
15 Cornell, 12, -3
15 Rice, 17, 2
17 U Chicago, 9, -8
17 J Hopkins, 16, -1
17 Emory, 18, 1
17 Vanderbilt, 18, 1
21 Georgetown, 23, 2
21 Tufts, 27, 6
23 Carnegie Mellon, 21, -2
23 U Virginia, 24, 1
23 Wake Forest, 30, 7
23 Lehigh, 33, 10
27 USC, 27, 0
28 UC Berkeley, 21, -7
28 UCLA, 26, -2
28 U North Carolina, 27, -1
28 Brandeis, 31, 3
28 U Rochester, 34, 6
33 Case Western, 38, 5
34 U Michigan, 24, -10
34 W & M, 31, -3
34 Boston College, 34, 0
34 Yeshiva, 44, 10
38 NYU, 34, -4
38 UC SD, 38, 0
38 Tulane, 44, 6
41 Rensselaer, 42, 1
42 U Wisconsin, 34, -8
42 Georgia Tech, 38, -4
42 UC S Barbara, 47, 5
42 Syracuse, 52, 10
46 UC Irvine, 44, -2
47 U Illinois UC, 41, -6
48 U Florida, 47, -1
50 Penn State, 47, -3
51 UC Davis, 47, -4</p>
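<p>For anyone wanting to check the table, the Change column is just simple rank arithmetic. Below is a minimal sketch in Python using a handful of rows copied from the list above; the function name and the dictionary are mine, not part of the original post.</p>

```python
# Change = (USNWR rank with PA) - (rank ex-PA).
# A positive number means the school rises once PA is removed.
# A few rows sampled from the table above, as an illustration only.
schools = {
    # name: (rank ex-PA, USNWR rank with PA)
    "Princeton": (1, 1),
    "U Penn": (3, 7),
    "Duke": (3, 8),
    "Stanford": (6, 4),
    "U Michigan": (34, 24),
}

def rank_change(name: str) -> int:
    """Positions gained (+) or lost (-) when the PA score is dropped."""
    ex_pa, with_pa = schools[name]
    return with_pa - ex_pa

for name in schools:
    print(f"{name}: {rank_change(name):+d}")
```

<p>Running this reproduces the Change values in the table, e.g. U Penn gains 4 spots and U Michigan loses 10 when PA is removed.</p>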

<p>Oh my. That is very interesting. Do you have the same information for LACs? And thank you for that post. Sheds an interesting light on the rankings and the true effect of the PA score.</p>

<p>Obvious standout schools on that list:</p>

<p>Penn, Duke, WashU, Notre Dame, Brown, Tufts (finally where it should be)...</p>

<p>I think this is the list people should really be looking at. Based on tangible data only, correct?</p>

<p>guess what?</p>

<p>this is dumb!</p>

<p>everyone knows the whole system is flawed anyway... just wait till the rankings come out and then everyone will slobber over them</p>

<p>Think of peer assessment as a measure of faculty fame. Two schools could have similar class sizes. However, one could have a much higher proportion of professors who are internationally renowned. </p>

<p>Most students would choose the school with the more famous faculty. Their teaching skills may not be better, but you have the opportunity to interact with leaders in the field and engage in cutting edge research. </p>

<p>PA is the most general way to measure this. Otherwise, US News would have to count the number of National Academy members or Nobel Prizes or some other research measure, which would probably give a similar result.</p>

<p>I still don't really see how that's relevant, at least compared to the other factors in the formula. At the very least, being able to smell the fart of leaders in their fields but not necessarily take classes from or learn from them certainly isn't worth 25% of the overall score, IMHO.</p>

<p>I can't speak for all elite schools, but some have world-renowned faculty who are actually involved in undergraduate education. You can smell their fart because they have invited your small group class over to their house for ribs (true story, minus the fart part) </p>

<p>There needs to be a measure of faculty quality, and not just quantity, which is all you would have without the PA score.</p>

<p>Russ456,
While I concur with your desire to make some measurement of faculty quality, I think you fall into the trap of assuming that the only ones capable of judging a faculty are others in the academic community. Undoubtedly, many in the academic field are brilliant, make real contributions to research in their field, and also serve as effective teachers. However, just how brilliant they are, how relevant their research is, or how effectively they teach can be in the eye of the beholder. And in the case of PA, we have only one concentrated set of beholders, i.e., others in the academic community. </p>

<p>For every outstanding professor that I have met who is doing useful research that is relevant to the real world, I have probably met four who haven't got a clue about the business world and what is and is not useful information. Yet some members of this second batch of professors can often be ballyhooed as "world-class" by peers, but certainly would not be by people working in the fields in which they are associated. Throw in the near unanimity in leftist ideology that exists on college campuses today and the relevance of many, many academics is greatest only amongst themselves. </p>

<p>So, yes, please measure faculty, but involve some evaluators who actually work in environments with financial accountability, i.e., employers who hire the students and sometimes hire the professors themselves for consulting work. They can tell you quickly just who is and who is not adding value. Having money on the line focuses the mind and provides a truer understanding of the application of an academic's work. This "reality check" would go a long way to either supporting the reputation of the academic or exposing him/her as not quite the "star" that other academics have proclaimed. Not to mention the benefit that students might receive from a faculty that is more attuned to real world problems and less to academic theory. </p>

<p>My words may read more harshly than I intend as I really don't feel anti-faculty. However, I wouldn't expect academics to ever go for this because it would cede power to outside groups in making the judgments about the worth of academic research. Still, I do believe that there is a missing accountability in the academic world. Bringing in non-academic evaluators of the quality of their work (both in the research and the classroom as I also would like to see student input into the evaluation and grading) would do much to improve the quality of the faculty grading that USNWR purports to do.</p>

<p>Hawkette,</p>

<p>If there were a reliable way to measure the impact of faculty/students on the real world, that would be great. I think the Washington Monthly ranking tries to do this. However, since it does not rank Harvard, Yale, and Princeton as the top 3, it seems to be less popular. That is the trap most people fall into (perhaps yourself included).</p>

<p><a href="http://www.washingtonmonthly.com/features/2005/0509.collegeguide.html">http://www.washingtonmonthly.com/features/2005/0509.collegeguide.html</a></p>

<p>It actually makes sense that MIT would come out first because it focuses on research in science, engineering, and business. These areas have a profound impact on academics, industry, and Wall Street. Sorry, Noam Chomsky (linguist extraordinaire) and John Harbison (Pulitzer Prize-winning professor of music), though I loved your classes.</p>

<p>Based on your real world impact criteria (which I agree with) humanities oriented schools like Harvard and Yale should rank less highly on faculty quality. Even if you are a preeminent scholar on 13th century Slovakian poetry, it's hard to change the world with that knowledge.</p>

<p>Russ456,
I disagree with your conclusion above about how humanities programs would be evaluated. Actually, I think having a "real world" evaluation would do more than anything else to open people's eyes about the true value-added of a degree from a "prestigious" college. And this would almost certainly lessen the frenzy in college admissions, as students would realize that they can succeed professionally coming out of a great many undergraduate colleges. </p>

<p>Consider any number of industries outside of strictly engineering or medical-related fields. For the vast majority of non-technical students (80%+ of the undergrad population at most schools) who are majoring in a field unrelated to the industry they will work in, the reputation of the faculty (particularly its research reputation in an unrelated field) is of very low or zero importance to the employer. (One could argue that business majors also fall into this category, and certainly the number of business students has increased over the last decade, but I would also argue that financial employers are looking for bright students first and business majors second.)</p>

<p>Far more important to the employer are the student's critical thinking and communication skills, personal qualities, and level of preparedness to work effectively in an office setting. These are very hard to measure, and it is very hard to ascertain the faculty's role in fostering them, but you can see them in the workplace with students from a variety of schools and academic backgrounds. Put a student from Harvard and a student from U North Carolina in the same program and you can judge many things: critical thinking skills, ability to gather, interpret, assimilate, and apply information, ability to work with others, personal sense (or hopefully lack thereof) of entitlement, etc. Do this a few times over a few years and patterns emerge that can tell you a lot about the students coming out of X or Y school. You won't always get it right, and people are very different even coming out of the same school, but you can observe differences which go toward building an overall picture of an institution and the product that it produces. </p>

<p>With regard to your earlier point about humanities, I think that employers appreciate the intellectual training that students receive in those fields far more than the research accomplishments that a certain professor may have achieved in various publications. The student who majored in psychology or Russian language or English or religion may be every bit as bright as one in the technical fields and perhaps their training has been more constructive as it applies to problem-solving in the business world and working effectively with others. Those coming from highly theoretical educational backgrounds may have great intellectual ability, but need to understand the application of these ideas and they may never develop the judgment involved with this. </p>

<p>IMO, there are very bright students all over the country and in some of the most unlikely places. Employers know this and also favor the local/regional hire over someone from a distant "elite" school. For example, an employer in southern California is VERY happy to hire a smart young person from UCLA or USC rather than a student from Brown or Dartmouth or U Penn or whatever eastern elite you want to name (with the exception of HYPSM which I truly believe are the only schools with true national recruiting power). This regionalism is a powerful force, but acknowledging it would be a potential negative for the "elite" schools as it would comparatively diminish the appeal of their diploma. Having an employer in San Diego or Santa Barbara or San Francisco opine on the quality of faculty (and/or the usefulness of their research) would be quite risky for the educational elites who have spent decades building their prestigious reputations. </p>

<p>Not sure how I got here, but the point originally was that the research efforts (and reputations in academic circles) of many in academia are unrelated to the futures that their students will pursue. In my mind, the key for students and employers is how well the students are taught to think and apply this thinking. This has far greater individual consequences (good and bad) than whether Professor Smith won an award from some academic group or whether Professor Jones had an article published in an obscure academic journal. </p>

<p>Presently, PA does not capture that "real world" view, and I believe this is one of the many shortcomings of the PA scoring. I concur with your concern about how you get a reliable ranking of employer viewpoints, but I also believe that employers would probably get it more right than wrong. Furthermore, their opinions would have more value to students and others trying to assess the value-added of a faculty than the views of unknown academics assigning grades based on unspecified and perhaps irrelevant factors.</p>

<p>I think it's stupid that they don't rank Harvard 1st, and it would be even more so if Harvard drops one more spot. </p>

<p>Anyway, I'm going to Duke and would be happy if it goes up one.</p>

<p>no reason why the lacrosse case would affect it this year, since it didn't really affect it last year when it happened... now it's kind of forgotten... that being said, no reason Duke doesn't move up a space or two</p>

<p>best school = smartest kids IMO</p>

<p>That's why I thought the SAT depth percentiles (they were somewhere on this board a few weeks ago) were a good measure of school quality. A school where even the 25th percentile is in the high 600s/low 700s is going to be overflowing with knowledge spillover.</p>

<p>Here are the numbers for the USNWR Top 30 colleges for their % of students who scored over 700 on Critical Reading and on Math. </p>

<p>For Critical Reading</p>

<p>1 Yale 78%
2 Cal Tech 77%
3 Princeton 73%
4 Harvard 73%
5 MIT 68%
6 Columbia 67%
7 Dartmouth 65%
8 Brown 64%
9 Duke 63%
10 U Chicago 61%
11 Wash U StL 60%
12 Stanford 59%
13 Rice 57%
14 Tufts 56%
15 U Penn 54%
16 Northwestern 53%
17 Georgetown 53%
18 Notre Dame 47%
19 Vanderbilt 41%
20 J Hopkins 40%
21 Cornell 38%
22 USC 36%
23 Carnegie Mellon 33%
24 Emory 32%
25 UC Berkeley 31%
26 U Virginia 31%
27 Wake Forest 29%
28 U North Carolina 22%
29 U Michigan 21%
30 UCLA 21%</p>

<pre><code>Non Top 30 Schools that scored higher than some Top 30 Schools

William & Mary 44%
Brandeis 41%
Tulane 36%
NYU 30%
Boston College 26%
Case Western 25%
U Rochester 24%
</code></pre>

<p>For Math</p>

<p>1 Cal Tech 96%
2 MIT 92%
3 Yale 78%
4 Wash U StL 76%
5 Princeton 74%
6 Harvard 74%
7 Carnegie Mellon 71%
8 Dartmouth 70%
9 U Penn 69%
10 Duke 68%
11 Stanford 67%
12 Brown 66%
13 Northwestern 63%
14 Rice 63%
15 Columbia 60%
16 J Hopkins 60%
17 U Chicago 57%
18 Cornell 59%
19 Tufts 59%
20 Notre Dame 57%
21 Vanderbilt 52%
22 Georgetown 51%
23 USC 50%
24 UC Berkeley 46%
25 Emory 45%
26 U Michigan 43%
27 U Virginia 40%
28 UCLA 39%
29 Wake Forest 38%
30 U North Carolina 29%</p>

<pre><code>Rensselaer 48%
U Illinois UC 46%
Georgia Tech 44%
Case Western 44%
Brandeis 43%
Boston College 40%
Lehigh 39%
NYU 36%
W & M 35%
U Rochester 31%
U Wisconsin 31%
UC SD 30%
</code></pre>
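<p>If you wanted to fold the two lists above into a single "SAT depth" number per school, one simple option is to average the Critical Reading and Math shares. A minimal sketch, using a few rows from the lists; the equal weighting and the <code>sat_depth</code> name are my assumptions, not something the thread itself proposes.</p>

```python
# Combine the two lists: average the share of students scoring over 700
# on Critical Reading and on Math. Equal weighting is an assumption.
pct_over_700 = {
    # name: (Critical Reading %, Math %), sampled from the lists above
    "Cal Tech": (77, 96),
    "Yale": (78, 78),
    "MIT": (68, 92),
    "Harvard": (73, 74),
}

def sat_depth(name: str) -> float:
    """Mean of the CR and Math percentages of students over 700."""
    cr, math_pct = pct_over_700[name]
    return (cr + math_pct) / 2

# Print schools from deepest to shallowest on the combined measure.
for name in sorted(pct_over_700, key=sat_depth, reverse=True):
    print(f"{name}: {sat_depth(name):.1f}%")
```

<p>On this sample, Cal Tech's huge Math share pushes it past Yale even though Yale leads on Critical Reading.</p>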

<p>LACs, anyone?</p>

<p>What is the source for these numbers?</p>

<p>I did this some time ago and my memory is a little hazy, but I think it was from Yahoo Education. Look up any college there and it will provide a breakdown of the test scores.</p>