Undergraduate Education: Ranking What Counts

<p>For the schools ranked in the USNWR Top 50 National Universities, I re-created the rankings by category and then made two important changes. First, I substituted the Classroom Teaching ranking for the divisive Peer Assessment ranking. Second, I eliminated Alumni Giving. Otherwise, I used USNWR's 2009 ranking data and weights as follows:</p>

<p>25% USNWR Teaching Excellence Rankings
20% Graduation & Retention Rankings
15% Faculty Resources Rankings
15% Selectivity
10% Financial Resources</p>

<p>Note: these weights total 85%, not 100%. A raw score was computed for each school, and the schools were then ranked relative to the top scorer (Princeton). Here is how these schools compared:</p>

<p>Rank, College</p>

<p>1, Princeton
2, Yale
3, Harvard
4, Stanford
5, Duke
6, Dartmouth
7, Brown
8, Caltech
9, U Penn
10, Northwestern
11, U Chicago
12, Wash U
13, Rice
14, Notre Dame
15, Columbia
16, MIT
17, Emory
18, Vanderbilt
19, Cornell
20, Johns Hopkins
21, Tufts
22, Georgetown
23, U Virginia
24, Wake Forest
25, Carnegie Mellon
26, USC
27, UC Berkeley
28, UCLA
29, Brandeis
30, NYU
31, U Rochester
32, W&M
33, Boston Coll
34, U North Carolina
35, Lehigh
36, Case Western
37, Yeshiva
38, Rensselaer
39, U Michigan
40, UC Irvine
41, UC Santa Barbara
42, UCSD
43, Tulane
44, U Wisconsin
45, Georgia Tech
46, U Illinois
47, UC Davis
48, U Florida
49, U Washington
50, U Texas
51, Penn State</p>
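<p>As a rough sketch, the weighting scheme described above can be expressed as a simple weighted-rank computation. The weights come from the post; the school names and category rankings below are purely illustrative (not the actual USNWR data), and the 30th-place default for schools missing a teaching rank follows the clarification given later in the thread.</p>

```python
# Sketch of the modified-USNWR scoring described above.
# Weights match the post; all school data here is made up for illustration.

WEIGHTS = {
    "teaching": 0.25,        # Classroom Teaching (replaces Peer Assessment)
    "grad_retention": 0.20,  # Graduation & Retention
    "faculty": 0.15,         # Faculty Resources
    "selectivity": 0.15,     # Selectivity
    "financial": 0.10,       # Financial Resources
}

UNRANKED_TEACHING = 30  # default for schools absent from the 1995 teaching survey

def raw_score(category_ranks):
    """Weighted sum of a school's category rankings (lower is better)."""
    ranks = dict(category_ranks)
    ranks.setdefault("teaching", UNRANKED_TEACHING)
    return sum(weight * ranks[cat] for cat, weight in WEIGHTS.items())

def rank_schools(schools):
    """Order schools by raw score, best (lowest) first."""
    return sorted(schools, key=lambda name: raw_score(schools[name]))

# Illustrative data only -- not the real category rankings.
sample = {
    "School A": {"teaching": 1, "grad_retention": 1, "faculty": 2,
                 "selectivity": 3, "financial": 2},
    "School B": {"teaching": 3, "grad_retention": 2, "faculty": 1,
                 "selectivity": 2, "financial": 3},
    "School C": {"grad_retention": 40, "faculty": 35,  # no teaching rank -> 30th
                 "selectivity": 38, "financial": 30},
}
print(rank_schools(sample))  # School A scores lowest and comes first
```

<p>Because a rank of 1 is best, the lowest weighted score wins; ties in raw score would need a tie-breaking rule, which the post does not specify.</p>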

<p>Eh, I think I would boost the importance of financial resources and decrease selectivity…</p>

<p>Once again, when did USNWR ever measure “undergraduate teaching excellence”?</p>

<p>Hawkette, so in your ranking, a school gets 25% total credit if they happened to be named on a 1995 USNWR list that asked for schools with a “strong dedication to undergraduate teaching”?</p>

<p>Strong dedication =/= strong performance. You can be dedicated to something but still suck.</p>

<p>I don’t like the emphasis on Graduation & Retention Rankings. Many excellent schools are simply more challenging than some students think, and they learn the hard way.</p>

<p>One point of clarification. In the 1995 USNWR Commitment to Undergraduate Teaching survey, only 25 schools were ranked. As a result, I had to assign a ranking to all the others. So as not to completely trash the unranked schools, I assigned them all a ranking of 30th. Not great, but not awful either, and better for 20 of the 25 that were unranked. </p>

<p>As for the individual weights used by USNWR, I disagree with many of them. For purposes of making this comparison, however, I do not want to get sidetracked, but rather to show the impact of eliminating the two most notorious measurements (Peer Assessment and Alumni Giving) and of adding one measure that hardly anyone can argue against (an institution’s commitment to delivering a quality experience in the classroom).</p>

<p>Wrong. The schools were listed in order of most responses.</p>

<p>But weren’t the people answering this question the same biased academics you often complain “have no idea” what goes on in another university’s classroom?</p>

<p>ucb,
I fully accept your argument that the subjective nature of the responses undermines the conclusiveness of the final result. However, at least in the case of the Teaching ranking, it’s much clearer just what the “voters” are being asked to vote on. </p>

<p>As for the rankings above, I suggest that you and others think a little about the schools on the list. Most have exceptionally good reputations AMONG STUDENTS and AMONG GRADUATES for what goes on in the classroom. I think that this is the key difference with the PA-based USNWR rankings. I care most about the student experience and not how some unknown academic judges (guesses!) the academic quality of programs, large and small and hugely different, all across the nation. My belief is that the rankings above are a better representation of the undergraduate academic experience that a typical student will encounter. </p>

<p>Ideally, I think that the USNWR rankings would be much improved by listing the PA rankings separately, re-instituting the Teaching rankings, and using only objective data to compile their actual Best Colleges ranking. The time may also have come for the publics to be ranked separately as their missions are commonly different from most privates and the same ranking scale is not appropriate.</p>

<p>HAHAHA, wow, this guy’s out of control. i started laughing as soon as i saw the title and it’s made by hawkette. yeshiva over michigan. this must be the magic formula!</p>

<p>Really? All of these schools offer a better undergrad classroom experience than Michigan?</p>

<p>There are very, very good schools here at the bottom of your list.</p>

<p>A lot of these schools are kinda limiting in what they offer academically - some lack breadth and depth.</p>

<p>I would trust a bunch of academics over a 20 yr. old thinking she knows more than she actually does… :D</p>

<p>Hawkette, as you know I’m new to CC and have become a fast fan of yours. Rumor has it you’re a Gator (me, UF undergraduate). </p>

<p>This is an interesting new wrinkle in rankings. However, after applying 2008 numbers to your SAT 700+ and ACT 30+ system, it seems to me any methodology/criteria that has UC SB, Irvine, or Davis ahead of UT, UF, GWU, UWash, or PSU has to be questioned. Maybe this list will change when you use 2008 (2010 USNWR) numbers.</p>

<p>once again, why does everything need to be “ranked”…can’t we just accept that there is no clear “winner” or “better school”</p>

<p>Isn’t “undergraduate teaching excellence” just another form of peer assessment? … and a very outdated one indeed. What “objective data” was it based on? Btw, how many people were involved in the survey and what was the response rate? If a school was excellent in teaching in 1995, is it equally excellent in 2009? A lot can change in 15 years, just look at USC.</p>

<p>But very few of the people surveyed were qualified to give an educated opinion. How do you know the “teaching excellence” of another department, much less another school? Even if you know that teaching in the English department at your own university is good, how do you know that it is equally good in the Engineering department? How do you even have a clue about the teaching quality at another school unless you have taught there recently?</p>

<p>At least for peer assessment, you have some idea of the quality of the university based on the reputation and quality of its programs, faculty, facilities and financial strengths. Besides, overall peer assessment correlates well with department rankings.</p>

<p>hawkette, is there any way you can expand this system to include all the national university “top tier” schools? thanks!</p>

<p>Even USNWR doesn’t believe in it. That’s why it only did it once in 1995 and abandoned it.</p>

<p>^ exactly… I’m pretty sure USNews considered ranking based on teaching qualities rather than academic qualities 15 years ago… Of course it is a good idea, but there were better options to consider…</p>

<p>Kb, ucb, goblue, phead,
You’re playing defense…which is understandable given that your schools don’t fare as well in this “ranking.” I just wish you would make a more substantive effort to refute why the UNDERGRADUATE environment at any of your schools merits a position higher than the schools ahead of them. It’s not like the schools ahead of yours are weak and undeserving…</p>

<p>Look, I think I understand the USNWR methodology and its pitfalls as well as anyone on this board. I grant you that the Teaching metric is far from perfect and does need updating, but I am quite confident that it is a better tool for many, many, many prospective students in their college search & selection. Every ranking has its holes, but this collection of datapoints is more relevant to the average undergraduate student than the current USNWR formula. </p>

<p>My interest is in the environment that a school (public or private) creates/provides and how that environment impacts the student and his/her experience. IMO, the key building blocks for a great undergraduate academic setting are:

  1. Strong student body
  2. Small class sizes
  3. Great classroom teaching
  4. Deep financial resources and an institutional willingness to spend them on undergraduates</p>

<p>As for the rankings themselves, I suggest you review the list and give credit to the schools ranked ahead of yours. What is often unappreciated in these forum discussions is the high quality academic environment that many schools provide for their UNDERGRADUATE students. Acknowledging their strength does not mean that your school is bad. </p>

<p>On nearly all of the key elements, the best privates outperform the best publics and, in most cases, it’s not very close. ABC Highly Ranked Public may have a lot of programs and be well thought of within academia (probably due to a research reputation created mostly by grad students and profs less than laser-focused on undergrads), but that hype does not levitate them over schools that make undergraduate education their priority. </p>

<p>Pierre,
I don’t have the data on the schools outside of the Top 50 and that is why I don’t include them. It’s a shame because I suspect that there are many colleges just outside the top 50 that would compare very well with many ranked in the 40-50 category. Maybe next year I’ll extend the study group. Sorry.</p>

<p>How is that fair to state schools, which may have smaller endowments yet still receive significant support from their state governments? Or how do you account for that?</p>

<p>hawkette</p>

<p>I find it hard to believe that you keep trotting out that 15-year-old survey about teaching excellence, which was based on questioning the same academics whom you continually denigrate when you criticize the Peer Assessment numbers. It is illogical (not that that has ever stopped you before) to say that one survey is correct but the other is not simply because its question was somehow more specific. As goblue81 has pointed out, we don’t know anything about what those surveyed knew or could have known about the subject of teaching excellence (one of your main objections to the peer assessment). However, if you believe that school administrators and academics know something about the quality of teaching at numerous universities, then you should equally believe that these very same people are capable of assessing the overall academic strength of a school. </p>

<p>For those of you who are interested, the 15-year-old survey was based on asking 2,700 academics to name the 10 schools in their category where the faculty “has an unusually strong commitment to undergraduate teaching.” Hawkette has noted several times that the survey is dated and probably not that useful. For example, here’s what Hawkette said in Aug. 2007 about the Teaching Excellence survey:

And in March of this year, Hawkette said:

</p>

<p>Notwithstanding this, hawkette keeps on running the numbers to come up with rankings and, just as in this thread, often replaces the peer assessment numbers with the outdated teaching excellence numbers, in spite of his/her claim that neither of these numbers should be part of any rankings calculation.</p>