<pre><code>Schools of Business
1. Harvard University (MA)
2. Stanford University (CA)
3. Northwestern University (Kellogg) (IL)
3. University of Pennsylvania (Wharton)
5. Massachusetts Institute of Technology (Sloan)
5. University of Chicago
7. University of California-Berkeley (Haas)
8. Dartmouth College (Tuck) (NH)
9. Columbia University (NY)
10. Yale University (CT)
Schools of Education
1. Vanderbilt University (Peabody) (TN)
2. Stanford University (CA)
3. Teachers College, Columbia University (NY)
4. University of Oregon
5. University of California-Los Angeles
6. Harvard University (MA)
7. Johns Hopkins University (MD)
7. Northwestern University (IL)
7. University of California-Berkeley
7. University of Texas-Austin
7. University of Wisconsin-Madison
Schools of Engineering
1. Massachusetts Institute of Technology
2. Stanford University (CA)
3. University of California-Berkeley
4. Georgia Institute of Technology
5. University of Illinois-Urbana-Champaign
6. Carnegie Mellon University (PA)
7. California Institute of Technology
7. University of Southern California (Viterbi)
9. University of Michigan-Ann Arbor
10. University of Texas-Austin (Cockrell)
Schools of Law
1. Yale University (CT)
2. Harvard University (MA)
3. Stanford University (CA)
4. Columbia University (NY)
5. New York University
6. University of California-Berkeley
6. University of Chicago
8. University of Pennsylvania
9. University of Michigan-Ann Arbor
10. Duke University (NC)
10. Northwestern University (IL)
10. University of Virginia
Schools of Law (Part-time)
1. Georgetown University (DC)
2. George Washington University (DC)
3. Fordham University (NY)
4. American University (DC)
5. George Mason University (VA)
6. University of Maryland
7. Temple University (PA)
7. University of San Diego
9. University of Denver (Sturm)
10. Illinois Inst. of Technology (Chicago-Kent)
Schools of Medicine (Research)
1. Harvard University (MA)
2. Johns Hopkins University (MD)
3. University of Pennsylvania
3. Washington University in St. Louis
5. University of California-San Francisco
6. Duke University (NC)
6. Stanford University (CA)
6. University of Washington
6. Yale University (CT)
10. Columbia University (NY)
</code></pre>
<p>Let me see: USC is not rated in the top 10 in any engineering subspecialty at USNWR, and yet it is rated number 7 at the grad-school level for engineering. I’ll admit I haven’t read the methodology used by USNWR, but this makes no sense to me whatsoever. Grad school is all about specialization. How can USC be rated higher than Michigan if that is the case? There has to be something wrong here.</p>
<p>I wondered about the same thing, and that led me to investigate. Here’s what I found:</p>
<p>Step 1: a quick glance at the data:
USC has the highest % of faculty who are NAE members.
Both the peer and recruiter assessments for USC are outside the top 25.</p>
<p>Step 2: determine what jumps out as unusual:
USC has a higher % of NAE faculty than even Berkeley/Stanford/MIT(?)<br>
If USC really does have that many NAE members, how come its peer assessment score is so low (that’s like Chicago having a bunch of Nobel laureates but low peer ratings for its economics department)? How come none of its departments is in the top 10 (those NAE members must reside somewhere, not in a vacuum)?</p>
<p>Step 3: investigate the NAE %:
<a href="http://viterbi.usc.edu/about/facts.htm">USC - Viterbi School of Engineering - Facts</a>
It says, “More than a third of its 165 faculty members are fellows in their respective professional societies. With 33 faculty members elected to the National Academy of Engineering”.
Click the link “National Academy of Engineering” to see the list. The ones in black are not faculty. Even among the ones in red, some are still not. Even among the ones that are, some are no longer “active” but emeritus (retired); they shouldn’t be counted for US News either. Now you see there are <em>a lot</em> that shouldn’t be counted for US News… What’s more, the engineering school does have 165 active full-time faculty, so many of the people on this list are actually <em>excluded</em> from that 165. The % was therefore overstated even further.
Another thing I discovered by looking at the bios is that the engineering school got a few of these members to join when they were already pretty old and (presumably) retired. One example is Simon Ramo:</p>
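<p>To put rough numbers on this (a back-of-the-envelope sketch only: the 33 and 165 figures come from the Viterbi facts page quoted above, while the count of roughly 10 active full-time NAE members is the estimate given further down the thread, not an official figure):</p>
<pre><code># Rough arithmetic only. 33 and 165 come from the Viterbi facts page quoted
# above; the "about 10" active full-time NAE count is an estimate from the
# discussion, not an official number.
listed_on_nae_page = 33      # everyone the school lists as an NAE member
active_fulltime_nae = 10     # estimate after excluding emeritus, industry, etc.
fulltime_faculty = 165       # active full-time faculty reported by the school

reported_pct = 100 * listed_on_nae_page / fulltime_faculty
corrected_pct = 100 * active_fulltime_nae / fulltime_faculty
print(f"reported: {reported_pct:.1f}%  corrected: {corrected_pct:.1f}%")
# prints: reported: 20.0%  corrected: 6.1%
</code></pre>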
<p>So what you are saying, Sam Lee, is that USNWR allows schools to completely game their numbers. I think this is really an indictment of the whole ranking system they use. I cannot believe that they get away with this and are not called to task more often.</p>
<p>UCLA and UCSB also sent wrong numbers (in last year’s ranking; I don’t have this year’s copy). According to the NAE website, UCLA and Northwestern really have the same number of NAE members. But in US News, the percentage was 14.1% for UCLA and only 5% for Northwestern (14.1% is only slightly lower than MIT’s %). The number of full-time professors at the two schools is roughly the same.</p>
<p>Although this category accounts for only 7.5% of the total and therefore looks insignificant on paper, its impact can be significant. This is probably because the differences in overall score among the top 30 are very tiny, so any clear advantage in a seemingly minor category gets magnified. In the case of Northwestern vs. UCLA, Northwestern had better numbers in every category except this NAE % (14.1% vs. 5%, when in fact the two should be about the same). The end result? UCLA was ranked 13th and Northwestern 20th, instead of UCLA coming in below Northwestern.</p>
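<p>A toy calculation shows how that magnification can happen. Only the 7.5% weight comes from the discussion above; the other weights and all of the per-category scores below are invented purely for illustration and are not US News data:</p>
<pre><code># Hypothetical two-school comparison. Only the 7.5% weight on the NAE
# percentage comes from the thread; everything else here is made up.
WEIGHTS = {"peer": 0.25, "recruiter": 0.15, "research": 0.525, "nae_pct": 0.075}

def overall(scores):
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# School A is slightly stronger in every category except the NAE percentage,
# where School B reports an inflated figure.
school_a = {"peer": 80, "recruiter": 78, "research": 76, "nae_pct": 30}
school_b = {"peer": 79, "recruiter": 77, "research": 75, "nae_pct": 90}

print(overall(school_a), overall(school_b))  # ~73.9 vs ~77.4: B comes out ahead
</code></pre>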
<p>I like how Hopkins’ newly opened School of Education skyrocketed into the top 10 out of nowhere. I was initially trying to find it among the top 20 and top 30 and it wasn’t there… A five-way tie at #7 is not bad compared to #24 last year (when it was called The School of Professional Studies in Business and Education).</p>
<p>GoBlue81,
It doesn’t stop there. See my post above yours. Even the NAE website’s numbers shouldn’t be used, because they include emeritus professors, who can be a huge portion (over 50% isn’t uncommon). Only active full-time professors should be counted for USN (exclude emeritus, adjunct, part-time, alumni, trustees, etc.). So the number they should send to USN is supposed to be more like 10, not 22, and definitely not 33.</p>
<p>Only the ones in red are even possible, but read about them and you’ll see that most of them don’t really qualify (industry people, the provost, trustees, emeritus). And these people are not included in the 165, so the school does have 165 active full-time faculty. Apparently they know what “active full-time” means when it comes to the grand total, but not when it comes to counting NAE members for US News purposes.</p>
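<p>Put another way, the counting rule being described amounts to filtering the published NAE list by appointment status before computing any percentage. The roster below is entirely hypothetical; only the exclusion categories (emeritus, industry, trustee, and so on) come from the posts above:</p>
<pre><code># Hypothetical roster; the statuses mirror the exclusions discussed above.
nae_list = [
    {"name": "Member 1", "status": "active full-time"},
    {"name": "Member 2", "status": "emeritus"},
    {"name": "Member 3", "status": "industry"},
    {"name": "Member 4", "status": "trustee"},
    {"name": "Member 5", "status": "active full-time"},
]

countable = [m for m in nae_list if m["status"] == "active full-time"]
print(len(nae_list), "listed, but only", len(countable), "should count for US News")
</code></pre>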
<p>The two rankings use different methodologies. In any case, the “specialty ranking” is just a popularity poll taken among deans of engineering colleges (many of whom are senior professors who are no longer very active in research).</p>
<p>Of course, the right way to evaluate graduate programs would instead be to consider faculty scholarly productivity. That’s why the FSP index is a much better indicator of quality at the graduate level than USN&WR.</p>
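<p>For what it’s worth, a per-faculty productivity measure of the general kind the FSP represents can be sketched in a few lines. This is not the actual FSP methodology (which, as noted below, is proprietary); the function name, weights, and sample inputs are placeholders meant only to illustrate normalizing output by headcount:</p>
<pre><code># Minimal sketch of a per-capita scholarly-productivity score.
# NOT the real FSP formula (that methodology is proprietary); the weights
# and sample inputs are arbitrary placeholders.
def per_capita_productivity(publications, citations, faculty_count,
                            w_pubs=0.5, w_cites=0.5):
    """Normalize a department's output by headcount so that large and
    small faculties can be compared on the same scale."""
    return (w_pubs * publications + w_cites * citations) / faculty_count

# Hypothetical departments: the smaller one publishes less in total
# but comes out ahead per faculty member.
print(per_capita_productivity(publications=400, citations=6000, faculty_count=165))  # ~19.4
print(per_capita_productivity(publications=220, citations=3600, faculty_count=80))   # ~23.9
</code></pre>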
<p>Not for all programs. In some disciplines, such as engineering, there is a great disconnect between what people do in academia and what they do in industry. What industry wants is different from what academia wants, and faculty scholarly productivity doesn’t tell you anything about how industry feels about a school and, consequently, how easy it is to find a job at graduation. For that reason I think US News is just as valid in areas where graduates are not confined to academia.</p>
<p>IMO it’s a bit of both - the practical/trade-school side of engineering and advancing knowledge. Which schools do both successfully? A research institution with a commitment to teaching seems to have the best of both worlds, but I haven’t seen such a ranking.</p>
<p>The FSP is one useful measure of faculties, but it has some limitations, too. First, it’s based on proprietary, non-public data, and many universities have expressed concerns that they have no way of verifying the information on which it’s based or of correcting errors. Second, it apparently makes no distinction between publishing in a prestigious journal and a third-rate one; each counts equally, even though one is obviously a more significant achievement than the other. Third, the greatest and most influential scholars are not necessarily the most prolific. Ludwig Wittgenstein, possibly the most influential philosopher of the 20th century (in the English-speaking world, at any rate), basically published one book in his lifetime; another was published posthumously. Ronald Coase, the hugely influential University of Chicago economist and Nobel laureate, wrote only a small handful of articles over the course of an entire career. Over a long time span their influence would be captured by citation counts, but the FSP uses citation counts over such a short time span, some 3 or 4 years, that both Wittgenstein and Coase would probably have been rated as having close to zero scholarly productivity at the peak of their most creative and original periods. Not to say the FSP is worthless, but like the US News PA rating it has to be taken with a grain of salt. It’s just one more highly imperfect data point.</p>
<p>bruno,
I agree that they don’t have to match very well, but if you think about the methodologies carefully, a huge discrepancy raises a red flag. That’s what made people like rjkofnovi and me wonder; it’s what prompted me to investigate in the first place.</p>