<p>Oh, Level, I see you’ve been taken in by our famous and mysterious friend “posterX”. The report to which he is referring is not universally accepted as authoritative. Here are the links to each of the two disciplines in question showing data offered to the Chronicle of Higher Education by the private company that does this survey:</p>
<p><a href="http://chronicle.com/stats/productivity/page.php?primary=5&secondary=50&bycat=Go">http://chronicle.com/stats/productivity/page.php?primary=5&secondary=50&bycat=Go</a></p>
<p><a href="http://chronicle.com/stats/productivity/page.php?primary=5&secondary=51&bycat=Go">http://chronicle.com/stats/productivity/page.php?primary=5&secondary=51&bycat=Go</a></p>
<p>Though these rankings were reported in the Chronicle of Higher Education, they are not associated with that publication (something the poster does emphasize). Instead, they are the work of a private company called Academic Analytics, of which the State University at Stony Brook is a partial owner. These are rankings not of the overall reputation or quality of particular departments, but of something the authors call “Faculty Scholarly Productivity,” essentially a measure of how frequently faculty members appear in print or are cited.</p>
<p>Unfortunately, the index analyzes only three years’ worth of journal articles (2003, 2004, and 2005) and five years’ worth of books (2001 through 2005). Since even the best scholars typically are not publishing every year (or a new book even every few years!), the severe limitation of the period in which the data are gathered undermines their usefulness. Furthermore, you’ll read in the notes that they used Amazon.com as their source for identifying published books, thus leaving it up to that company to decide what would be included in the study. You’ll also note that, to count the total number of faculty members, the company used a web crawler to visit university websites and attempt to extract the information. Apparently, this was not terribly successful, so the company asked each school in the study to correct the faculty lists it sent them, but most schools refused to cooperate, since doing so would have taken time and was for the benefit of a private company that charges $30,000.00 a year to each university that subscribes to its data.</p>
<p>To its credit, The Chronicle of Higher Education wrote what appears to be a fairly balanced examination of the work of this company. You’ll see that there is some praise (mostly from universities paying $30,000.00 a year for the service) but also some withering criticism of the techniques that have led to some very curious results such as the following in their ranking of English departments.</p>
<p><a href="http://chronicle.com/stats/productivity/page.php?primary=10&secondary=89&bycat=Go">http://chronicle.com/stats/productivity/page.php?primary=10&secondary=89&bycat=Go</a></p>
<p>(from the Chronicle’s article)</p>
<p>“But a close look at other data in the index has some college officials raising their eyebrows. The University of Georgia’s No. 2 ranking in English, for instance, has caused some scoffing. Say you have an exceptionally bright undergraduate poised to enter graduate school in English, says one university administrator who did not want to be named: ‘Would you really recommend the person attend the University of Georgia? It’s where this unidimensional figure gets out of touch.’”</p>
<p>Now, as an advocate for Princeton, which is ranked #1 in English by this study, you might find it odd that I should criticize its system. Look, however, at the top ten universities for English. I won’t quibble with the University of Georgia’s rank at #2, but neither Yale nor Harvard ranks even in the top ten. (I suspect our friend “posterX” will dismiss this particular ranking as it applies to Yale!) Both Yale and Harvard are widely known to have outstanding English departments, and it simply doesn’t make sense that neither would make even the top ten. The problem is in what is being measured and how it’s being measured.</p>
<p>Another example will be found here, where, in Philosophy, Princeton is in a tie for second and Harvard and Yale (again, both known for being powerhouses in this area) don’t make the top ten.</p>
<p><a href="http://chronicle.com/stats/productivity/page.php?primary=10&secondary=91&bycat=Go">http://chronicle.com/stats/productivity/page.php?primary=10&secondary=91&bycat=Go</a></p>
<p>Here is the article in the Chronicle that reviews the company’s work.</p>
<p><a href="http://chronicle.com/free/v53/i19/19a00801.htm">http://chronicle.com/free/v53/i19/19a00801.htm</a> (the Chronicle article evaluating Academic Analytics)</p>
<p>It would be very interesting to see the rankings that this company produces next year. I would bet, given its questionable data-gathering techniques, that the rankings will change significantly from one year to the next, due in part to the unpredictable schedule of book publishing by faculty members.</p>
<p>The real “gold standard” in this kind of evaluation is the work of the National Research Council. Unfortunately, its last study is from 1995, and the new one that was supposed to be out in 2005 has been delayed and may not appear until 2008. Here is a link to the 1995 report, where you can see an evaluation of engineering programs that I believe most people would say makes more sense.</p>
<p><a href="http://www.grad.berkeley.edu/publications/pdf/nrc_rankings_1995.pdf">http://www.grad.berkeley.edu/publications/pdf/nrc_rankings_1995.pdf</a></p>
<p>In that study, for example, Berkeley ranks 4th in electrical engineering whereas it isn’t even in the top ten in the Academic Analytics study.</p>