PhD production - Economics

<p>Xiggi:</p>

<p>How are you defining "quality" in an undergrad Econ department?</p>

<p>To me, the most relevant definition of quality would center entirely on teaching expertise, motivating students, and developing a keen interest in the field.</p>

<p>You could have a very high quality department with one professor, depending on the number of students.</p>

<p>Interesteddad, while that may be possible, it would be rather strenuous for a one-person department to offer the depth and breadth necessary to accommodate a reasonable college. I know that small classes are a benefit of LACs, but I doubt that we want to return to the High Middle Ages or the Renaissance. </p>

<p>Regarding size, I'm wondering how Deep Springs would fare in your specialized rankings. They would need only a handful of PhDs to get the gold medal in many fields. :)</p>

<p>PS Remove "quality" from my list of criteria, and my position does not change much. Do you really believe that the schools listed at 4, 5, and 10 have become "known" economics juggernauts at the UG level?</p>

<p>Again, I don't understand the concept of an "economics juggernaut at the undergrad level".</p>

<p>To me, there is only one criterion that matters when evaluating an undergrad department: how effectively do they teach undergrads? How well do their students learn to analyze, think critically, formulate ideas, and communicate? This is based on many factors, but the three biggies would be the quality of the students, the skill of the teaching, and the degree of engagement by both.</p>

<p>From a numbers standpoint, size only really matters on a faculty to student ratio basis. A department with 100 professors and 2000 majors isn't "better" than one with 10 professors and 200 majors. It may offer more variety (just as a department store offers more variety than a boutique store), but we aren't talking vocational training with a goal of memorizing facts. IMO, an undergrad curriculum teaches a process.</p>

<p>I don't know a darn thing about the econ departments at Carleton, Oberlin, or Grinnell. Don't need to. It is clear from the data that they are doing an excellent job of preparing their econ majors for advanced graduate-level work. They could have Bozo the Clown teaching the classes, and whatever Bozo is doing is obviously working!</p>

<p>We're talking undergrad economics here. It's not like every Econ PhD in America doesn't understand the fundamental concepts and analysis techniques. We are only talking about 8 to 12 courses and a senior thesis over a four year period and half of those courses are standard entry-level courses taught at every college in the country. What matters is how well they are taught.</p>

<p>I think the "how's the XXX department" questions for choosing an undergrad school are mostly relevant the further the student's interests diverge from the bread 'n butter fields of Econ, Poli Sci, English, Psych, Bio, etc. There are less popular departments where the breadth of offerings may become an issue: languages, music performance, linguistics, classics, etc. But, I can't think of a college in the country that wouldn't offer 12 solid Econ courses.</p>

<p>I am uncertain why xiggi apparently selectively rejects the evidence. What's particularly interesting about this sort of analysis is that it doesn't start with the presumption (as the USNews has been documented as doing) that any rating system has to come out demonstrating that our predispositions are correct (which schools ought to come out on top). While at some level we do care about "face validity" of the system, we also should be open to the idea that we can discover things we didn't already know (or think we already knew) if we actually collect data and analyze them.</p>

<p>That the evidence reveals that many small schools are highly productive of future PhD's is an important fact about American education. (I think, interesteddad, that if you could calculate the proportion of PhD's that are produced at LAC's vs. all other schools -- adjusting for number of graduates -- you would show direct evidence of the above conclusion. It's a conclusion that the NSF/NAS itself reached some years ago from looking at this kind of data.)</p>

<p>The data are very appropriate empirical indicators of one form of productivity or effectiveness of undergraduate programs. There is virtually no other systematic indicator of such performance -- certainly no such "output" measures are incorporated into the USNews ratings (except for graduation rates and alumni giving, which tell you little about what is learned and especially not by academic discipline).</p>

<p>I think the PhD production rates vary by field. Several studies have shown that 4-year undergrad colleges produce science PhDs at nearly double their expected rates based on enrollment. However, they produce Engineering PhDs at about half their proportional rate (most undergrad colleges don't offer engineering).</p>

<p>From a quick look at baccalaureate degrees by institution type from '89-'98 and PhD production from '94-'03, it appears that undergrad colleges represent about 11% of the total PhDs and 11% of the enrollment. Thus, the tables I've posted seem to be more indicative of individual schools than of a particular type of school. However, the data I'm using include all of the lower-tier state college campuses without graduate programs (many of which were junior colleges not that long ago), so it's a mix of very good LACs and relatively weaker large 4-year colleges.</p>
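<p>The per-1000-graduates normalization being discussed can be sketched in a few lines. A minimal sketch follows; the school names and counts are made-up placeholders, not actual NSF or baccalaureate data.</p>

```python
# Illustrative sketch of the per-capita PhD-production normalization.
# The names and counts below are hypothetical placeholders, NOT real data.
schools = {
    # school: (PhDs earned by its graduates, baccalaureate degrees awarded)
    "College A": (120, 9_000),
    "College B": (450, 60_000),
    "College C": (35, 2_500),
}

def phds_per_thousand(phds, grads):
    """PhDs produced per 1,000 baccalaureate graduates."""
    return 1000 * phds / grads

# Rank by the normalized rate rather than the raw PhD count.
ranked = sorted(
    ((name, phds_per_thousand(p, g)) for name, (p, g) in schools.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, rate in ranked:
    print(f"{name}: {rate:.1f} PhDs per 1,000 grads")
```

<p>Note how the ranking flips relative to raw counts: College B produces the most PhDs in absolute terms but the fewest per graduate, which is the point of the per-1000 control factor debated in this thread.</p>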

<p>My guess would be that once you get outside of engineering or other substantial majors for which small colleges or undergraduate colleges typically don't offer degrees (this may depend in part on how broadly you define engineering, for example whether it includes computer science), you would see the disproportionality that I mentioned in almost all major fields (all the ones for which you have done your tabulations so far). But that's just my surmise.</p>

<p>"I am uncertain why xiggi apparently selectively rejects the evidence."</p>

<p>Mac, allow me to shed some light on this issue! First, I'm not sure who rejects the "evidence" and is apparently selective here. There is no "evidence" in the numbers presented by I-Dad, and the conclusions are at best dubious, mostly because of the introduction of unverifiable control numbers. I believe that I-Dad proposed the solution in his first ranking thread: "Some have complained that these lists don't provide useful data. Proposed Solution: ignore the lists". I have no problem abiding by this advice in general, but the discussions related to the economics department, and especially the conclusions, do deserve questioning. For instance, a glaring limitation is that the path from undergraduate to a PhD is not necessarily a direct one. Doesn't the school where a Master's degree is earned carry any importance? </p>

<p>Part of this discussion started in a thread in the College Selection forum. In that thread, Interesteddad seemed to take great exception to a ranking of economics departments among LACs. In his own words, the rankings were as bogus as his own PhD rankings. The problem with his reasoning is that the latter statement is correct, but the first one blatantly contradicts a number of recognized studies. In that same thread, I invited I-Dad to check the works of Tom Coup</p>

<p>Xiggi, interesteddad is hardly the first to tabulate the figures that he has tabulated. He didn't select this indicator randomly or just selectively to support a position. He seems to me just to be looking for something hard on which to make systematic assessments of the overall intellectual climate (and the microclimates of instruction in different fields) across colleges and universities in the U.S.</p>

<p>Indicators of the quality of undergrad instruction that are based on assessments of the reputations of faculty (e.g., using USNews or the NAS -- National Academy of Science -- rankings) are misleading because they don't say anything specific to undergrad training. USNews' use of class size or teacher-student ratios is highly suspect for a variety of reasons (a lot of their published numbers just aren't believable, for one thing). But an alternative to just throwing up one's hands is at least to try to see what one can learn from relevant, available, even if imperfect, systematic data. For that interesteddad's efforts should be thanked, not dismissed out of hand as you seem to do.</p>

<p>He also doesn't say that the quality of education in economics (or any other field) "has to be measured" (this "mandatory" aspect is your misreading of what he says) by the number of terminal degrees that are generated. What he does say is that the number of PhD's is probably a reasonable proxy for the quality of undergraduate training and the overall intellectual atmosphere of a college. </p>

<p>To my knowledge, there is no other accepted indicator of the quality of undergrad training by discipline across many disciplines, or, for that matter, of the quality of undergraduate education in general. There are some far weaker approximations of quality, which I referred to, namely alumni satisfaction and graduation rates. Also, with respect to specific disciplines or fields, there are some "reputational" (subjective) rankings, i.e., where "experts" in the field rank undergrad programs in areas such as business and architecture. This is also how USNews ranks most doctoral programs -- by reputation (I've been one of their "respondents" to such rankings so I know how they do them from observation). The colleges often gather data as well on placement rates of their students -- employment upon graduation or admission to graduate and professional schools. But unfortunately there is no unified collection of such data. This is one reason why we do our best (and it's not as good as we'd like) by trying to use things like PhD production (which the NSF and NAS monitor on a regular basis), as well as a variety of "reputational" data.</p>

<p>It would be desirable to have several objective measures of the quality of undergrad education overall and by field/discipline. Most that I could think of -- including placement and professional school admission rates -- wouldn't tell you the true value added of an undergrad program. That's true also of the PhD statistics. Swat, for example, admits an initially outstanding pool of students who are committed to a demanding curriculum. Does it "add value" to those students, or would those students do just as well attending NE State U? I happen to think Swat does add value, and that the relative strengths of its programs are reflected in the PhD productivity figures, but there's no systematic way to prove this from the available data.</p>

<p>
[quote]
I-Dad's comment that "It has about as much bearing on the quality of undergrad teaching as the number of ice cream flavors sold in the university food court."

[/quote]
</p>

<p>Yes. I stand by that assessment of any "ranking" system that uses published pages by Economics professors in a department as a qualitative measure of undergrad education.</p>

<p>If you talk to college professors, the oldest debate in the book is publish versus teach. Because there are only so many hours in the day, it is a zero-sum game. The trade-off between pressure to publish (and/or maximize research revenues for the university) and pressure to excel in undergrad teaching is inherent in the tenure guidelines at every school in America.</p>

<p>Evaluating the priorities set by any college or university for its faculty should, IMO, be a key component in deciding what type of school to choose. The reason that LACs tend to have such high per capita PhD production rates is that something is happening in the teaching style at those schools to fully engage higher than average numbers of students and give them the kind of mentoring/research/analysis opportunities to make them good graduate school candidates. </p>

<p>The same high per capita rates appear to hold true for professional schools (Law, Med, Biz) as for PhDs. However, there is no systematic tracking comparable to the 85 years of data in the NSF PhD database. If I'm reading the stats correctly for Swarthmore, something like 18% of Swat grads end up going to med school and another 13% go to law school. Those are huge, huge numbers.</p>

<p>BTW, the Masters school isn't necessarily relevant. In many cases, the PhD program is a standalone track. In other words, you graduate from college, maybe do a couple of years as a research assistant (or not), and go directly into a PhD program. For example, three of Swat's eight Chem majors graduating last year went directly into PhD programs: one each at Yale, Berkeley, and Columbia.</p>

<p>In other fields (social work, education, engineering, biz), a Master's degree is the more common terminal degree.</p>

<p>" For that interesteddad's efforts should be thanked, not dismissed out of hand as you seem to do."</p>

<p>Mackinaw, I am afraid I have to disagree here, as well as with your comments about I-Dad NOT trying to measure the relative value of the economics departments. Producing a ranking list IS a measurement. I have no problem whatsoever with the exercise of counting the number of PhDs produced by Swarthmore -or the other listed schools-, or with the control factor of establishing a ranking per 1000 students. I would be glad to thank I-Dad for the effort of segregating that data from the giant databank of the NSF. However, my problems start with the spurious conclusions reached from the same data. As I have said before, the ONLY conclusion that can be reached from the data is ... that there are some differences in the number of PhDs among schools. Nothing MORE!</p>

<p>Again, allow me to repeat that the exercise is absolutely irrelevant to the world of economics at the undergraduate level, because the number of annual PhDs in Economics represents a VERY small fraction of the population of students in Economics. Have you really checked the landscape for admissions at PhD programs? We are talking about 200 ANNUAL admissions at leading programs. In the simplest terms, there are many attributes that define a successful program in Economics, and the production of PhDs ought to be one of the least relevant. </p>

<p>As far as rankings, I am on the record for stating that I do not believe in the existence of any truly meaningful rankings for measuring the quality of schools. Some rankings -a la US News- provide valuable insight for admissions purposes, but this value is restricted to quantifiable and measurable elements such as selectivity. The bulk of the report, and especially the final ranking, is a complete joke. </p>

<p>However, if you insist on finding rankings of higher education in Economics, there ARE acceptable and recognized standards. Like it or not, the standards are based on the number of citations and quality pages. The resulting rankings DO seem to confirm the "overall" reputation of schools. Those rankings are prepared by economists and, with subtle variances, seem to be pretty consistent. While they might not apply to the quality of undergraduate institutions, I'd rather adopt conclusions reached via such methodologies than a set of conclusions that lack relevance to the field of economics. </p>

<p>Lastly, I'd like to comment on another issue. While there is ample evidence that LACs do contribute solidly to the number of PhDs -as I-Dad has pointed out- I would still question whether an LAC is the most adequate path to a LEADING PhD in Economics IF that is the intent of a student. There is data that indicates that admission to leading Econ PhD departments is highly influenced by the undergraduate school. In other words, a school like Harvard seems to favor Harvard undergraduates. Similar situations appear to show up at Chicago, Yale, etc. </p>

<p>However, this is NOT THAT relevant for the overall landscape of economics at the undergraduate level. It only applies to students who have decided to earn a terminal degree in the field, and, again, this represents only a fraction of the students.</p>

<p>Xiggi, the number of citations to faculty research tells you nothing about the quality of undergraduate instruction at an institution. It tells you about the quality (and peer reputations) of the faculty's research.</p>

<p>The U of C is generally recognized as one of the leading Economics depts. in the country with over 20 Nobel prizes. It is also one of the largest PhD producers (still small absolute numbers). My problem with the LACs is that of the 'Big 3', they only have one Chicago graduate on their faculty (last time I checked Williams had one graduate, Amherst and Swat none). Do these 'hard numbers' mean a bias in their academic leaning? Xiggi has a point...quality sometimes is hard to quantify.</p>

<p>wsox, the best graduate departments are producing excellent researchers, very few of whom want to devote large fractions of their time to undergraduate teaching. One thing that LAC's look for in prospective faculty is devotion to teaching. In my experience (in a different social science discipline), when LAC's hire "Ivy" or graduates from the leading research departments (which may not be Ivy) they do not get the best of the new PhD crop from those departments -- that is, those with the most promise as creative researchers. They may get graduates who were the second or third tier students from those departments, who were unable to get a position on a leading graduate/research faculty.</p>

<p>Swat has none. Their PhD origins for the current Econ profs (not counting visiting profs) are:</p>

<p>Yale 3
Stanford 2
MIT 2
U Maryland 2
U Virginia 1
Harvard 1</p>

<p>Mackinaw:</p>

<p>No, I'm not going to tabulate the data! But, just from a casual look at Swat's professors, there seems to be a high percentage who did their undergrad work at a top LAC, followed by a PhD at the usual suspects. </p>

<p>My guess is that many of these actually wanted to teach (I know, it's shocking!) and knew, from personal experience, that a tenured gig at a place like Swarthmore can be a wonderful life.</p>

<p>Lots of Williams, Amherst, and Swarthmore grads on each others' faculties.</p>

<p>Makes sense to me. As I said, LAC's definitely try to recruit faculty who really want to teach. They also want faculty who understand the general culture of liberal arts colleges, and what better evidence of that than that a recruit has a degree from an LAC?</p>

<p>Mack, please read what I wrote in the post just above your comment, "Xiggi, the number of citations to faculty research tells you nothing about the quality of undergraduate instruction at an institution. It tells you about the quality (and peer reputations) of the faculty's research."</p>

<p>This is what I wrote:</p>

<p>"As far as rankings, I am on the record for stating that I do not believe in the existence of any truly meaningful rankings for measuring the quality of schools."</p>

<p>I am in perfect agreement with your statement, "It tells you about the quality (and peer reputations) of the faculty's research." </p>

<p>Now, take a look at the relation between reputation and ranking at the graduate schools. :)</p>

<p>You may be interested in the following paper on the issue, "The Doctoral Origins of Economics Faculty and the Education of New Economics Doctorates". It is from the Journal of Economic Education, Winter 1999: </p>

<p><a href="http://www.indiana.edu/~econed/pdffiles/winter99/Pieper.pdf">http://www.indiana.edu/~econed/pdffiles/winter99/Pieper.pdf</a></p>

<p>FWIW, the information on the following site is quite interesting. There are a few similar sites, but since they cross link one another extensively, it is easy to follow the path. For instance, the above quotation is also on this site:</p>

<p><a href="http://www.econphd.net/guide.htm">http://www.econphd.net/guide.htm</a></p>

<p>Here are a couple of references for you. If you have JSTOR access you can download them. They both speak to the relation between citation counts (as opposed to publication counts) and faculty reputation.</p>

<p>Robert Jackman and R. Siverson, "Rating the Rating: An Analysis of the National Research Council's Appraisal of Political Science Ph.D. Programs," PS June 1996: 155-60.</p>

<p>Robert Lowry and B. Silver, "A Rising Tide Lifts All Boats: Political Science Department Reputation and the Reputation of the University," PS June 1996: 161-67.</p>