- You assume that, from the rankers’ point of view, rankings are something more than a vehicle to sell magazines/ads.
- For PhD production, note the denominator. Most state schools (and also some privates) have all sorts of majors that generally don’t have students aiming for S&E PhDs. So if you’re talking about per capita, the relevant denominator should be undergrads in those fields. And even then, you still have to normalize by entering student quality as well as student aims (how many are aiming for a PhD vs. heading straight to work after graduation). With per-capita figures, those differences can swing the numbers a lot.
- Not sure why you brought in LACs when the academics were asked to rank RUs.
- Cal gets about 30K OOS+international apps these days (http://admissions.berkeley.edu/studentprofile), so it does seem as if they attract plenty of out-of-state applicants.
Some figures:
Cal engineering: 3126 undergrads (http://engineering.berkeley.edu/about/facts-and-figures)
Cornell engineering: 3051 undergrads (https://en.wikipedia.org/wiki/Cornell_University_College_of_Engineering)
MIT engineering: 2447 undergrads (http://web.mit.edu/facts/enrollment.html)
Total number of S&E PhDs produced (1997-2006; http://www.nsf.gov/statistics/infbrief/nsf08311/):
Cal: 3199
Cornell: 2536
MIT (note that MIT has the 2nd highest S&E PhD rate among RUs and 3rd overall, after PhD-producing machines Caltech and Mudd): 1867
OK, granted, ideally the numerator would be engineering PhDs, not S&E PhDs (or else the denominator would be all S&E majors), but here are the ratios as they stand (a quick computation follows the list):
Cal: 3199/3126
Cornell: 2536/3051
MIT: 1867/2447
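Spelled out, here’s a minimal sketch (Python) that just divides the figures above; the numbers are the ones already posted, nothing else is assumed:

```python
# Per-capita sketch using the figures above: total S&E PhDs produced
# (1997-2006, per NSF) divided by engineering undergrad headcount.
# Note the mismatch flagged above: the numerator counts all S&E PhDs,
# while the denominator counts engineering undergrads only.
schools = {
    "Cal":     (3199, 3126),
    "Cornell": (2536, 3051),
    "MIT":     (1867, 2447),
}

for name, (phds, undergrads) in schools.items():
    print(f"{name}: {phds}/{undergrads} = {phds / undergrads:.2f}")

# Cal: 3199/3126 = 1.02
# Cornell: 2536/3051 = 0.83
# MIT: 1867/2447 = 0.76
```

So by this (admittedly apples-to-oranges) ratio, Cal comes out ahead of both Cornell and MIT.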
Another key issue that has often been ignored: per capita S&E PhD production does not take into account the quality of the PhD programs. For example, while Swarthmore has been consistently among the very top producers of PhDs (per capita) in the country, it has not been as successful in sending its graduates to top PhD programs (at least not in math, per their own documentation: http://www.swarthmore.edu/Documents/academics/math/grad_GRE/MathGradSchool.pdf). I have observed a similar situation in PhD admissions in my own department, where the majority of domestic students come from top research universities, with far fewer admits from top LACs or other colleges.
Caltech, MIT, and Harvey Mudd all have much higher percentages of engineering majors than Berkeley, but also rank much higher than Berkeley in per capita PhD production. NM Mines has a much higher percentage of engineering majors than Berkeley, is less selective than Berkeley, but still ranks higher than Berkeley in per capita PhD production.
I’ve tried normalizing the top 50 S&E PhD production numbers by SAT-M scores. I found that schools that dominate the top are LACs (including less selective CTCL schools) and technical institutes (including less selective schools like NM Mines). http://talk.collegeconfidential.com/discussion/comment/17572046#Comment_17572046 , post #7.
Berkeley winds up about where it does in the un-normalized per capita S&E PhD rankings: 40-something out of 50.
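For concreteness, the normalization amounts to something like this (a rough sketch only; the names and numbers below are made-up placeholders, not the actual NSF or SAT data in the linked post):

```python
# Rough sketch of normalizing per-capita S&E PhD production by SAT-M.
# All names and numbers here are placeholders, NOT the real data.
schools = [
    # (name, S&E PhDs per 100 undergrads, median SAT-M)
    ("Tech Institute X", 10.0, 780),
    ("LAC Y",             8.0, 680),
    ("Flagship Z",        4.0, 660),
]

for name, rate, sat_m in schools:
    # Dividing the per-capita rate by (scaled) SAT-M gives a crude
    # "PhD output relative to entering student quality" figure.
    adjusted = rate / (sat_m / 800)
    print(f"{name}: raw {rate:.1f}, SAT-M-adjusted {adjusted:.1f}")
```

The point of the adjustment is that a school with modest entering stats but a high raw rate (an NM Mines type) moves up, while a school whose high raw rate merely tracks its high entering stats doesn’t.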
The PhD production data measures completions, not admissions. If we had more data on how many students from each school wind up in the strongest programs, maybe it would put Berkeley higher on the list. Does anyone have those numbers? I haven’t found them.
What other evidence suggests that the PA scores are closer to the truth than the USNWR “hard data” (on admission selectivity, etc.)?
This discussion of graduate school stats probably deserves its own thread. (Not trying to “police the thread,” just making a suggestion - I think it’s a great topic, just not especially relevant for most students at most schools.)
@tk21769, did you read my post?
So what denominator are you using to determine per capita PhD rates?
Because if you use the number of engineering undergrads as the denominator, Cal does very well (better than MIT).
Interesting thread, and I want to throw out another idea:
Within the Top 50, undergraduate curricula have become pretty standardized, particularly in STEM. Why do we assume that the quality of education is so different between them?
If differences in quality don’t strongly affect learning, then other factors come into play when you look at postgrad results. Student quality/stats, if they truly are indicators of ability (a whole question on its own; for now let’s take it at face value), become more indicative of success than the school itself, which would correlate high stats with strong postgrad results. Then there’s the job market: schools that focus on it will come out with better outcomes somewhat independently of their academic quality.
If we’re talking from a pure academic quality standpoint, we really don’t have many metrics for it. We have metrics for students, peer opinions, and faculty research. None of those are directly tied to the ability and effectiveness of a professor’s teaching. In fact, many teachers highly involved in research could be neglecting their teaching as a result.
I think all of this leads to a central question: what are these rankings for? Are they to find the best college for getting a job afterward? The schools best for learning? A combination of both? The important takeaway, IMO, is that we should be clearer about what exactly we are ranking, and adjust the factors we consider accordingly.
For example, I don’t think employability is very tied to faculty research output, yet it is considered in many rankings. For academic quality, research is perhaps a bit more relevant, but it is still not directly tied to teaching, IMO. But does any of this even matter with such standardized curricula?
I think this is a useful rating system. The evidence is that all good rating systems generate a lively discussion on how Berkeley is underrated.
@PengsPhils I think that you raise some really interesting points about the purpose of these sorts of lists.
For me, the real value is the ability to look at a large group of colleges and evaluate them roughly in terms of various criteria. Let’s face it, despite claims of expertise by all sorts of people, there are very few people who have anything more than a passing familiarity with even a dozen different colleges, let alone the whole breadth and scope of the US college scene.
“Per capita” in the context of PhD production numbers, absent any other qualifier, generally refers to the total population of undergraduates. Here is the source I used:
http://www.nsf.gov/statistics/infbrief/nsf13323/
When I stated that UCB ranked “40-something” I was referring to Table 4.
Now, if we’re only talking about RUs, its rank is much higher.
In that table, it is 18th after removing all the LACs above it.
So its S&E PhD production rank is pretty close to its overall USNWR “national universities” rank …
but yes, it’s true that most of the research universities above it are more selective.
@tk21769, however, if you are trying to compare by a PhD rate that makes sense, the appropriate denominator should be the number of STEM undergrads, no? After all, I don’t see any communications majors going on to get an S&E PhD. And the percentage of STEM undergrads at an institution can differ by a lot.
I think this list shows pretty nicely one problem in constructing rankings, which is that factors that might mean a lot in one tier of the ranking don’t mean much in another tier. In this one, it seems to me that retention and graduation rates don’t mean much at all in the top third or half of the ranking. I mean, I’m a Yalie, but I don’t think Yale’s retention rate says that it’s better in any meaningful sense than Harvard. That factor may tell you more, though, when you start looking at big state universities, even really good ones.
It’s my personal opinion that a list based strictly on the SAT scores of matriculating students would be good enough for any general ranking of school quality. It’s a market-based lagging indicator.
“It’s my personal opinion that a list based strictly on the SAT scores of matriculating students would be good enough”
I’m sure you’re aware of the compilation @Hunt, but others may find it handy:
Business Insider / The 600 Smartest Colleges in America
Yep. In general, I prefer tiering as opposed to ranking (like I do here: http://talk.qa.collegeconfidential.com/college-search-selection/1682986-ivy-equivalents-p3.html) because rankings can make it artificially seem like there is a big difference when there isn’t. For example, RU #16 is twice as far from the top as RU #8, yet I put USNWR RUs #15 & #16 (Cornell & Brown) on the same tier as USNWR RUs #8 & #9 (Duke & UPenn) because, in terms of opportunities and prestige, they’re pretty much all the same.
Even in my tiers, I note that each level is only a half-tier apart, and when considering schools that are within a half-tier of each other, assuming that the cost is the same, fit and school-specific opportunities should be the overriding factors.
So for instance, near-Ivies UMich & ND are within a half-tier of both Ivy-level Cornell & Northwestern and “good schools” CWRU & UW-Madison, so when deciding between UMich & ND and those other schools, fit and school-specific stuff (and cost) should matter most.
But Cornell & NU are distinctly a tier above CWRU and UW-Madison.
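A toy sketch of the half-tier logic (the groupings mirror the examples above; the numeric tier labels are just illustrative):

```python
# Toy sketch of tiering instead of ranking. Groupings follow the
# examples above; adjacent levels are meant to be a half-tier apart.
tiers = {
    1.0: ["Cornell", "Northwestern", "Duke", "UPenn", "Brown"],  # Ivy-level
    1.5: ["UMich", "Notre Dame"],                                # near-Ivies
    2.0: ["CWRU", "UW-Madison"],                                 # good schools
}
tier_of = {school: level for level, names in tiers.items() for school in names}

def within_half_tier(a, b):
    """Fit, school-specific opportunities, and cost should dominate
    when two schools are this close."""
    return abs(tier_of[a] - tier_of[b]) <= 0.5

print(within_half_tier("UMich", "Cornell"))       # True
print(within_half_tier("Cornell", "UW-Madison"))  # False: a full tier apart
```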
The only thing that matters for parents and students regarding rankings is “where do I fit in.”
No matter where a school is ranked in the top 25 you are dealing with at least a 95th percentile student body on average.
In the LAC ranking from #1 to #25 the average ACT range is the 98th to 95th percentile.
It would be great if rankings highlighted this.
^^^^ I agree, and that’s kind of why I wanted to do these lists, with the actual computed “rating” displayed in addition to the rank.
This is kind of a cheap trick, and not super accurate, but I meant to display the lists like this, which I think is somewhat more informative:
1 98.2 Yale University 98.9% 99% 96% 3
2 98.0 Harvard University 99.0% 97% 97% 2
3 97.9 Princeton University 98.8% 98% 96% 1
4 97.4 Dartmouth College 97.9% 98% 96% 11
5 97.4 Stanford University 98.3% 98% 95% 4
6 97.4 University of Pennsylvania 97.7% 98% 96% 8
7 97.2 Massachusetts Institute of Technology 98.9% 98% 93% 7
8 97.2 University of Chicago 98.9% 99% 92% 4
9 96.9 Columbia University in the City of New York 98.3% 96% 95% 4
10 96.9 University of Notre Dame 97.8% 97% 95% 16
11 96.8 Duke University 97.7% 97% 95% 8
12 96.7 California Institute of Technology 99.4% 96% 92% 10
13 96.7 Washington University in St Louis 98.4% 96% 94% 14
14 96.6 Brown University 97.1% 97% 95% 16
15 96.3 Northwestern University 98.0% 96% 93% 13
16 96.2 Vanderbilt University 98.3% 96% 92% 16
17 95.9 Rice University 97.9% 96% 92% 19
18 95.9 Tufts University 97.3% 97% 92% 27
19 95.7 Johns Hopkins University 96.9% 97% 92% 12
20 95.7 Cornell University 96.9% 96% 93% 15
21 95.3 Georgetown University 96.1% 96% 93% 21
22 94.9 University of Virginia-Main Campus 94.4% 98% 93% 23
23 94.5 University of Southern California 95.5% 97% 90% 25
24 94.4 Boston College 95.2% 95% 92% 31
25 94.0 Carnegie Mellon University 97.1% 95% 87% 25
26 93.9 University of Michigan-Ann Arbor 93.8% 97% 91% 29
27 93.9 University of California-Berkeley 94.2% 96% 91% 20
28 93.6 Emory University 94.8% 95% 90% 21
29 93.5 College of William and Mary 94.0% 96% 90% 33
30 93.1 Brandeis University 93.8% 95% 90% 35
31 92.5 University of California-Los Angeles 90.9% 96% 92% 23
32 92.2 University of Rochester 93.9% 96% 85% 33
33 92.1 University of North Carolina at Chapel Hill 91.2% 97% 89% 30
34 91.8 Rensselaer Polytechnic Institute 94.5% 94% 84% 42
35 91.7 Wake Forest University 92.9% 94% 87% 27
36 91.7 Northeastern University 95.8% 96% 79% 42
37 91.4 New York University 94.4% 92% 85% 32
38 91.4 Lehigh University 92.3% 93% 88% 40
39 90.2 Georgia Institute of Technology-Main Campus 93.5% 95% 79% 35
40 90.0 Case Western Reserve University 95.0% 92% 78% 38
41 89.8 University of Maryland-College Park 91.6% 94% 82% 62
42 89.5 University of Illinois at Urbana-Champaign 90.0% 94% 84% 42
43 89.5 University of Florida 88.4% 96% 85% 48
44 89.3 University of California-San Diego 88.6% 94% 86% 37
45 89.2 University of Miami 92.3% 91% 81% 48
46 88.6 Boston University 89.3% 92% 84% 42
47 88.5 George Washington University 91.0% 92% 80% 54
48 88.2 Ohio State University-Main Campus 89.4% 92% 82% 54
49 88.1 University of Wisconsin-Madison 87.7% 95% 82% 47
50 87.8 Southern Methodist University 90.6% 90% 80% 58
It comes across better when it’s highlighted like that.
It’s an old Tufte trick…simple, but effective.
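For anyone curious, a minimal sketch of the idea, with rows hard-coded from the first few entries above; nothing fancier than fixed-width fields so the computed rating reads as its own column next to the rank:

```python
# Minimal sketch of the aligned-column display. Rows are copied from
# the first few entries above; only the formatting is new here.
rows = [
    ("1", "98.2", "Yale University",      "98.9%", "99%", "96%", "3"),
    ("2", "98.0", "Harvard University",   "99.0%", "97%", "97%", "2"),
    ("3", "97.9", "Princeton University", "98.8%", "98%", "96%", "1"),
]

for rank, rating, school, *rest in rows:
    print(f"{rank:>3}  {rating:>5}  {school:<25}  {'  '.join(rest)}")
```

Once the rating column lines up, the near-ties (like the 97.4 cluster at #4-#6) jump out in a way a bare rank never shows.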
“I think this is a useful rating system. The evidence is that all good rating systems generate a lively discussion on how Berkeley is underrated.”
… Which just pushes Berkeley up on your prestigiosity scale, doesn’t it? Or have we determined what happens to a school’s prestigiosity when “defenders” are SO VERY EAGER to ensure that there is not a single person on College Confidential who might not bow to their school?
Though I’d say Michigan could give Berkeley a run for their money on that one, and Duke might not be far behind.
Bump (for whatever reason, lots of lists this time of year)