Best Placement at Top JD/MBA/MD Programs?

<p>Since you are very concerned about bias, could you explain your very obvious bias against Chicago and tell us why that bias shouldn't be taken into consideration when judging your efforts?</p>

<p>(And no, I'm not trolling for Chicago, I just find your recent posting history regarding Chicago fascinating.)</p>

<p>After going through some of my posts on the school, I don't see any "obvious bias." </p>

<p>So, in short, no. Its performance (which isn't bad in my study, FYI) can be explained by the fact that a wider range of top professional schools was used than in the WSJ study (which included both Chicago's law and business schools while overlooking similarly ranked institutions, and gave a significant bump to midwestern schools in general). The data is the data. I would point out, though, that when only the top 10 (as opposed to 15) law, business, and medical schools were used, Chicago ranked 18th (by number), the same place it held in the WSJ study. It only dropped after I included 5 more schools.</p>

<p>david:</p>

<p>One other possible bias is economic. Well over half of the NE private school attendees are full pay. Thus they come from wealthy families that may be better able to afford a private grad school (in comparison to a UVa grad who might be happy to attend the in-state med school, for example).</p>

<p>Actually, students at top state schools like UVA tend to be as well off, statistically, as students at top privates. Also, for law or business school, accepting a scholarship from a weaker school is really not a smart idea, because it will have a serious negative impact on your post-graduation opportunities (especially in law, if you are aiming for top firms). Taking on the debt is usually worth it for graduate school, while that is certainly not the case for undergrad.</p>

<p><em>claps</em> ooo pretty numbers! Very nice data.</p>

<p>If you don't include publics, the rankings are almost the same except Dartmouth sinks a bit....</p>

<p>what about caltech??? none of your data has it</p>

<p>Any way you could split the results by profession? I would love to see undergrad placement specifically into medical school.</p>

<p>The last time you posted this exact same thread, davida1, I said the exact same thing.</p>

<p>Your ranking is absolutely worthless. </p>

<p>To begin with, it makes absolutely no sense to rank by absolute numbers. If Penn is twice as large as Dartmouth but sends the same number of students to grad schools, that doesn't show that the schools are doing equally well; it shows that, relatively, Dartmouth is doing twice as well. If you're worried about how Cornell's School of Hotel Management affects the numbers, figure out the relative percentage of Cornell students attending the top grad schools assuming none come out of that school. The numbers will now favor schools like Cornell (because obviously some Hotel school graduates do go on to top grad schools), but that is clearly preferable to numbers that reflect size more than the quality of an undergraduate institution.</p>
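<p>The size effect described above is easy to see with a toy calculation. This is a minimal sketch with made-up enrollment figures (the school labels and numbers are illustrative, not real data): two schools place the same absolute number of students, but the smaller school does twice as well per capita.</p>

```python
# Hypothetical illustration: identical absolute placement counts,
# very different per-capita placement rates. All numbers are invented.

placements = {"Large school": 200, "Small school": 200}   # students placed
enrollment = {"Large school": 10000, "Small school": 5000} # undergrad size

for school, placed in placements.items():
    rate = placed / enrollment[school] * 100
    print(f"{school}: {placed} placed, {rate:.1f}% per capita")
```

<p>Ranking by the raw counts would call these schools equal; ranking per capita shows the smaller school placing 4.0% of its undergrads against the larger school's 2.0%.</p>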

<p>Now, we are left with serious problems with your data collection mechanism. Facebook itself is self-selecting, and facebook groups are even more so. It's clear that only a fraction of those actually attending the schools you're measuring join the facebook group for the school; Yale, for example, has 30% fewer students signed up for facebook groups at the 45 schools you're measuring (which include all of the WSJ's schools) than at the 15 schools measured by the WSJ. Cornell, however, shows a 10% increase. Does this indicate that relatively more students at Cornell go to the top grad schools, or that relatively more students at Cornell join facebook groups for professional schools? Alternatively, it is possible that Yale students are more likely to join facebook groups than Cornell students, and that in fact Cornell's increase should be 100% and Yale's decrease should be 60%. The fact is, we don't know and can't make an educated guess about anything based on your numbers. </p>

<p>In slightly more scientific terms, your numbers fail to be meaningful because of selection bias; you are not measuring a random group of students accepted at top schools, but rather, a group of students who have elected to join facebook groups for top schools. </p>

<p>Now, as I've said earlier, your problem is fairly easily adjusted for. We have the data points we need to approximate what the facebook self-selection bias may be in a given school, and to adjust for it. If we find the number of students in facebook groups for the exact same selection of schools as WSJ uses and compare those numbers, undergraduate school by undergraduate school, we can approximate the selection bias for facebook groups on a school-by-school basis. Then, it will be easy to apply that bias to the per-capita numbers that you have found. That should yield a result far more indicative of reality. </p>

<p>For example, if there are 100 students from Berkeley in facebook groups for the WSJ's selection of feeder schools, but WSJ lists 200 students attending these schools, then we can assume that the Berkeley facebook selection factor is approximately 50%. Thus, if there are 280 Berkeley students enrolled in your selection of feeder schools, we can assume that this represents roughly half of the total Berkeley students at these grad schools leaving us with an adjusted number of 560 Berkeley students at grad schools. </p>
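<p>The adjustment in that example can be sketched in a few lines. This is a minimal sketch using only the hypothetical Berkeley numbers from the paragraph above (100 students visible in facebook groups vs. 200 listed by the WSJ, then 280 observed in the larger selection); the function names are my own, not anything from the actual study.</p>

```python
# Sketch of the proposed selection-bias adjustment.
# Numbers are the hypothetical Berkeley figures from the post above.

def facebook_selection_factor(facebook_count, wsj_count):
    """Estimate what fraction of actual grad-school attendees joined a
    facebook group, by comparing facebook counts to the WSJ's enrollment
    figures for the same selection of feeder schools."""
    return facebook_count / wsj_count

def adjusted_attendance(observed_facebook_count, selection_factor):
    """Scale an observed facebook count up by the estimated factor."""
    return observed_facebook_count / selection_factor

factor = facebook_selection_factor(100, 200)  # 0.5: only half join groups
estimate = adjusted_attendance(280, factor)   # 280 / 0.5 = 560
print(factor, estimate)                       # prints 0.5 560.0
```

<p>The key assumption, of course, is that a school's propensity to join facebook groups is the same for the WSJ's 15 feeder schools as for the larger selection; if it isn't, the adjustment only partially corrects the bias.</p>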

<p>Davida1, if--as you say--it was easy to collect this information, it should not be difficult for you to make these adjustments. If--as you say--you are motivated by a desire to rank feeder schools accurately (rather than, say, a bias towards one school in particular), I am sure that you will make these adjustments. Without performing these calculations, your data is useless even for making the most basic of assumptions.</p>

<p>Penn's class size may be large but the large pre-professional population should be heavily discounted. </p>

<p>The College has ~6500 undergrads. These would be the ones going into med school, law school, and b-school. There is some overlap from the other schools (SEAS bioengineers to med school, Wharton LGST concentrators to law school, etc.), but it should largely be Penn's CAS population under consideration.</p>

<p>You can't judge "best placement" in these fields unless you take the differing student bodies into account. For example, Georgetown's student body is generally known to be far more preprofessional than UChicago's. Therefore, Gtown doing better than Chicago in the number of students it sends to top law/med/mba schools is not surprising. But that doesn't necessarily mean that a Georgetown degree is better than a UChicago degree, as the term "best placement" implies, because fewer Chicago kids aimed for those professional schools in the first place.</p>

<p>Similarly, Reed College is ranked at the top for sending kids to PhD programs but doesn't surface in your data... does that mean that it can't send kids to law/business/med school, or does it mean that the students there simply aren't interested in going to professional schools?</p>

<p>It would be better to compare how popular preprofessional programs are at these colleges before comparing them. But I wouldn't know the means of obtaining that data.</p>

<p>tealover, Caltech does not have the strongest placement numbers at professional schools. </p>

<p>Here is the data for Caltech: </p>

<p>Percentage: 2.2%
Number: 5</p>

<p>abl doesn't understand the fact that a study can be imperfect and still indicative and insightful. WSJ's study was also imperfect, but it's not like this data is easy to come by. Schools tend to not publicize this information, but I think that my study gets at information that normally would not be available to high school seniors.</p>

<p>abl also doesn't understand the fact that the numbers would not change in any meaningful way with the adjustments that he/she is pushing for. I see no evidence of your Yale/Cornell numbers being accurate.</p>

<p>I think that DunninLA got it right early on in this thread with his conclusion that the data presented is pretty irrelevant and about as insightful as saying that the colleges with the highest selectivity will likely have a higher number going on to the highest ranked Law/Med/MBA schools. No surprise there, as it is quality in and quality out. The undergraduate college probably has very little to do with it. </p>

<p>I should also like to introduce two ideas:</p>

<ol>
<li> The data for MBA schools should be taken with a HUGE grain of salt. Students go back to business school after an average of 3-7 years in the workforce. At the top of every application is going to be the experience and the achievement in the real world. MBA adcomms are not looking at your grade in Film Studies from sophomore year and deciding if you have what it takes to join their incoming class. You need a good GPA and/or a good GMAT to show that you have some academic ability, but what really counts in the MBA admissions process is how you perform in the post-undergraduate world. This often has next to nothing to do with the undergraduate college you attended.</li>
</ol>

<p>This delayed matriculation to graduate business schools is also increasing at the law schools and even the med schools. While certainly far less pronounced than at the grad business schools, the law and med schools seem to be realizing that the applied intellect and maturity of their applicants are far better understood and evaluated after some time out of the college environment. I would expect (hope!) that this trend will continue.</p>

<ol>
<li>There is a cultural effect for graduate school attendance. Some colleges are just wired for this, and their culture “expects” many of their students to go to graduate schools (not just of law/med/business, but across many disciplines). This campus predisposition leads to outsized numbers coming from certain colleges and not from others where such a culture is less dominant or nonexistent.</li>
</ol>

<p>I would not automatically conclude from this that ABC College is better than XYZ University at grad school placement. To make a true determination, you need to understand who is applying and see admissions data from each college. The data that WSJ and Facebook provide may be some indication of which colleges are most actively represented at these grad schools, but it very well may not reflect the quality and number of applicants from a given undergraduate college. Matriculation patterns? I'd accept that. Placement strength? I think that is a poorly informed guess.</p>

<p>Actually if there are certain schools that have particularly strong placement at MBA programs that would suggest (according to your line of reasoning) that they are producing alumni that are successful in the workplace because they are earning spots at top b-schools (and great work experience is more important than academic factors). Your conclusion doesn't follow.</p>

<p>To pretend like there is no significant relationship though between "matriculation patterns" and "placement strength" is laughable.</p>

<p>And schools with high selectivity do not always perform well (that is sort of the point of the WSJ study and my study, actually). The US News selectivity rating, as a proxy for selectivity, does not equal strong professional school placement. WUSTL's selectivity ranking is very high relative to its performance, as is the case with other schools: Caltech, Emory, U Notre Dame, Tufts, Vanderbilt, Harvey Mudd, Haverford, Claremont McKenna, Bowdoin, Davidson, W&L, Carleton, Hamilton, and Carnegie Mellon U.</p>

<p>Lumping all of those under-performing schools together and excusing them on the grounds that their students are simply interested in pursuing other routes requires evidence, not just an assumption. You shouldn't assume self-selection without some evidence that it is occurring. It seems more reasonable to conclude that these schools are not doing as good a job of getting students admitted to top professional schools. </p>

<p>There are places where people post their stats and their admissions decisions to professional schools (I won't point them out), but a quick screening shows that, in many cases, schools of equal selectivity do not give graduates the same chances of admission to top JD/MBA/MD programs even when those graduates have the same stats.</p>

<p>davida1,
I think your perspective is naïve.</p>

<p>I think it's also interesting how well Princeton does. Yes, the school is prestigious, but we have no professional schools, and many of our grads go straight into finance, making it even more impressive.</p>

<p>One thing the study doesn't and possibly can't correct for is institutional bias. For example, I go to U-Florida (MBA), and I know there are many people at this school who have the stats to get into top-15 professional schools but, due to a UF bias, decide to stay in Gainesville for their entire academic career. Maybe they are going to med school and their parents are alumni? Look at the MBA GMAT scores for UF, for example; they encroach on top-15 territory. In fact, if a UF professional school grad intends to stay in the state, I'm not so sure the value added by going to a top-15 school (especially at the lower end of the 15) would make a difference 5 or 10 years out of school.</p>

<p>I am not a statistics-oriented person. I want to know facts, and what I see here are lots of questions that make me doubt the validity of the numbers. As noted above, nothing has been done to take into account the type of student body, as in university vs. LAC, etc. What I would really like to know is the percentage of students from each school who applied to these programs and were accepted. That would tell me a lot more than these numbers. If school A had only 15 apply and all 15 were accepted, that would mean more to me than school B having 500 apply and 25 make it. B would beat A in these counts, but A would beat B in my book.</p>
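<p>The school A / school B hypothetical above works out like this (a minimal sketch using only the invented numbers from that post; the school names are placeholders):</p>

```python
# Raw placement counts vs. acceptance rates, using the hypothetical
# figures from the post above (all numbers invented for illustration).

applied  = {"School A": 15, "School B": 500}
accepted = {"School A": 15, "School B": 25}

for school in applied:
    rate = accepted[school] / applied[school]
    print(f"{school}: {accepted[school]} accepted, {rate:.0%} acceptance rate")
```

<p>By raw count, School B (25 accepted) beats School A (15 accepted); by acceptance rate, School A's 100% beats School B's 5%, which is the distinction the poster is drawing.</p>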

<p>"abl doesn't understand the fact that a study can be imperfect and still indicative and insightful. WSJ's study was also imperfect, but it's not like this data is easy to come by."</p>

<p>Yes, I would say that the WSJ's study is a perfect example of a study that is imperfect while still being indicative and insightful. Yours, however, is far too imperfect to be indicative or insightful. A somewhat parallel situation would be ranking undergraduate colleges' selectivity by how many students were in an "I got a 1600 on the SAT" facebook group. That so-called "study" of selectivity would obviously not be indicative or insightful of anything, save perhaps the propensity of students at a particular school to brag.</p>

<p>"abl also doesn't understand the fact that the numbers would not change in any meaningful way with the adjustments that he/she is pushing for. I see no evidence of your Yale/Cornell numbers being accurate." </p>

<p>You're almost right. You don't understand whether the numbers would change in a meaningful way or not. It may be a fact that the numbers would change, and it may be a fact that they would not. You don't know. I don't know. This is why you need to adjust for your selection bias. Whether you end up with similar data or very different data, your data is meaningless before making that adjustment. Go back and read through your stats book; it should help make things clearer. </p>

<p>This "study" of yours wouldn't pass at any of the colleges on your list. I doubt it would pass at pretty much any college in the country, for that matter; it sure as heck wouldn't pass at the high school where I am currently teaching (which is by no means an especially good high school). The fact is, your ranking is shoddy work. I don't know if you're a high school student, a college student, or even a college graduate, but you must understand that it's important to at least aim for accuracy, especially in a forum like these boards where high schoolers may be easily misled. </p>

<p>Your problem isn't so much that you've compiled a list of what you believe the top schools in the country are. The problem is that you have attached a veneer of science to what is not scientific. In fact, if I were to compile a list or ranking of what I believe the top 20 schools in the country to be, it would look pretty similar to your per-capita adjusted list. However, I would not, as you have done, mislead anyone about the scientific validity of that list. </p>

<p>Look, I know you've invested a lot of time into this ranking. As I've said before, you can still salvage your results and get something useful out of this. Don't be stubborn; make the adjustments! If it's a time issue, I am sure you can recruit other posters here to help you. If it's not, well, what is it?</p>