Are MD/PhD admissions fair?

<p>Here's a website of researchers at Cornell (Pharmacology Dept). Check out the backgrounds of the researchers at this top-tier pharmacology school.</p>

<p>I only checked the first 7 or 8 researchers' biographies. The majority of them come from abroad. This is a good example of how experience weighs far more than the name of the school you graduate from. </p>

<p>Pharmacology Faculty</p>

<p>To the OP, it sounds like your interests are perfect for an MD/PhD program. MD-only admissions processes do tend to focus heavily on interest in clinical careers, and the people who interview tend to be clinical docs. They are also somewhat intimidated by scientists, particularly hard scientists, and they stereotype them as "lacking people skills" and "not interested in medicine."</p>

<p>For MD/PhD programs these stereotypes are unimportant: the committees know they are looking for researchers, so they do not worry about applicants with strong scientific interests.</p>

<p>For someone like you, who wants a science career, having an MD in addition to a PhD makes a lot of sense. It expands the fields in which you can do research, and gives you some way to justify your existence on the faculty if your funding dries up. </p>

<p>Fair is a tough concept. The MD and MD/PhD programs are looking for different things. Your background sounds perfect for the combined program. You would be in with many, many other highly qualified students for an MD-only program, with the disadvantage that your limited interest in clinical medicine would come through loud and clear in your interviews.</p>

<p>As for your acquaintance, the PhD who was denied by MD programs: he/she could probably get admitted if willing to apply to lots of places, and should not try to convince admissions committees that he/she wants a career change to clinical medicine. Even if true, it could be tough to get the committee to believe it. There are lots of MD/PhD people who are great scientists but, by personality and inclination, not very good physicians. So each medical school has some sort of quota, perhaps explicit, perhaps not, for how many such people they want. If you get classified into this group then you have to fit into the quota.</p>

<p>Now there are plenty of MD/PhD people who are terrific clinical docs, but most admissions committees find it difficult to predict who these will be. So they tend to treat all strong scientists as if they will be non-clinical scientists. </p>

<p>So, in your case, apply to MD/PhD programs; you are a perfect fit. You should also apply to pure MD programs at medical schools that are looking for scientifically oriented students. However, these places will be the most competitive in the country.</p>

<p>Thanks afan, that's what I plan to do.</p>

<p>Pharmagal,</p>

<p>Your observations about the limited significance of medical school prestige also apply in academic medicine. </p>

<p>Most faculty don't know, and don't care, where their colleagues went to medical school. To the extent they notice their backgrounds, they care about where they trained. Even that fades in significance as one advances in a career. Now coming out of Harvard certainly helps with getting into the sorts of residencies that turn out lots of academic faculty, so, in that sense, it matters where one goes to medical school. However, once on faculty, it is academic productivity that leads to promotion, and medical school pedigree counts for nothing.</p>

<p>I have not noticed a higher failure rate among graduates of the most prestigious medical schools. They seem to be on a par with everyone else.</p>

<p>Yup. That's the conclusion we are coming to around these parts. Want to take a stab at a list of which schools this would include for...I don't know...hmmmmm...how about a Texas girl about yay tall with a really strong research resume? I can guess a few. D is morphing from MD/PhD to possibly something like Duke or Lerner at Case. If she goes traditional med she will likely add a research year at schools like Southwestern and Baylor. This stuff is pretty confusing. Thank God interviews are a strong area for her.</p>

<p>Some of the MD-only programs I'm familiar with that are heavily focused on student research, in no particular order:</p>

<p>Harvard: obvious reasons in terms of resources and type of students it attracts
Duke: one year for research built into curriculum
Lerner: five-year program, one year for research
Yale: thesis requirement, option for 5th year tuition free for research (half of class does so)
Stanford: ditto (half of class too)
Pitt: thesis, offers scholarship-funded certificate programs in clinical/bench research, one year each
Penn: semi-official fifth year option like Stanford and Yale, but not utilized by nearly as many students as the former two programs.
Mayo: very small, academic-geared program</p>

<p>In general, the USNews rankings give one a good rough idea of how much a school pushes its kids to do research; the ones I listed are, I think, the programs that consciously take it to another level (there are of course more schools that fall into this category). There are some major exceptions, of course (Udub, for example, generally avoids applicants who want to pursue research, even though it is quite prominent as a research institution in its own right). By and large, though, the top 25 or so 'research' schools are the ones that tend to look for research-oriented applicants and encourage students to continue research once enrolled.</p>

<p>Although it would probably be good to say at an MD interview that I would also like to spend a year after graduation doing research, their next question would be: if you like research, then why didn't you apply MD/PhD? What would be a good answer to that question?</p>

<p>An honest one.</p>

<p>I was asked this in several interviews. I said I wanted to do a lot of clinical work and did not like the 80/20 lab/clinical split that most MD/PhDs are shunted into. Also, given that many of the programs I applied to were very research-geared, I said that I thought the setup of the program was sufficient to satisfy my research interests.</p>

<p>BY300,</p>

<p>Why not answer honestly? Frankly, all too many folks here view interview questions like some sort of game show where there is a right answer. In truth, there is rarely a right answer. Rather, folks are often looking for the reasoning behind whatever answer the person gives. In other words, it is the thinking that matters.</p>

<p>I remember well once when I was asked why I had done A before B. I said "you have to start somewhere..." and the questioner was very happy with the answer!</p>

<p>I am very happy to have joined such a beautiful community...which is very interesting and informative...</p>

<p>^^^ was totally off-topic, but welcome! haha and I'm flattered you think I'm beautiful!</p>

<p>About the whole interview Q and A thing...these people go through a lot of interviews every year...I feel like lying is going to put your answer in one of a few possible cookie-cutter categories...honestly, go with honesty (good campaign slogan?)</p>

<p>
[quote]
I can speak for only Pharma R&D. Investigators in Pharma are hired for the 'specialization' and the specific research background they bring to the organization rather than the name of the school they graduate from.

[/quote]
</p>

<p>It's not really so much about the brand name per se, but really about the social networking.</p>

<p>You say that you're familiar with Pharma hiring; so am I, as my father worked in the pharma industry for decades, for a company that shall remain unnamed (but is very large). What he observed is something endemic to most organizations: employees tended to bring in their friends. </p>

<p>Let me tell you a story. A few years ago, the company hired a chemist who had both completed his PhD and had been on the faculty at MIT. Then, over the next few years, the company began hiring many more people from MIT, including "coincidentally" many of that guy's former students and post-docs. Those people then brought in their friends from MIT, who then brought in their colleagues (for example, one guy brought in his wife who had also graduated from MIT), etc. etc. In a few short years, it got to the point that my father was jokingly calling the clique the "MIT Mafia". In fact, one of the 'Mafia members' once joked to my father that working at the company felt very much like being back in MIT grad school all over again, because so many of his school pals were with him. </p>

<p>Nor were they doing anything untoward. They were simply taking advantage of their social capital, which is perfectly legitimate. Those guys were simply highly 'entrepreneurial' when it came to looking for jobs for their old school friends. When a job opening came up, the company (as most companies do) would first circulate the job spec internally to solicit their existing employee base for recommended candidates, and the 'Mafia members' would pounce on those specs by providing names of their old colleagues. Even when a manager wasn't yet actively hiring, but was just thinking about it, these Mafia members would seed their minds by saying something like "I hear you may be looking to hire somebody who knows X, well I happen to know a guy who does X and who is looking for work, etc. etc." </p>

<p>Now, to be sure, this social recommendation system doesn't guarantee a job for everybody. But it at least gets them an interview. It at least lets the company know that you exist. You can't get a job if you don't even know that a job opening exists. The reality is that most private-sector jobs are never publicly posted and hence aren't available through a true 'open' competition, but are really only available through the knowledge you obtain through social networks. Not every member of the MIT Mafia got hired (in fact, some didn't), but at least they had the chance. </p>

<p>
[quote]
Your observations about the limited significance of medical school prestige also apply in academic medicine.

<p>Most faculty don't know, and don't care, where their colleagues went to medical school. To the extent they notice their backgrounds, they care about where they trained. Even that fades in significance as one advances in a career. Now coming out of Harvard certainly helps with getting into the sorts of residencies that turn out lots of academic faculty, so, in that sense, it matters where one goes to medical school. However, once on faculty, it is academic productivity that leads to promotion, and medical school pedigree counts for nothing.

[/quote]
</p>

<p>I agree that academic productivity ultimately matters. But the problem is that status bleeds into productivity. In other words, the status of your institutional affiliation (unfortunately) affects your publication success. Sad but true. </p>

<p>Care to disagree? Former Harvard psychologist Robert Rosenthal, now at UCR, once remarked in his 1982 paper that when he was starting out as a junior faculty member at the University of North Dakota, he had 15-20 papers that he could not convince any journal to publish. Yet within a few years of his move to Harvard, most of those papers were not only successfully published, but were published in the very same journals that had previously ignored them. Similarly, Biggs 1990 and Garfield 1986a showed that papers submitted by authors from lower-status institutions are more likely to be rejected than a comparable control set of papers from higher-status institutions, which also echoes the Peters & Ceci 1982 study that I discussed previously. </p>

<p>But perhaps the most pervasive influence is that of 'invisible colleges': that is to say, it is not so much the reputation of the school at large that is at play, but rather the power of social network ties. This can be seen most clearly when looking at journals that are themselves published/edited at certain universities. Pfeffer, Leong & Strehl (1977) and Shamblin (1970) showed a clear and convincing positive correlation between the institution of the author and the institution of the editor in terms of publication success, after controlling for institution quality and size. For example, Shamblin showed that the authors who were the most successful in publishing in the American Journal of Sociology (AJS), which is arguably the top sociology journal in the world, came from the University of Chicago. And where is the AJS published and edited? The University of Chicago. A similar finding was also demonstrated with the Journal of Political Economy, which also just so happened to be published at the University of Chicago. Or, to quote Willis & McNamee (1990, p. 374):</p>

<p>"...the links between editors and authors represent spheres of influence that increase the probability of publication in the elite journals within the field, thereby contributing to the persistence of accumulated advantage over time."</p>

<p>Heck, I can even give you an example from my friends. I know a woman who is pursuing her PhD at Harvard, and had submitted one of her papers to a journal. One of the paper referees ended up being none other than her own boyfriend (now husband)! Unsurprisingly, her paper was accepted for publication with very little problem. </p>

<p>We would all like to think that the peer review process is truly unbiased. But that's not the truth. Social ties and social networking do matter. One of the 'games' you have to play in academia is developing social ties with gatekeepers, that is, the journal editors and referees, most of whom tend to be at the top schools, because you want to become a member of the 'invisible college'. As a case in point, that Harvard woman I mentioned above has now been offered a referee position at a top journal, and her husband is now not only a referee for several journals, but is also an editor for two of them. They're basically a "power couple" now, and hence, if you want to pursue an academic career, are great friends to have (but also very dangerous enemies).</p>

<p>Now, some of you might be thinking that I'm accusing academia of corruption. Not exactly. I don't think that social ties are necessarily a bad thing. After all, they can be used by editors and referees to deduce which scholars are likely to be producing the top work that gets highly cited. Social capital has its benefits. But the fact remains that you can't build social capital if you're not even around and nobody knows who you are.</p>

<p>I believe you are mischaracterizing the findings in several dimensions. </p>

<p>First of all, much of this work found these effects outside of science, which was the topic of this discussion. Secondly, to the limited extent this effect was found in science at all, it is clear that it has diminished over time (Willis & McNamee, 1990). Further, the cause of this observation is debatable, and many have maintained that it is due to common approaches to methods and interpretation, rather than simple favoritism toward members of the "club". It is particularly difficult to propose such favoritism in blind reviewed journals. Yes, sometimes it is clear who has written the paper, but most of the time it is not.</p>

<p>If her boyfriend reviewed her paper he was profoundly unethical. If the editor found out about it the journal should have retracted the paper and included an editorial explaining how the former reviewer- never to be used again by that or any other reputable journal- had corrupted the process. This would be hugely humiliating for all concerned. If the author had known her lover was refereeing her paper and went along with it then she was profoundly unsuited for an academic career. Certainly it would rule her out for being hired or promoted.</p>

<p>I have returned papers that I knew were written by people that put me in conflict of interest. Any honest person would do the same.</p>

<p>
[quote]
Secondly, to the limited extent this effect was found in science at all, it is clear that it has diminished over time (Willis & McNamee, 1990)

[/quote]
</p>

<p>W&M only proves my point further. Sure, they may have found that the effect has diminished over time, but it still exists. After all, who really cares about the effect over time? Sure, according to W&M, maybe the effects of the invisible college will have disappeared completely at some point in the future. But so what? By that time, you're dead. Nobody has a time machine; you have to determine what is the best school for you right now. </p>

<p>
[quote]
First of all, much of this work found these effects outside of science,

[/quote]
</p>

<p>Then consider the studies that did find these effects within the realm of science. The biasing impact of prestige on scientific peer review has been a problem basically ever since peer review was instituted. Let me give you some famous examples. </p>

<p>Consider what happened to the paper "An Experiment to show that a Divided Electric Current may be greater in both Branches than in the Mains", written by Lord Rayleigh - yes, THAT Lord Rayleigh who was one of the most highly regarded scientists of the late 1800's and who, among numerous discoveries, eventually won the Nobel Prize in Physics for his pioneering work on the physical properties of gases, including the discovery of argon. According to his son and biographer, Lord Rayleigh's name was accidentally omitted from the paper whereupon the Committee of the British Association promptly rejected it. "However, when the authorship was discovered, the paper was found to have merits after all." (Strutt 1924, Barber 1961)</p>

<p>Consider what happened to one of the mathematical papers of a young Niels Henrik Abel, which made important advances on one of the classical problems concerning fifth-degree equations. Abel would later become one of the most prominent mathematicians of his day, giving his name to such concepts as Abel's Theorem, the Abel transformation, and the abelian group, yet at that particular time not only was he unknown, but he came from a country (Norway, then in a union with Sweden) that had little standing in the world of mathematics, most of the best mathematicians being located in France, one of the German principalities, the Netherlands, or the UK. Abel sent his paper to many of the leading mathematicians of the time, including the (at the time) world-famous Carl Gauss, but Gauss never even bothered to read it and simply filed it away unopened, where it was later found after his death. Of course, upon its rediscovery, the paper was found to be a highly important mathematical contribution. (Ore 1957) </p>

<p>But at least Rayleigh and Abel did achieve recognition in their own time. Much sadder stories can be told about scientists whose work was marginalized due to lack of institutional prestige until long after they were dead. A classic example is the now-world-famous 1865 paper "Experiments on Plant Hybridization" by Gregor Mendel. We know of it now - in fact, we all learn of it in high school biology - but during his lifetime Mendel's work was widely dismissed because he was not a faculty member of a prestigious university or, heck, any university at all, but was 'merely' a monk at an obscure abbey, and hence his work was discredited as being that of "an insignificant provincial" by such luminaries of biology/botany of the time as W.O. Focke, von Marilaun, and Hoffman (Roberts 1929). In fact, Mendel conducted a correspondence with the prominent German-Swiss botanist Karl von Nageli, who was harshly critical of Mendel's ideas and even suggested that he change his experiments from peas (for which his genetic theories were unusually suitable) to hawkweed (which was highly unsuitable), which caused Mendel to become so discouraged that he eventually abandoned his genetic research altogether to become a full-time abbot (Beveridge 1959, Krumbiegel 1957). Mendel died in complete obscurity, and his groundbreaking paper was rediscovered only in 1900, 16 years after his death and 35 years after it was first presented.</p>

<p>Or consider the seminal work on continental drift by Alfred Wegener. Wegener first presented his ideas in 1912, but he suffered greatly from the problems that, first, he did not hold a degree in geology (his PhD was in astronomy/meteorology), and even worse, he did not hold a fully fledged professorship (his position at the time was merely that of a 'tutor'). Hence, his ideas were so widely disbelieved that the American Association of Petroleum Geologists held a conference specifically to discredit them, and the reaction from the geology community was near-universal dismissal, so much so that Wegener stopped publishing work on geology and went back to meteorological research, eventually dying on a meteorological expedition to Greenland. Only 20-30 years later were Wegener's papers revisited and found to have significant merit, so much so that his theories are now accepted as the current paradigm within modern geophysics. (Oreskes 2004) </p>

<p>Or consider the sad story of Nicolae Paulescu, who is now seen as the first true discoverer of insulin, as opposed to Frederick Banting and J.J.R. Macleod, who won the Nobel Prize for their "discovery". Paulescu not only wrote 4 papers about his discovery but also patented it, and Banting and Macleod's Prize-winning work merely confirmed Paulescu's findings and even cited them. Paulescu's work was nevertheless completely ignored by both the scientific community and the Nobel committee, and his pioneering work was not rediscovered until 50 years later. Paulescu's problem seemed to be that he worked at the University of Bucharest, an institution of little prestige relative to the premier standing of the University of Toronto, where Banting and Macleod were based, and that Paulescu was also known for holding anti-Semitic views, which is of course deplorable, but what does that have to do with the merits of one's scientific findings? (Murray 1971)</p>

<p>Now, some of you may be thinking that these are mere anecdotes and do not demonstrate a true pattern. Au contraire: I just happened to pick some of the more sensational stories. In fact, biases have been revealed again and again throughout the scientific process. For example, in an article in Nature, Wenneras & Wold 1997 revealed significant gender bias and nepotism within the peer review process: a matched analysis showed that women had to be 2.5 times more productive than men in order to receive the same competency scores, but that the ratio could be reduced if the woman had a personal affiliation with one of the reviewers. Sandstrom & Hallsten 2008 extended the findings of W&W and found that nepotism within the scientific grant review process was an even bigger problem than W&W had reported. </p>

<p>
[quote]
Further, the cause of this observation is debatable, and many have maintained that it is due to common approaches to methods and interpretation, rather than simple favoritism toward members of the "club".

[/quote]
</p>

<p>And that's merely a distinction without a difference. For, after all, if it really is only a matter of common approaches to methods and interpretation, then one has to ask how one learns these common approaches if one is not a member of the club. Put another way, if you are a member of the 'club', then you will be provided more opportunities to learn these common approaches than somebody who isn't a member. </p>

<p>One facet of this sentiment was expressed in Nature by Herrera 1997, who stated that part of being in 'the club' was being able to write your articles in fluent English, which obviously discriminates against the vast majority of the world's researchers who do not speak English as a native language. Similarly, one should expect that if a dominant paradigm in terms of research methods is being used by certain highly prestigious universities that also happen to control the chokepoints of publication, then it behooves you to attend those universities so that you can learn those research methods and hence learn how to pass the chokepoints. </p>

<p>
[quote]
It is particularly difficult to propose such favoritism in blind reviewed journals. Yes, sometimes it is clear who has written the paper, but most of the time it is not.

[/quote]
</p>

<p>Uh, I would argue that it is particularly easy to propose such favoritism. Let's be honest. Many (probably most) fields of scientific study consist of quite small research communities where most of the researchers know most of the other researchers and, by and large, know who is doing what. For example, when Watson and Crick were researching the structure of DNA, they knew full well that they were racing against Linus Pauling's team, which was attempting to answer the exact same question, and everybody knew that the stakes of the race were nothing short of a Nobel Prize. Look, scientists in any field all go to the same conferences where they present their findings, and when you receive a paper to review that is 'exactly the same' as a conference presentation you attended, it doesn't exactly take a genius to figure out who the authors are. </p>

<p>
[quote]
If her boyfriend reviewed her paper he was profoundly unethical. If the editor found out about it the journal should have retracted the paper and included an editorial explaining how the former reviewer- never to be used again by that or any other reputable journal- had corrupted the process. This would be hugely humiliating for all concerned. If the author had known her lover was refereeing her paper and went along with it then she was profoundly unsuited for an academic career. Certainly it would rule her out for being hired or promoted.

[/quote]
</p>

<p>Maybe it is unethical. But to echo W&W and S&H, whether we like it or not, the effect of nepotism within the sciences is widespread. </p>

<p>Look, like I said before, the scientific community for most research topics is pretty small, such that you oftentimes can't help having some sort of personal relationship (either friendly or unfriendly) with at least one of the reviewers of any of your papers. Like it or not, that's how it is. This is why so many researchers decry the political nature of academia. </p>

<p>The bottom line is this. Let's not kid ourselves; peer review and the scientific promotion system are far from immaculate. It would be nice to believe that, in the world of science, all that matters is the quality of your research. But that's not reality. These systems have numerous and significant sociological biases, such as the bias for prestige, as I'm sure Mendel, Wegener, and Paulescu would agree. In fact, it would be tremendously surprising if they didn't: after all, they are manmade constructs and are hence subject to the same sociological forces as any other manmade constructs (Kuhn 1962). Since this is the system that we have, the best thing for you to do is to have the biases work in your favor, not against you.</p>

<p>sakky - what field do you work in? sociology?</p>

<p>Also, is there any possible relation between the number of article citations (ISI or something of that sort) and academic prestige?</p>

<p>
[quote]
sakky - what field do you work in? sociology?

[/quote]
</p>

<p>I'd rather not say.</p>

<p>But if this topic interests you, I strongly suggest you read The Structure of Scientific Revolutions by Kuhn (1962, repub 1996). In fact, I would argue that this book should be required reading for every educated person, particularly in the sciences, for it shows that science does not progress 'in a vacuum', but rather is a product of sociology and that scientific progress is characterized by sociological paradigms that serve to filter out and dismiss those who are not part of the mainstream. The implication is that if you are not part of 'the club', then your ideas stand a stronger chance of being discredited.</p>

<p>As Max Planck (yes THE Max Planck) once infamously said: "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it. "</p>

<p>Look, I wish none of these things I am saying were true. I wish science wasn't affected by social forces. I wish that gatekeepers within academia didn't care about prestige and nepotism. But, sadly, these things are true, and so it's better that you find out now rather than later. </p>

<p>
[quote]
Also, is there any possible relation between the number of article citations (ISI or something of that sort) and academic prestige?

[/quote]
</p>

<p>It is generally considered (erroneously, in my opinion) that citation rate is a direct consequence of academic quality and hence prestige. Highly cited articles are generally considered to be more "important" (even if those citations are serving to discredit the paper).</p>

<p>Hey, quit picking on Wright State. Several orthos I know went to med school there, and professional athletes from the NFL and MLB fly in from all over the US to have them fix their shoulders and knees so they can return to full contact/play. </p>

<p>Just like anything else, you may have pedigree but you still have to back it up with SKILL. </p>

<p>My son's urologist went to Princeton Med and my daughter's ortho went to Harvard Med and they are practicing medicine in Toledo, Ohio and do tons of pro bono and Doctors without Borders along with their regular work.</p>

<p>So what? The only way I found out they went to Ivies was by Googling them; they sure don't advertise it. They are both great docs and were probably great students/people before they went to Ivies, and they are still great people now. I doubt the Ivies made them that way; I'm sure they were already that way as human beings.</p>

<p>Hey, read the thread. This is about MD/PhD admissions...we're not talking about practicing medicine, we're talking about research.</p>

<p>Does your son's urologist practice out of his garage?</p>

<p>
[quote]
Hey, read the thread. This is about MD/PhD admissions...we're not talking about practicing medicine, we're talking about research.

[/quote]
</p>

<p>I did read the thread, which has also devolved into a discussion of prestige, with Wright State used as the counter-example to Harvard by multiple people in multiple threads.</p>

<p>And no, he does not practice medicine out of his garage. Though I suspect you may be living in one if you take out gazillions of dollars in student loans chasing an MD/PhD dream at a pedigree school and a guaranteed yellow brick road to success that only exists in fairy tales. </p>

<p>Either way, good luck with that.</p>