SATs and MCAT

<p>While his SAT score isn’t a 100% indicator, I’m starting to think that this thread might be.</p>

<p>just because you’re going to improve your SAT score by x points doesn’t mean that automatically your MCAT will be x points higher. that makes absolutely no sense.</p>

<p>^ 41</p>

<p>The correlation between SAT and MCAT is probably due to their g loading. To the extent that they both are partly measures of g, then they will be correlated. To the extent that they both measure other, and different, things, they will diverge. The MCAT includes tests of specific knowledge, so its results reflect, in part, how much you learned about these subjects. Of course, nearly everyone who takes the MCAT has taken the prerequisite courses, so the distribution of scores on the MCAT may track more closely with g among those who actually take the test, since g may be a large part of the differences between the subjects.</p>
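<p>To make the shared-g-loading point concrete, here is a minimal simulation sketch (my own illustration; the loadings of 0.8 and 0.6 are invented, not estimates for either test). Under a single-factor model, each standardized score is the test’s g loading times g plus unique noise, and the expected correlation between two such tests is simply the product of their loadings.</p>

<pre><code>
# Minimal single-factor sketch: two standardized tests that both load on g.
# With loadings a and b (and appropriately scaled unique parts),
# corr(test_1, test_2) = a * b. The loadings 0.8 and 0.6 are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
g = rng.standard_normal(n)

load_1, load_2 = 0.8, 0.6
test_1 = load_1 * g + np.sqrt(1 - load_1**2) * rng.standard_normal(n)
test_2 = load_2 * g + np.sqrt(1 - load_2**2) * rng.standard_normal(n)

print(np.corrcoef(test_1, test_2)[0, 1])  # about 0.48, i.e. roughly 0.8 * 0.6
</code></pre>

<p>The same arithmetic also shows why the correlation drops as either test adds more non-g content: shrinking either loading shrinks the product.</p>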

<p>The SAT in the older studies was a different test, and, according to the College Board, more g loaded than the current SAT. It is unclear whether the correlations will be the same now.</p>

<p>You might find this interesting.</p>


<p>As a practical matter, for an individual, the only way to pursue this is study for the MCAT, take some practice tests, and see how you do. If you cannot get a decent score on the practice tests, then start worrying about whether medicine is for you. You really do need a good score to be competitive in med school admissions, and the MCAT score really does predict the likelihood of success in med school.</p>

<p>^ Is there any evidence showing that the verbal section of the SAT, rather than the math section, measures the g factor better? I vaguely remember reading something like that.</p>

<p>I also heard about the so-called “Mozart effect”: learning to play a musical instrument (esp. piano) could possibly increase a person’s g.</p>

<p>Well, we may have ventured into the potentially controversial area of nature vs. nurture, and of how useful IQ or the g factor really is for measuring a person’s capability.</p>

<p>I don’t know of studies relating g to MCAT verbal performance, but then I have not looked that hard to find them. There are quite a few that look at how the different sections of the MCAT predict med school or USMLE performance. The results seem to indicate that the value of the subsections varies depending on the task at hand. However, they are unanimous that the writing section tells you nothing about a student’s future performance. I am not sure why they continue to include it. It is apparently worthless.</p>

<p>If I recall correctly, the science sections are better than the verbal for predicting performance on the science part of medical school. The verbal is at least as good as the science, perhaps better (again going by recollection, without recently reviewing these papers) for performance on the clinical rotations. </p>

<p>The SAT verbal has shown higher g loading than the math, at least on older versions of the SAT. I do not know whether that is the case with the current version. When they redesigned it there was a lot of political pressure to make it not an intelligence test. </p>

<p>Unfortunately, in highly cognitive fields, high g is probably useful. Physicians come almost exclusively from the upper few percent of the population in g. This is probably largely because one needs high g to get into medical school. Whether one needs high g to practice medicine is hard to know, since there are no low-g people in the field. From what is involved, I would assume g is very useful.</p>

<p>I thought the “Mozart effect” was pretty well discredited for overall improvements in cognition. Anyone know?</p>

<p>afan: Kaplan always told us that writing and verbal were correlated with 3rd/4th year success. Of course, it was a test prep company, so who knows.</p>

<p>Here we go:
“The Predictive Validity of the MCAT for Medical School Performance and Medical Board Licensing Examinations: A Meta-Analysis of the Published Research”
Academic Medicine, January 2007, Volume 82(1), pp. 100–106</p>

<p>“Two separate studies indicated that r = 0.0 for the writing sample subtest with medical school performance for both the basic sciences (r = -0.13; 95% CI, -0.30 to 0.05) and clinical (r = 0.07; 95% CI, -0.05 to 0.19) dependent variables.”</p>

<p>"The writing sample subtest correlations with medical board licensing examinations are near zero. "</p>

<p>“the random-effects analyses for the biological and physical sciences subtests on the USMLE Step 1 were nearly identical at r = 0.48 (95% CI, 0.41–0.54) and r = 0.47 (95% CI, 0.43–0.51), respectively, and at r = 0.27 (95% CI, 0.19–0.35) for the verbal reasoning subtest. The writing sample showed low predictive validity at r = 0.08 (95% CI, 0.02–0.14).”</p>

<p>“The major findings of the present study are as follows:

The writing sample has little predictive validity for both the medical board licensing exam and medical school performance measures.”</p>

<p>For those not yet in medical school, you should be acquainted with a principle that will become too familiar soon: “Just because something is useless does not relieve you of the obligation to do it”. In this case, the writing sample seems to be of no value for predicting performance in medical school, or on the licensing exams (the putative purpose of the MCAT). However, you still have to take the MCAT, and you are better off doing well than poorly on the writing sample.</p>

<p>The verbal did better than the science for predicting grades on clinical clerkships, but neither did very well.</p>
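<p>A quick way to read those quoted correlations is to square them: r² is the fraction of variance in the outcome (here, Step 1 scores) that a subtest accounts for. A small sketch using the numbers quoted above:</p>

<pre><code>
# Convert the correlations quoted above (MCAT subtest vs. USMLE Step 1)
# into variance explained (r squared).
subtests = {
    "biological sciences": 0.48,
    "physical sciences":   0.47,
    "verbal reasoning":    0.27,
    "writing sample":      0.08,
}
for name, r in subtests.items():
    print(f"{name:20s} r = {r:.2f}  ->  r^2 = {r**2:.1%}")
# Roughly 23% and 22% for the science subtests, about 7% for verbal,
# and under 1% for the writing sample.
</code></pre>

<p>So even the best subtests leave most of the variance in Step 1 performance unexplained, and the writing sample explains essentially none of it.</p>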

<p>Huh. Well, there it is. -1 for Kaplan.</p>

<p>But again, Kaplan does not design the MCAT. Their job is to help people improve their scores on the test as it is. Good writing scores are almost certainly better, for admissions purposes, than poor ones. Perhaps Kaplan was trying to be more encouraging than “this section is useless, but people have to take it, so here is how to maximize your score.”</p>

<p>When it comes to g loading, as a general rule, verbal scores (on any test) are the most predictive of g/IQ. While I have not seen any specific studies on the MCAT vs. g, my guess would be that the Verbal subsection is the most correlated with g (hence it being the most difficult score to improve). Obviously, for pre-med students, the science sections are the ones that have typically had the most time devoted to them and, as a result, they are more content-based. It is notable, however, that the MCAT (as compared with, for example, the PCAT) is relatively problem-solving- and verbal-heavy on its PS and BS sections. As a result, its g loading is likely quite high. Generally, tests designed to measure g/IQ have reliability indexes in the 0.7–0.9 range, which means you are unlikely to change your score much even if you take the test repeatedly (on different forms, of course).</p>
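<p>On the reliability point, one way to see what a 0.7–0.9 reliability implies for retesting is the standard error of measurement, SEM = SD × sqrt(1 − reliability). A minimal sketch in standardized score units (the 0.7–0.9 range is from the post above; everything else is just illustration):</p>

<pre><code>
# Standard error of measurement: SEM = SD * sqrt(1 - reliability).
# SEM approximates the spread of observed scores around a test-taker's
# "true" score; roughly two-thirds of observed scores fall within +/- 1 SEM
# (assuming normally distributed error). SD is in standardized units here.
import math

sd = 1.0  # illustrative: one standard deviation of the score scale
for reliability in (0.7, 0.8, 0.9):
    sem = sd * math.sqrt(1 - reliability)
    print(f"reliability {reliability:.1f}: SEM = {sem:.2f} SD")
# reliability 0.7 -> SEM ~0.55 SD; 0.8 -> ~0.45 SD; 0.9 -> ~0.32 SD
</code></pre>

<p>So even at the high end of that reliability range, an individual retest score can plausibly drift a third of a standard deviation either way without any real change in ability.</p>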

<p>Some interesting related articles:</p>

<p>SAT vs. MCAT performance:</p>


<p>Effects of MCAT coaching (or lack thereof):</p>

<p>With respect to the areas tested by the MCAT (incl. g):</p>

<p>Great post. Thanks. </p>

<p>Could you include the citations for those references?</p>

<p>The finding of little impact of coaching is consistent with high g loading. It is also depressing to think that a large part of the studying people do for the test may be fruitless. </p>

<p>That being the case, I suspect it still is worthwhile for individuals to study for the MCAT. Unclear whether they need a Kaplan-style course, but it might help organize the studying. Hidden within the low average payoff from coaching there may be some people whose scores improve a lot. If you are one of these, then you could do yourself quite a favor by prepping. Also, for those who do not get admitted, there is comfort in knowing they did all they could to support their applications.</p>
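<p>To illustrate the point about a small average payoff hiding large individual gains, here is a toy simulation. All of the numbers (the 15% share of big responders, the gain sizes) are invented purely for illustration, not taken from any of the coaching studies:</p>

<pre><code>
# Toy model: most students gain almost nothing from prep, a minority gains a lot.
# The mean gain still looks modest even though the responders benefit substantially.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
responder = rng.random(n) > 0.85           # ~15% "big responders" (invented share)
gain = np.where(responder,
                rng.normal(4.0, 1.0, n),   # invented large gains
                rng.normal(0.3, 1.0, n))   # invented near-zero gains

print(f"mean gain: {gain.mean():.2f} points")
print(f"share gaining 3+ points: {(gain >= 3).mean():.1%}")
</code></pre>

<p>The average comes out at under a point, yet more than a tenth of the simulated students gain three or more; a study reporting only the average would call the coaching ineffective.</p>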

<p>For the same reason that the sponsor does not want to discuss the g loading of the SAT, I doubt you would find much interest from the MCAT people in the topic. Intelligence tests have such a bad connotation right now, that there is no percentage in admitting that your test is in part an IQ test. Even if the test is clearly valid for predicting outcomes.</p>

<p>The 2nd one (Montague JR) is “A twelve-year profile of students’ SAT scores, GPAs, and MCAT scores from a small university’s premedical program”
If you do a search with the listed title and first author, those abstracts will come right up. My university evidently does not subscribe to the relevant journals, so I cannot read the full articles, but if anyone’s does, I’d bet there’s a lot of interesting data in them.</p>

<p>As far as coaching, one of the articles mentions that low- to mid-range students (by GPA) and those with good basic reading skills tended to improve the most. The implication is that the improvement was one of content in students with high g: essentially, an increase in the utilization of one’s potential. A high-GPA individual has likely already maxed out his/her content-based learning and, as a result, is less likely to benefit from further study. Someone with high potential but a low GPA may be a person who is naturally smart but cut class a lot; in that case, a higher MCAT is likely to result from additional study, since the person never reached his/her potential aptitude for the subject in the first place. Furthermore, people with lower reading (verbal) aptitude to begin with are going to have a much tougher time answering content questions regardless of how much they study or prepare.</p>