College Applicants Sweat The SATs. Perhaps They Shouldn't

<p>Standardized tests are an important consideration for admissions at many colleges and universities. But one new study shows that high school performance, not standardized test scores, is a better predictor of how students do in college.</p>

<p><a href="http://www.npr.org/2014/02/18/277059528/college-applicants-sweat-the-sats-perhaps-they-shouldn-t">http://www.npr.org/2014/02/18/277059528/college-applicants-sweat-the-sats-perhaps-they-shouldn-t</a></p>

<p>That’s why many schools take a holistic approach. Some schools put more emphasis on GPA and class rank.</p>

<p>The study referenced in the story compared performance at test-optional schools between kids who submitted tests and students who did not. Both groups had the same outcomes.</p>

<p>I thought this article was very interesting. I do wish they’d talked about SAT/ACT prep classes and how that biases test scores. I would guess - and would love to see numbers - that kids who take prep classes do in fact do better on the tests. Which makes me wonder what’s really being measured. Is it the ability to pay for a prep class or tutor?</p>

<p>This is old news. If you get just one indicator of success, go with the high school GPA because it shows four years of work, not a single morning’s worth. In my office, if there’s a gap between test and GPA, we look at the GPA (assuming, of course, the transcript shows a typical college prep curriculum). </p>

<p>As far as I know, this is old news republished with a new sample data set. Until schools dial down the importance of scores in admissions, students should still “sweat it”. I agree wholeheartedly about test prep. Studies have shown a significant relationship between scores and family income.</p>

<p>“I would guess - and would love to see numbers - that kids who take prep classes do in fact do better on the tests. Which makes me wonder what’s really being measured. Is it the ability to pay for a prep class or tutor?”</p>

<p>It’s clear that prepping increases scores on these tests. If you are concerned about economic bias, before you start crying foul about expensive prep classes and tutors, you need to ask the right question. The right question to ask is, are expensive preps more effective than inexpensive/free preps?</p>

<p>You can prep for these tests on your own with a book or two. I spent about $50 total on all my daughter’s college test prep–PSAT, SAT, and three SAT IIs. I don’t buy the idea that expensive test prep classes are necessary (or even particularly helpful for the stronger students). Our library has test prep books, and they can probably be bought used for next to nothing or even donated from kids who are done with them, so we could have done it for considerably less if that cost had been a concern for our family.</p>

<p>For students who aren’t self-motivated, if the only way they are going to study is if mommy and daddy pay a lot of money for a class or a tutor or any other kind of academic babysitting, well then yes, you’ll see a big difference between those who can afford that and those who cannot.</p>

<p>I agree that self-study is more helpful than the prep classes.</p>

<p>Not crying foul, just curious. There are many parents where I live who do pay for the expensive prep courses. Not something we could afford, so we bought the SAT prep manual for my younger son. He’s very organized and committed to x number of pages per evening. He did bring his test scores up doing that. I do think for kids who aren’t as disciplined (read: my older son) that method would not have worked. In fact, now that I think of it, my younger son used his brother’s pristine book :)) An affordable prep course option would have been appreciated but was nonexistent, at least where I live.</p>

<p>I’d also be interested to know the improvement between kids who self-study vs. kids who take a course. </p>

<p>Acceptance into a college is a prerequisite for doing well there. Therefore, if an applicant is interested in a selective college they should strive for a good score on the SATs regardless of the test’s predictive ability.</p>

<p>A holistic approach is wise. If one factor is overly emphasized, colleges may miss the best students. Some of my D’s camp friends (CTY camp) do very well on standardized tests but have poor GPAs because they don’t do their schoolwork, showing they don’t follow through on the work they’ve committed to. On the other hand, at grade-inflated schools like my D’s, many people get As in AP classes but fail the national exam. That says something about the school’s grading system: it’s not to be trusted.</p>

<p>Yes, SAT prep schools are prevalent in some areas of the country. That makes those high SAT scores less impressive.</p>

<p>Not exactly news, as studies by universities have found that high school grades beat the SAT or ACT in predicting college grades. However, Harvard has found that achievement-based tests (AP, IB, SAT subject) and the writing sections of the SAT and ACT are even better predictors: “Guidance Office: Answers From Harvard’s Dean, Part 2” (The New York Times).</p>

<p>There are two uses of test scores, though. This article only talks about admissions, where scores have always been only one factor, and at many schools not a “very important” one.</p>

<p>For most merit scholarships, though, it’s the primary factor that gets you the money. There just isn’t any easy way to sift through 10,000+ applications and figure out who gets the scholarships. Even if we convinced every college to remove the requirement for admission, there is still huge pressure to do well on the tests for $$$.</p>

<p>This may be old news to the CC crowd, but it is news for the population at large. Max was brimming with excitement to tell us about the NPR story he heard about standardized testing. :stuck_out_tongue: </p>

<p>I wish the story had touched more on the $2 billion test-prep industry. Valid or not, the ACT and SAT aren’t going anywhere because there is far too much money to be made. Standardized tests have evolved into an expensive (for families) gatekeeping tool.</p>

<p>They do make a few comments about what is being measured in the study, saying,</p>

<p>“Crossing the Finish Line (p. 114) found that when controls were added for the quality of high schools, the predictive power of SAT/ACT testing disappeared, and often had a negative correlation with college performance.”</p>

<p>As I recall, the Geiser studies found that SAT M+V scores predicted students’ income level better than they predicted success in college, at least once you add controls for things like income level and quality of HS. This relates to why different studies come to different conclusions about the predictive ability of SAT M+V scores. If you look at the scores alone, they do show a significant correlation with college performance. However, if you also look at HS GPA, HS curriculum, quality of HS, income level, the rest of the application, and so on, then SAT M+V adds little predictive ability beyond the other variables.</p>
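<p>The incremental-validity point above can be illustrated with a toy simulation (synthetic data, not the Geiser dataset; all coefficients here are made up for illustration): an SAT-like score constructed mostly from HS GPA and family income correlates substantially with college GPA on its own, yet adds almost nothing once those variables are already in the regression.</p>

```python
# Toy sketch of incremental predictive validity (synthetic data, not Geiser's).
import numpy as np

rng = np.random.default_rng(0)
n = 5000
hs_gpa = rng.normal(3.0, 0.5, n)        # high school GPA
income = rng.normal(0.0, 1.0, n)        # standardized family income
# SAT-like score: largely a proxy for HS GPA and income, plus noise
sat = 500 + 120 * (hs_gpa - 3.0) + 60 * income + rng.normal(0, 40, n)
# College GPA driven by HS GPA and income only (by construction)
college_gpa = 0.8 * hs_gpa + 0.1 * income + rng.normal(0, 0.3, n)

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_sat = r_squared(sat, college_gpa)
r2_base = r_squared(np.column_stack([hs_gpa, income]), college_gpa)
r2_full = r_squared(np.column_stack([hs_gpa, income, sat]), college_gpa)

print(f"SAT alone:        R^2 = {r2_sat:.3f}")   # sizable by itself
print(f"HS GPA + income:  R^2 = {r2_base:.3f}")
print(f"  ... + SAT:      R^2 = {r2_full:.3f}")  # barely moves
```

<p>The point of the sketch: a variable can look strongly predictive in isolation and still add almost no explanatory power once the things it proxies for are controlled, which is one way studies of the same scores can reach opposite conclusions.</p>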

<p>The problem to me is how to measure “success in college.” Simple GPA seems like a flawed measure to me. Might it be possible, for example, that many high-scorers go on to very competitive colleges where they have tougher competition and face higher expectations, thus getting lower grades? That’s sort of what Malcolm Gladwell argues when he tells high scorers to go to Hartwick instead of Harvard. And are we comparing fashion merchandising majors to engineering majors? Who is going to have a lower average GPA? </p>


<p>Harvard was just looking at Harvard students, and Geiser was just looking at UC students (granted, UC system-wide, not just one campus), so the effect of students choosing different schools is nil (in the Harvard case) or only modest (in the UC case).</p>

<p>However, the effect of major selection is not considered; the Geiser studies find that the SAT math tests (both reasoning and subject) had no predictive power on college GPAs. The likely explanation is that students self-sort into majors based on their math abilities: students who are weaker in math, and who do relatively worse on math tests, are more likely to choose math-light majors where that weakness will not be much of a penalty on their GPAs.</p>

<p>A slightly different take is a study at the University of Oregon, which found that doing poorly on SAT math predicted poor performance as a math or physics major, though the predictive power for other majors, or of the other SAT sections, was much weaker:
<a href="http://arxiv.org/abs/1011.0663">http://arxiv.org/abs/1011.0663</a></p>

<p>The Geiser studies did consider major selection. See tables 7 and 17 at <a href="http://cshe.berkeley.edu/sites/default/files/shared/publications/docs/ROPS.GEISER._SAT_6.13.07.pdf">http://cshe.berkeley.edu/sites/default/files/shared/publications/docs/ROPS.GEISER._SAT_6.13.07.pdf</a>. The tables and the overall conclusion of the Geiser studies were that SAT M+V has little predictive power (not no predictive power), notably less than HS GPA. Note that the regression coefficients for SAT I M+V are a small fraction of those for HS GPA across all majors. Math and physical science majors were influenced more by SAT II math than other majors were, but still by a small fraction of the HS GPA effect. The Oregon study and Gladwell appear to consider SAT I M+V alone, rather than how much predictive value math SAT I adds beyond other areas like HS GPA or the subject tests, leading to different conclusions, as discussed in my earlier post.</p>

<p>Without discussing the relative merits of this latest study or the (debatable) positions and mercenary research of Geiser et al, it remains that this type of story is hardly as constructive as one might think. Even if the conclusion were that the SAT has little predictive value in terms of college success, highly selective colleges will continue to rely on the scores and are not about to abandon the one metric that helps them qualify the vastly disparate world of HS GPA. The entire discussion is a red herring. It is OBVIOUS that a strong performance in high school should tell volumes about a candidate. The real problem is ascertaining how hard it was to “present” a strong performance, especially when the overwhelming majority of US high schools are simply mediocre to g*ddarn awful.</p>

<p>The bottom line is also rather simple: the SAT is an integral part of the college application and one of the easiest elements to maximize. If people would spend a tiny fraction of the time spent (or wasted) in K-12 preparing for a simple test, they would realize the … opportunities offered by standardized tests.</p>

<p>All that morons such as Schaeffer are accomplishing is “helping” the unsuspecting lose plenty of opportunities, and this for their own sole benefit, ego, and political agenda.</p>