<p>I have always felt that the College Board's assertion of a correlation between SAT or ACT scores and college performance was questionable. Now a long-term study supports my position.</p>
<p>The Chronicle of Higher Education (Feb. 28, 2014) reports on a long-term study of more than 123,000 students over twenty years, which found that students who attended test-optional colleges and did not submit SAT or ACT scores did just as well in college as those who did submit scores. The only strong predictor of college performance was high school GPA. The study's bottom line was that "testing is truncating pools of applicants who would be successful if admitted."</p>
<p>The main functional effect of standardized testing is as a deterrent against, or check on, high school grade inflation and course derigorization. That is, the benefit may be less for evaluating an individual student (there is considerable “noise” in how test scores relate to academic ability and motivation) than for the “system”: high schools are deterred from, or exposed for, grade inflation and course derigorization, and universities are given a reason to be less suspicious of high school credentials than they would otherwise be.</p>
<p>In some other countries, where high school course content and grading are better standardized across schools, external standardized testing is not required for university admission. At the other end of the scale, there are countries where high school credentials are evidently so inconsistent or untrustworthy that university admission is based entirely on a single high-stakes standardized test.</p>
<p>I haven’t seen the study, but I’m not sure how much we can infer from comparing the college performance of score submitters and non-submitters at the same test-optional college. I distinctly recall an admissions officer at one prominent test-optional college acknowledging that they simply assume applicants who withhold test scores do so because their scores aren’t very good. He went on to say that withholding scores “puts that much greater weight on the rest of the application.” The implication, though he didn’t quite come out and say it, is that applicants who don’t submit test scores need exceptionally strong applications in other respects. That could mean, for example, that the HS GPAs of enrolled students who didn’t submit scores are on average higher than those of test-submitters (e.g., 3.9 unweighted v. 3.7 unweighted). If so, you’re not really comparing like groups.</p>
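<p>To see why that matters, here is a toy simulation (Python, with parameters I invented purely for illustration; nothing below comes from the study). In this model test scores genuinely help predict college GPA, yet the admitted non-submitters, who had to clear a higher GPA bar, end up performing about as well as the submitters:</p>
<pre>
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical applicant pool (toy scales, invented parameters)
hs_gpa = rng.normal(3.5, 0.3, n)
sat = rng.normal(1500, 300, n)          # old 2400-point composite
col_gpa = 1.0 + 0.6 * hs_gpa + 0.0002 * sat + rng.normal(0, 0.3, n)

# Applicants with weak scores withhold them at a test-optional college
withholds = sat < 1800

# Admissions compensates: non-submitters must look stronger elsewhere
admit_submit = ~withholds & (sat > 2000) & (hs_gpa > 3.5)
admit_nosubmit = withholds & (hs_gpa > 3.8)

print(col_gpa[admit_submit].mean())     # ~3.67
print(col_gpa[admit_nosubmit].mean())   # ~3.66
</pre>
<p>Both admitted groups land at nearly the same average college GPA even though scores matter in the model, so “similar outcomes” between submitters and non-submitters doesn’t by itself tell us whether the test adds predictive value.</p>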
<p>The study is at <a href="http://www.nacacnet.org/research/research-data/nacac-research/Documents/DefiningPromise.pdf">http://www.nacacnet.org/research/research-data/nacac-research/Documents/DefiningPromise.pdf</a>. We discussed it earlier this month. They compare some groups with similar HS GPAs and some groups with notably different HS GPAs. Several other studies have looked at how SAT score correlates with college GPA and graduation rate. All the ones I’m aware of that controlled for HS GPA, HS quality / course rigor, and parents’ income level found that SAT I M+V added little predictive power beyond those factors. However, looking at SAT I M+V without controls for other factors does show a notable correlation with HS and college GPA, graduation rate, parents’ income level, and HS quality, among other things.</p>
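<p>That “adds little beyond other factors” pattern is easy to reproduce in a toy model. Here is a sketch (synthetic data and parameter choices of my own, not fitted to any of these studies) in which one latent factor drives HS GPA, family income, and SAT alike; SAT alone looks strongly predictive, but adds little once the other two are controlled for:</p>
<pre>
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# One latent "preparation" factor drives every observed variable
prep = rng.normal(0, 1, n)
hs_gpa = prep + rng.normal(0, 0.5, n)
income = prep + rng.normal(0, 0.8, n)
sat = prep + rng.normal(0, 0.7, n)
col_gpa = 0.8 * prep + rng.normal(0, 0.6, n)   # outcome

def r_squared(predictors, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

print(r_squared([sat], col_gpa))                  # ~0.43: SAT alone looks strong
print(r_squared([hs_gpa, income], col_gpa))       # ~0.54: controls alone
print(r_squared([hs_gpa, income, sat], col_gpa))  # ~0.57: SAT adds little
</pre>
<p>I’m not claiming real admissions data has this exact structure; the point is only that a variable can correlate strongly with an outcome and still add almost nothing once its correlates are in the regression.</p>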
<p>This has come up in the law school context, too. It’s hard to get around the problem that students at any given college usually fall within a pretty narrow SAT range, and where you have low variability, you’ll find low correlations (the classic restriction-of-range problem). If you threw some 1700-scorers into Princeton, you’d see more correlation. I’d be interested to look specifically at schools with broader SAT ranges, like many public flagships.</p>
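<p>A quick way to see the restriction-of-range effect (again a toy simulation with invented numbers, not real score data): the same underlying relationship produces a much weaker observed correlation once you condition on a narrow admitted band.</p>
<pre>
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Hypothetical population with a true SAT/college-GPA correlation near 0.5
sat = rng.normal(1500, 300, n)                  # old 2400-point composite
col_gpa = 2.8 + 0.0008 * (sat - 1500) + rng.normal(0, 0.4, n)

print(np.corrcoef(sat, col_gpa)[0, 1])          # full range: ~0.51

# Restrict to a narrow admitted band, as at a highly selective college
admitted = sat > 2100
print(np.corrcoef(sat[admitted], col_gpa[admitted])[0, 1])   # ~0.20
</pre>
<p>That is why a weak within-college correlation understates whatever predictive value the test has across the full applicant pool, and why schools with broader enrolled SAT ranges would be the more informative place to look.</p>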