Question about "Test Optional" schools

@merc81 My current senior has applied to, and been admitted to, a number of test-optional schools, largely in the Midwest. I went back to check my vague memory, and yes, one of them requires scores from students enrolling. I have not gone back to see whether any others say anything about requiring submission for statistical or other purposes. Curious . . . .

Some test-optional schools require scores after admittance for placement purposes. A very small set of schools won’t take any standardized scores at any point. I encourage people (especially those who see this as a money grab) to read some of the blogs Wake Forest University published about going test-optional. Most test-optional schools require something extra: another essay, more LORs, and so on. Wake has studies showing that students who are strong achievers throughout four years of high school tend to be strong achievers throughout college, even if they don’t do wonderfully on a standardized test.

Given the strong correlation between SAT scores and income, test-optional policies give kids who have worked hard and earned good grades, but who don’t have a ton of resources available to them, a more level playing field against kids who can afford expensive SAT prep classes, private tutoring, and multiple retakes of the exam.

@merc81
I’m curious: would you make the same calculations for Amherst that you made in post #17? I know you’ll do your research to figure out my point.

Ummm… Amherst isn’t test optional.

That (#23) simplifies the math.

Amherst

Reported: 1350-1550
Actual: 1350-1550

Have a look at Amherst’s common data set and tell me the percentage of students for whom they report SAT scores. Then compare that to some of the test-optional schools. Interesting, eh?

@arcadia, are you looking at the ACT also?

The CDS doesn’t have a category for the percentage of students not reporting standardized scores. In Amherst’s case, the figures for the SAT (53%) and the ACT (49%) indicate that potentially every student submitted one or the other or both. That’s all their CDS reveals. The rest must be inferred from their requirement that all applicants submit testing. This is not the case for test-optional schools, so different inferences must be made @arcadia.

However, the 102% total figure in Amherst’s case does have unclear implications and does not appear to be entirely reliable.

102% does mean that at least 2% submitted both SAT and ACT, right?

Mathematically, that is what it could mean. The figure must then be examined for plausibility with respect to the general likelihood of students taking and submitting both exams. If “at least” is an aspect of the reported figure, that would seem to introduce unwarranted imprecision.
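
For what it’s worth, the “at least 2%” arithmetic is just inclusion-exclusion. Here’s a minimal sketch in Python, assuming both percentages describe the same enrolled class and each student is counted once per test submitted (the function name is mine, purely for illustration):

```python
# Inclusion-exclusion: if the two shares sum to more than 100%, the
# excess is a lower bound on the share that must have submitted both.
def min_overlap(pct_sat: float, pct_act: float) -> float:
    """Smallest possible share of students submitting both tests."""
    return max(0.0, pct_sat + pct_act - 100.0)

# Amherst's CDS figures: 53% SAT, 49% ACT.
print(min_overlap(53, 49))  # -> 2.0, i.e., at least 2% submitted both
```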

Middlebury isn’t a test-optional school. Neither are some of the others you mention. Students must submit the SAT I, the ACT, or 3 SAT IIs. My point is that Middlebury is reporting SAT I scores for 68% of matriculants and ACT scores for 44%. Amherst, which isn’t subject to your “adjustments” mentioned in post #17, is reporting SAT I scores for a much smaller percentage of matriculants (53%). My guess is that Amherst is cherry-picking scores: if a student submits both SAT I and ACT scores, they’re reporting only the higher test score. Look at the numbers for Williams, Swarthmore, and Pomona. Presumably, the applicant pools for these schools overlap with Amherst’s, yet Amherst reports a much lower percentage of SAT I scores than the others. Just food for thought.

I’m sure some do. My kid took each test twice. She sent only the ACT to most schools (including Amherst), as it was a somewhat better score and meant she didn’t have to bother with the SAT Subject Tests, but a few of her schools got her SAT as well because she used her four free reports (before knowing the score). So some schools got both.

Maybe, or maybe Amherst is getting more students from ACT-dominant areas of the country. (Actually, the whole country is ACT-dominant now, technically, as more students take it overall than take the SAT.)

Here’s more test-score detail on Amherst than they have in the CDS, plus the actual # of students from each state: https://www.amherst.edu/media/view/625467

I’m not sure Midd provides that level of detail. http://www.middlebury.edu/admissions/start/profile

@arcadia: Middlebury is not publicly transparent with respect to the number of students who report neither the SAT I nor the ACT. The figure I chose for my calculation, 16%, was based on research on other test-flexible colleges. With respect to “cherry-picking” at other schools, the organization to which a school is reporting ordinarily sets the standards for what constitutes an acceptable submission.
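
To make the 16% adjustment concrete, here’s a minimal sketch of how a middle-50% range can shift when non-submitters are excluded. The class size, the score distribution, and the assumption that all non-submitters sit at the bottom of the class are invented purely for illustration; only the 16% figure comes from the post above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical class of 500 students with roughly SAT-shaped scores.
scores = np.sort(np.clip(rng.normal(1400, 120, 500), 400, 1600))

# Extreme assumption, for illustration only: the 16% who don't
# submit are exactly the lowest scorers in the class.
n_withheld = int(0.16 * len(scores))
submitted = scores[n_withheld:]

for label, s in [("Reported (submitters only)", submitted),
                 ("Actual (whole class)", scores)]:
    p25, p75 = np.percentile(s, [25, 75])
    print(f"{label}: {p25:.0f}-{p75:.0f}")
```

The reported 25th percentile lands noticeably above the actual one, which is the whole point of the adjustment: a “reported” range at a test-optional or test-flexible school is a range over submitters, not over the class.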

Why would you think these schools misrepresent their numbers?

They’ve learned scores are not the be-all and end-all. The rest of what an applicant presents is equally or more important, and it sure seems to work, considering (a) the similar college records of those who submit and those who don’t, and (b) that many of these schools are still rejecting top scorers who don’t fit what they look for.

They don’t assume that not reporting means bad scores. It can mean lopsided scores, or scores that simply don’t reflect the level at which the kid actually performs. We’re talking holistic colleges, and anyone should know that means a whole lot more than scores.

It is incorrect that test scores don’t have predictive value with respect to college success. Consider, for example, this study from U.C. Santa Cruz (UCSC): http://senate.ucsc.edu/committees/cafa-committee-on-admissions-and-financial-aid/cafa-admissions/comprehensive-review/SATGPA.pdf. As shown in Table 2, the addition of SAT I and SAT II to high school GPA increases the explained proportion of the variance in college 1st-year GPA from 11.9% to 18.1% at UCSC (and from 15.4% to 22.3% across all UCs).

Moreover, if colleges don’t believe the test scores have value, why collect them from anyone? Alternatively, they could collect them from everyone and just choose to give them no weight.* Colleges want to have their cake (accept students who don’t feel their scores represent their ability) and eat it too (report higher scores).

The vague Common Data Set and federal IPEDS reporting requirements for SAT and ACT test scores contribute to the problem, since, for test-optional schools, it isn’t clear what proportion of students took both tests, and which took neither.

*In which case, the student population would have test scores that deviated from the general population only through their natural correlation with high school GPA, about r = 0.3.
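
To unpack the “explained proportion of the variance” figures from the UCSC study above: adding SAT to a regression that already contains HS GPA raises R². Here’s a toy sketch with entirely made-up coefficients (nothing below is fitted to the actual UC data; it just reproduces the qualitative pattern in Table 2):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Synthetic students: HS GPA and SAT are correlated, and each carries
# some independent signal about first-year college GPA.
hs_gpa = rng.normal(0, 1, n)
sat = 0.5 * hs_gpa + rng.normal(0, 1, n)
college_gpa = 0.3 * hs_gpa + 0.2 * sat + rng.normal(0, 1, n)

def r_squared(predictors, y):
    """Proportion of variance in y explained by OLS on the predictors."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print(r_squared([hs_gpa], college_gpa))       # HS GPA alone
print(r_squared([hs_gpa, sat], college_gpa))  # HS GPA + SAT: higher R^2
```

The second R² is larger because the SAT adds information not already contained in HS GPA, which is exactly the 11.9% to 18.1% jump the study reports.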

Your link draws on data from 1996-99 and 2001, and looks only at freshman GPA. Even the summary leaves open questions.

This is straight from the opening paragraph of the cited article:

And this was one of the cited correlations:

And then this in the conclusion (bolding emphasis mine):

So while I agree this article is not saying standardized test scores have no predictive value, it does appear to be saying that high school GPA is more important than the SAT, especially if you are trying to enroll students from a lower socioeconomic level who have done well in high school. And if you want a better standardized-test predictor, use SAT II tests instead of the SAT I.

I am someone who benefited from high SAT scores, but I also recognize that my aptitude is more a knack for figuring out multiple-choice questions than anything else (I scored highly on the ASVAB, including on sections I had no IDEA what they were about). I don’t want to see smart kids who’ve worked hard excluded from colleges that are good fits for them purely because they didn’t get strong SAT prep.

With respect to the standards mentioned in #32: IPEDS specifically instructs colleges to report both ACT and SAT scores if students submit both (so yes, it could add up to more than 100%), and to report the scores that were used in the admission decisions (so they could be superscored or single-sitting, depending on the college). The CDS does not provide equivalent guidance in its reporting standards.

Yes, it does say that.* But so what? I certainly was not suggesting that test scores be used instead of GPA. What I said was that the addition of test scores increased the ability to predict college performance.

Again, I never made any comments about the relative value of SAT I vs. SAT II. They are, after all, both tests.**

*Although, the SAT II better explained freshman GPA across the UC schools than high school GPA did (16.0% vs. 15.4%).

**And they are themselves highly correlated (r = 0.816 from Table 1).

Again: freshman GPA. This matters more to CC than to adcoms. And it’s an old study. The test-optional colleges have found they can build solid classes and see equal levels of four-year performance and grad rates. Freshman GPA, usually mostly cores or gen eds, pales in comparison.