Test Optional

Not sure I agree with the comparison of the tests to a track event to gauge objectivity. In all the same ways the test can be subjective (practice tests, coaching, tutoring, prep time, home environment, family values toward taking the test, etc.), so could the shot put results (better coaching, more practice, better training, nutrition). If we view things that way, I guess nothing is objective. The objectivity of the test (not its intended purpose, result, or forecast) is simply the score. Everyone takes the same test. Everyone throws the same shot put in the same arena. The results, regardless of background, training, and prep, are the results for that moment in time. They are simply a single data point.

Moment in time, yes. But also representative of the effort a kid puts into those tests.

Yes, some kids ace the tests without prep and are somehow proud to say so. But across the country, success on the tests is about, or at least starts with, focus on them. And this effort, this conformity, has been a standard. Top colleges are looking for that conformity: good grades, the right challenges, a sort of drive and thinking, followed through on, in clear evidence. And more.

A “type.” The type driven to do well, in all aspects, in the ways their targets want. Now, the difference is that some colleges feel they can find the types they want without looking at scores during the review.

Remember, no matter how hierarchical high school is (top this, top that), so many colleges are now holistic. They feel they can filter without the SAT/ACT scores. Try to remember the many critical data points that still leaves.

It’s not just about scores.

SAT scores are the best predictor of college success for a freshman class. The correlation between average SAT score and graduation rate for freshman classes is very high, about +.8. Test-optional colleges tend, I believe, to be schools in decline in terms of selectivity and appeal. Dropping the SAT requirement makes no sense except that the school is more likely to attract applicants who don’t do well on the SAT.

@collegehelp Would you elaborate on the correlation you stated above? Does this apply to the likelihood of an individual student graduating, or does this apply to the average SAT of the freshman class and the school’s graduation rate?

College Board published the validation study comparing scores on the Redesigned SAT and freshman grades only last summer. There are no studies on graduation rates and the Redesigned SAT. The first class to take the new test, which debuted in March 2016, was high school class of 2017/college class of 2021.

dadof2d-
The correlation between the average math/verbal SAT for freshman classes entering in 2011 and their graduation rate in 2017, after 6 years, is +.83 (very high). This is based on 1124 colleges and universities in the US Dept. of Education IPEDS database. It applies to freshman classes, not individual students. It is harder to predict outcomes for individual students based on SATs because of individual variability. What this means is that you can be very confident that more students will graduate from a freshman class if schools select students with higher SAT scores, but it is harder to know which specific students will be the graduates. Nevertheless, more students will graduate. It is just harder to know exactly who. It has to do with what is called the “unit of analysis” in statistics.
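The unit-of-analysis point can be sketched in a short simulation. Everything here is hypothetical (the score ranges, the made-up graduation probability); the only purpose is to show that the same underlying link between scores and graduating looks much stronger at the class level than at the individual level, because averaging over a freshman class washes out individual variability.

```python
import random

random.seed(0)


def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


indiv_sat, indiv_grad = [], []   # one entry per simulated student
class_sat, class_rate = [], []   # one entry per simulated college

for _ in range(200):                      # 200 hypothetical colleges
    base = random.uniform(900, 1500)      # that college's typical SAT
    sats, grads = [], []
    for _ in range(500):                  # 500 students per class
        sat = base + random.gauss(0, 100)           # individual spread
        p = min(1.0, max(0.0, (sat - 700) / 1000))  # made-up grad prob.
        grads.append(1 if random.random() < p else 0)
        sats.append(sat)
    indiv_sat += sats
    indiv_grad += grads
    class_sat.append(sum(sats) / len(sats))
    class_rate.append(sum(grads) / len(grads))

print("individual-level r:", round(corr(indiv_sat, indiv_grad), 2))
print("class-level r:     ", round(corr(class_sat, class_rate), 2))
```

The class-level correlation comes out near 1 while the individual-level one is far weaker, even though both are generated by exactly the same rule.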

@collegehelp Thank you for clarifying. Have you looked at the correlation between family income and graduation rates? Of course it is also generally believed that there is a correlation between income and SAT scores, but I am curious how much of the correlation between SAT scores and graduation rates is actually due to family income. Some kids drop out for financial reasons, at a rate that probably decreases with income level.

The correlation between SAT scores and graduation rates is not attributable to financial issues. Costs are much less at public institutions but the correlation between SAT and grad rates at public institutions is still +.81 (very high). The correlation is almost exactly the same at public and private schools. This suggests that the reason is most likely student ability and motivation rather than finances.

^Yeah, I don’t understand how you can separate “motivation” from not having enough money to eat properly; it’s a phenomenon that is common across all institutional types.

@collegehelp

If one uses multivariate analysis with standardized test scores and GPA scores OVER TIME, there are some very important observations to be made. In this model, one assumes that the post-matriculation university GPA is the dependent variable to be tested against the secondary school GPA and standardized test scores. Whatever variables one chooses to predict the students’ university GPA, only variables that are independent of each other, i.e., not collinear, are kept. This analysis makes simple correlations appear very primitive, although they are still widely used in this business…
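A minimal sketch of that kind of model, on simulated data: university GPA regressed on secondary-school GPA and an SAT score, with a crude collinearity screen before fitting. The coefficients, spreads, and the 0.9 threshold are illustrative assumptions, not figures from the poster's actual study.

```python
import random

random.seed(1)


def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


n = 1000
hs_gpa = [random.gauss(3.5, 0.3) for _ in range(n)]
sat = [random.gauss(1300, 120) for _ in range(n)]
# "True" model used only to generate the fake data:
uni_gpa = [0.5 + 0.6 * g + 0.0005 * s + random.gauss(0, 0.3)
           for g, s in zip(hs_gpa, sat)]

# Collinearity screen: if the two predictors move together too
# closely, one of them should be dropped before fitting.
assert abs(corr(hs_gpa, sat)) < 0.9, "predictors collinear; drop one"

# Two-predictor OLS, solved directly from the centered normal equations.
mg, ms, mu = sum(hs_gpa) / n, sum(sat) / n, sum(uni_gpa) / n
g = [x - mg for x in hs_gpa]
s = [x - ms for x in sat]
u = [x - mu for x in uni_gpa]
Sgg = sum(a * a for a in g)
Sss = sum(a * a for a in s)
Sgs = sum(a * b for a, b in zip(g, s))
Sgu = sum(a * b for a, b in zip(g, u))
Ssu = sum(a * b for a, b in zip(s, u))
det = Sgg * Sss - Sgs * Sgs
b_gpa = (Sss * Sgu - Sgs * Ssu) / det   # coefficient on HS GPA
b_sat = (Sgg * Ssu - Sgs * Sgu) / det   # coefficient on SAT

print("HS GPA coefficient:", round(b_gpa, 2))
print("SAT coefficient:   ", round(b_sat, 5))
```

With enough data the fitted coefficients recover the generating values, which is all a sketch like this can show; the interesting empirical question is what happens to those coefficients in later college years, as described below.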

The fun starts when comparisons are made over four years of college matriculation.

Some observations on studies I did years ago at a STEM university:
1. Variances in secondary school GPAs are far better predictors of university GPAs than are variances in standardized secondary school test scores;
2. The strength of the relationships diminished appreciably from the Freshman to the Sophomore years;
3. By the third year, the statistical relationships between the dependent and the independent variables were actually insignificant.

Possible Observation:
It appears that the students’ academic performance in these test groups had changed significantly over the first two years of studies… WHY?

Test group background notes: the test score and GPA ranges were not dramatically wide. They were mostly honors secondary school students with math SAT scores generally within a 100-point range of variance. The shorter the range of variation of entering test scores and GPAs, the more likely the above college results are to appear.
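That last point, range restriction, is easy to illustrate with made-up data: the same underlying SAT-GPA relationship, measured once over a wide applicant pool and once over a group confined to a narrow 100-point band, yields a much weaker observed correlation in the narrow band. The specific numbers below are assumptions for the demonstration only.

```python
import random

random.seed(2)


def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


# Wide pool with a fixed (made-up) linear link from SAT to college GPA.
pool_sat, pool_gpa = [], []
for _ in range(5000):
    sat = random.gauss(1200, 150)
    gpa = 1.0 + 0.002 * sat + random.gauss(0, 0.3)
    pool_sat.append(sat)
    pool_gpa.append(gpa)

# Restrict to an honors-like band: SAT within a 100-point window.
band = [(s, g) for s, g in zip(pool_sat, pool_gpa) if 1250 <= s <= 1350]
band_sat = [s for s, _ in band]
band_gpa = [g for _, g in band]

print("full-range r: ", round(corr(pool_sat, pool_gpa), 2))
print("narrow-band r:", round(corr(band_sat, band_gpa), 2))
```

Nothing about the relationship changed between the two measurements; only the spread of the predictor did, which is why correlations measured inside an already-selected honors group can look surprisingly weak.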

Why make a big deal over selecting an 800 SAT score over a 700 SAT score when there are other significant factors involved? I sometimes think these scores become a convenient reason for admissions offices to justify a selection.

It is also known that standardized tests, on average, are higher for higher income groups. Test scores can function as a proxy for economic resources in large groupings. Lack of economic resources may be behind the first year college dropouts.

^^Yes

You could conceivably see average scores of 1600/36 among applicants but you wouldn’t see a rise in scores among matriculants unless you could somehow convince high scoring kids who were previously turning down the TO schools for higher ranked schools to say yes to the TO schools. But then having a higher reported applicant score range is not suddenly going to make a school more attractive to students.

In fact, more kids with borderline scores choosing not to report them might lead to higher average scores among applicants but lower average scores among matriculants because the school would no longer use scores to differentiate between a larger pool of students, resulting in the admissions of a larger number of low-scoring students.

But which test optional colleges report scores for all matriculants? As far as I’ve seen, Bowdoin does, but Bates doesn’t.

@collegehelp, what is your evidence for this claim?

Are you saying that the study compared graduation rates across colleges based solely on the colleges’ average standardized test scores? I certainly hope that wasn’t the methodology. All that would do is point out the obvious, that less competitive schools are more poorly resourced than more competitive schools.

Do we really think the main reason kids at Harvard graduate at a higher rate than kids at U Mass is a difference in standardized test scores? (Hint: Harvard covers 100% of financial aid need, U Mass does not.)

WPI is a “test optional” university which reports the submitted scores of all matriculants. Students may elect to submit SAT or ACT scores or to submit neither. However, they are also encouraged to submit other evidence of their special talents (e.g., research, recording of your award winning musical talent, distinguishing evidence of your leadership skills, your recently applied for patent, your world class language skills, etc.)

The latest available data reports that 70% of matriculating students submitted SAT scores while an additional 25% submitted ACT scores ergo 5% of the matriculating students did not submit any standardized test scores. I wonder what this 5% had accomplished outside of GPA and standardized test scores.

But some matriculants may have submitted both an ACT and an SAT, meaning greater than 5% did not submit any test score.
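The overlap arithmetic is just inclusion-exclusion. The 70%/25% figures are from the CDS data quoted above; the overlap values below are hypothetical, since the CDS does not report how many students submitted both tests.

```python
def pct_no_scores(pct_sat, pct_act, pct_both):
    """Share submitting neither test, by inclusion-exclusion:
    100 - (SAT submitters + ACT submitters - both)."""
    return 100 - (pct_sat + pct_act - pct_both)


print(pct_no_scores(70, 25, 0))   # -> 5, if nobody submitted both
print(pct_no_scores(70, 25, 4))   # -> 9, if a hypothetical 4% submitted both
```

So the published 5% is a floor on the share of matriculants with no scores at all; every percentage point of overlap raises it by one.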

The CDS also does not give us visibility into how many applicants applied TO…I don’t know that WPI regularly reports that number, but here they say 30% of women and 22% of URMs don’t submit test scores (article a couple of years old) https://www.insidehighered.com/admissions/article/2017/08/21/wpi-sees-notable-gains-female-enrollment-after-shift-use-non-need

When WPI went TO, their stated reasoning was to increase number of women, URMs and low SES students in their classes…and that has happened with the number of women per class in the mid-40%s now.

@MWfan1921
“Students may elect to submit SAT or ACT scores…,”

Based on my understanding of “or” they elect to submit one or the other, but this may not be the case as this is the English language and not a Venn diagram.

The data I reported above was taken directly from section C9 of the WPI website CDS report for 2018. See https://public.tableau.com/profile/wpi.institutional.research#!/vizhome/WPICommonDataSet/CommonDataSet.

Section C8 of the CDS guidelines does not state the option of submitting both sets of test scores. If one follows the language of the boxes listed, it appears that the applicant has the choice of selecting one or the other, as there are no “and” options. It is not likely that applicants would be penalized for submitting both, but it is possible that the system only reviews the SAT or the ACT if both are submitted.

This probably illustrates why I preferred the seeming precision of algebra over the higher degrees of freedom found in the literary world. Precision does count if one wants the package to arrive safely on its celestial target. Look what lawyers can accomplish with the wiggle room!