For high-scoring kids looking for merit money, test optional schools are the way to go.
Where? The Common Data Set, for example, seems to make no particular provision as to how overlap scores (ACT/SAT) appear. If this is the case, then there’s no visible total percentage of those who submit at least one or the other that can be considered and compared.
@collegehelp how is this practice deceptive? If you can figure out that schools that require tests have a 15% higher graduation rate than those that are test optional, can’t other applicants?
“Where? The Common Data Set, for example, seems to make no particular provision as to how overlap scores (ACT/SAT) appear. If this is the case, then there’s no total percentage of those who submit at least one or the other that can be considered and compared.”
That is exactly the same at test-required schools, isn’t it?
Graduation rate is primarily a function of selectivity, rather than percent submitting test scores. Several studies have found similar graduation rates between test submitters and test non-submitters at test-optional colleges. For example, the Bates 25-year test-optional study found that test submitters averaged an 89% graduation rate. SAT I non-submitters also averaged an 89% graduation rate. Students who did not submit both SAT I and SAT II averaged an 88% graduation rate.
Yes, though stronger inferences can be made with this group. By looking at the combined ACT-SAT percentages, it should be apparent whether the figures are “natural” (i.e., they exceed 100%) or “controlled” (i.e., they total 100%).
“Yes, though stronger inferences can be made with this group. By looking at the figures, it should be apparent whether the figures are “natural” (i.e., they exceed 100%) or “controlled” (i.e., they total 100%).”
Forgive me, but this I do not understand. Can you explain?
As @merc81 points out, the common data set does not include a way to discern how many applicants included both an ACT score and an SAT score.
For instance, if 50% submitted only an ACT score and 50% submitted only an SAT score, the total submitting some score would be 100%. This would be reported on the common data set exactly the same way as the situation where 50% included both ACT and SAT scores, and 50% did not submit any score.
There really needs to be a field in the common data set for the number not submitting a score. Even better, include fields for those submitting ACT only, SAT only, both ACT & SAT, and for those not submitting any score.
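To put numbers on that ambiguity, here’s a quick hypothetical sketch (the percentages and the `cds_report` helper are made up for illustration; they’re not anything the CDS actually publishes). Two very different submission patterns come out identical:

```python
# Hypothetical illustration: two very different submission patterns that a
# CDS-style report would show identically, because it only asks for the
# percent submitting SAT and the percent submitting ACT.

def cds_report(sat_submitters, act_submitters, enrolled):
    """Return the two percentages a CDS-style report would show."""
    return {
        "% submitting SAT": round(100 * len(sat_submitters) / enrolled),
        "% submitting ACT": round(100 * len(act_submitters) / enrolled),
    }

enrolled = 100
# Scenario A: half submit only the SAT, half submit only the ACT.
scenario_a = cds_report(sat_submitters=range(0, 50),
                        act_submitters=range(50, 100), enrolled=enrolled)
# Scenario B: half submit BOTH tests, half submit nothing at all.
scenario_b = cds_report(sat_submitters=range(0, 50),
                        act_submitters=range(0, 50), enrolled=enrolled)

print(scenario_a)  # {'% submitting SAT': 50, '% submitting ACT': 50}
print(scenario_b)  # {'% submitting SAT': 50, '% submitting ACT': 50} -- indistinguishable
```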
If we know for sure that a school receives and reports all scores, then the combined total of the SAT and ACT will exceed 100% because we know that a measurable percentage of their students will have reported both scores. This is simple, and therefore in a sense “natural.”
However, some schools, even though they are not test optional, seem to take the overlap group and “apportion” it to one of the ACT or SAT subgroups. (I’ve seen Brown do this, for example.) In these cases, the figures will always add to 100% (or a rounded variant thereof). This involves an extra step, and is therefore in a sense “controlled.”
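For anyone who wants the arithmetic behind a “natural” total, here’s a tiny sketch with made-up percentages; the point is just that anything over 100% puts a floor under the overlap:

```python
# Made-up numbers: if a school reports 70% SAT and 45% ACT, the totals
# "naturally" exceed 100%, and the excess is the minimum share of students
# who must have submitted both tests.

def min_overlap(sat_pct, act_pct):
    """Smallest possible percent submitting both, given the reported shares."""
    return max(0, sat_pct + act_pct - 100)

print(min_overlap(70, 45))  # 15 -> at least 15% submitted both ("natural" total of 115%)
print(min_overlap(60, 40))  # 0  -> totals of exactly 100% look "controlled" (or apportioned)
```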
With test-optional schools, even these types of conclusions cannot be drawn without further inferences.
I hope that’s a little clearer, @Postmodern.
At least one of the schools my child considered provided the actual numbers of kids who submitted the SAT and those who submitted the ACT on its Common Data Set. It was test-optional, so I was curious to see how many enrolled students submitted scores (I don’t remember whether it said how many applicants submitted scores overall, but enrolled is a better number to know anyway). It was something like 610 out of 725, or that ballpark. Most.
I doubt many students send BOTH the ACT and the SAT to test-optional schools. And if so, that’s someone who really doesn’t know how to read a room.
One could reasonably expect that any school following that practice would drop the lowest scores from either side, thereby also misrepresenting score range. That is, if the reported SAT and ACT percentages totaled 115%, the 15% overlap group would end up being “apportioned” to whichever test was higher for each of them… and of course the scores dropped from that overlap/apportion group would always be the lower of each student’s two scores, nudging the reported ranges upward.
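If you want to see why that would skew the published ranges, here’s a rough, purely simulated sketch (made-up score distributions, nothing from any actual school): counting each dual submitter only under their higher test makes the lower scores vanish from the reported pool.

```python
import random
import statistics

# Purely simulated sketch (made-up distributions): students who submit both
# tests get counted only under whichever score is higher, so their lower
# scores never appear in the published ranges.

random.seed(0)
true_pairs = []
for _ in range(1000):
    base = random.gauss(50, 20)                      # underlying ability, percentile-ish
    sat = min(99, max(1, base + random.gauss(0, 8)))  # noisy SAT percentile
    act = min(99, max(1, base + random.gauss(0, 8)))  # noisy ACT percentile
    true_pairs.append((sat, act))

all_scores = [s for pair in true_pairs for s in pair]   # what full reporting would include
apportioned = [max(pair) for pair in true_pairs]        # only each student's higher score

print("median, all scores:      ", round(statistics.median(all_scores), 1))
print("median, apportioned only:", round(statistics.median(apportioned), 1))
# The apportioned median comes out higher -- the dropped lower scores shift the range upward.
```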
Now, my chance to ask: what’s this obsession with the CDS? It’s not policed. You can learn as much from many colleges’ other website info, including the freshman profile.
But it seems, in the eternal search for a formula, people cling to the CDS. As if. As if one thing is truly more important in decisions, or it matters who took the SAT vs. the ACT.
If a school is concerned about student success, then it should require test scores. They are the best predictor of student graduation rates. The correlation between average SAT and graduation rate is over +.8, which is extremely high. There is no reason to make scores optional unless it is to deceive the public and trick potential students into thinking that the school is more selective than it really is. High school grades are not standardized, and expectations vary from school district to school district. Requiring test scores is not an impediment to diversity. Colleges adjust admissions requirements for underrepresented groups.
Not true. (And I’d note that this literally took less than 1 second to Google, and was the very top result):
What Predicts College Completion? High School GPA Beats SAT Score
I’d add that SAT scores are known to correlate highly with family income, and financial pressures are probably a leading reason why students fail to finish within 6 years at the colleges where they start.
I agree @lookingforward. I don’t believe parsing the numbers that closely is helpful at all, although I understand the need to feel as if you have some control over the situation.
It certainly feels as if the trend is for more top 50 schools to go test-optional. I suspect it’s to highlight the “holistic” focus of their process, but I wouldn’t know for sure. The school where I work uses test scores. There was some talk a few years ago about superscoring but I think I remember that the consensus was that it would put more pressure on our applicants to take the test more than once, and we don’t have a rich student population who can afford to do so.
"With test optional schools, even these types of conclusions cannot be drawn without even further inferences.
I hope that’s a little clearer, Postmodern."
I understand what you are saying, I just don’t understand how it benefits anyone or indicates subversive intent on the part of a college. And there’s no way of determining the number of kids in the higher percentage who submitted both, so it is not measurable. There are a lot of assumptions in this thread, without evidence, about what is held back and how things are reported, but that goes for all schools, not just test-optional ones.
There are so many things we don’t know about the data in the CDS – which students below the 25th percentile are hooked, for example… you can’t know everything, and as I was taught very quickly after I first logged in here, there is no secret formula for holistic admissions.
Also, I haven’t seen a CDS where it came to exactly 100% – I checked a couple of Brown years and it was well above. Can someone point me to one? I am not doubting there is one, just curious to look at it.
Staying on topic, I still do not see how test optional schools are committing fraud in ways that test required schools are not.
@collegehelp how is this practice deceptive? If you can figure out that schools that require tests have a 15% higher graduation rate than those that are test optional, can’t other applicants?
Why? If both the ACT and SAT are of similar percentile rank, including both would be a benefit. There certainly wouldn’t be a reason not to include both.
It was strongly suggested at my kid’s private high school to pick one test or the other to take, lest you come off as a test-taking drone who could have better used his time.
I suspect that would go double at a test-optional school.
If this is deemed fraudulent, then most marketing campaigns are, and the free market system as we know it would collapse.