"No, the SAT is not Required." More Colleges Join Test-Optional Train


<p>That’s why I wrote test-don’t-care in reference to the Texas public universities.</p>


<p>That chart of GPA and test scores is linked in my post. Non-impacted majors at non-impacted campuses admit students who make baseline eligibility. When there is impaction, higher standards are used (e.g. SJSU: <a href="http://info.sjsu.edu/static/admission/impaction.html">http://info.sjsu.edu/static/admission/impaction.html</a>) and test scores are needed for frosh applicants.</p>

<p>In any case, the non-impacted CSU campuses* are mostly not in the middle of a desert.</p>

<p>*They are Bakersfield, Channel Islands, Dominguez Hills, East Bay, Monterey Bay, and Stanislaus. The only impacted majors at these campuses are nursing, and business at East Bay.</p>

<p>In any case, the students admitted to non-impacted majors at non-impacted CSUs with GPA>3.0 and top 10% students admitted to Texas public universities likely far outnumber those admitted to Pitzer and other small test-optional LACs without test scores.</p>

<p>@voiceofreason66 US News penalizes test-optional schools as well. I don’t remember the exact extent of it, but if less than X% (70? idk) of admitted students at a given school submit SAT/ACT scores, US News will adjust the data accordingly.</p>

<p>@fallenchemist The average GPA data is flawed. On the surface the difference in GPA between non-submitters and submitters appears insignificant, but what the data does not show is what courses were taken to achieve those grades. In many high schools, grades are weighted so that a student who receives an A in PE is not ranked higher than a student who receives a B+ in AP Calculus. </p>

<p>The data in the referenced report does not distinguish what areas of study non-submitters entered versus submitters, nor the difficulty of the courses taken by each group. The referenced data has little to no value because there is no baseline. </p>

<p>What the study should have done was compare SAT submitters with non-submitters on the percentage who took a common course, such as Freshman Calculus, and the grade achieved in that course. My guess is the submitters would have a higher percentage taking Calculus and would achieve a higher grade than non-submitters.</p>
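<p>To make that concrete, here is a minimal sketch of the comparison being proposed. The data and column names are entirely hypothetical; it just shows the shape of the analysis (share of each group taking a common course, and the average grade earned in it):</p>

<pre><code>
# Hypothetical sketch of the submitter vs. non-submitter comparison proposed above.
import pandas as pd

# One row per student; calc_grade is in GPA points and is missing if the course wasn't taken.
students = pd.DataFrame({
    "submitted_sat":      [True, True, True, False, False, False],
    "took_freshman_calc": [True, True, False, True, False, False],
    "calc_grade":         [3.7,  3.0,  None,  2.3,  None,  None],
})

by_group = students.groupby("submitted_sat")
print(by_group["took_freshman_calc"].mean())  # share of each group that took the course
print(by_group["calc_grade"].mean())          # average grade among those who took it
</code></pre>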

<p>@SammyxB, that sounds like a nightmare for applicants. Most people only have to take the SAT or ACT maybe two or three times, but if every college had its own test, most kids would have to prepare for eight, ten, some even fifteen different tests while also writing essays, keeping their grades up, etc., and that doesn’t even account for the fact that some applicants might have to take a test a second time. There’s no way applicants would be able to perform their best on a test for which they were logistically unable to prepare to the best of their abilities. </p>

<p>8-10…? x_x Maybe it’s just me, but unless I was able to replicate myself, I’m not doing that absurd number of college apps; it would be a waste of time, investment, and, oh yeah, a lot of money.</p>

<p>I’d rather have a small list of schools I’m 100% sure I want to apply to, not just for the heck of it. That small number would include back-ups, safeties, etc. </p>

<p>“There’s no way applicants would be able to perform their best on a test for which they were logistically unable to prepare to the best of their abilities.”</p>

<p>I bet someone out there in this world could probably manage to do what is said to be impossible. And although having to do all that is crazy… who knows, some may have that much to do once in college. Besides, after applying to all those colleges, how many could you actually go to? Only one out of the many. This is just my own view, though.</p>

<p>@voiceofreason66‌ - they looked at every student in about 3 dozen schools. I find it extremely implausible that the results can be attributed to the college curriculum they chose. In other words, it is extremely unlikely that non-submitters routinely chose harder courses, or easier ones for that matter, than submitters. There were 123,000 students examined. Sorry, your explanation is exceedingly unlikely. BTW, these were not high school grades, they were the grades they achieved in college. So I don’t know why you bring up high school grades.</p>

<p>@International95 Here is a Times article written last fall; it does not mention any penalty for test-optional schools.</p>

<p><a href="http://ideas.time.com/2013/10/31/sat-optional-schools-not-such-a-great-idea-after-all/">SAT-Optional Schools Not Such a Great Option After All | TIME.com</a></p>

<p>It does mention that test optional status also helps those institutions receive more applications from low scoring students.</p>

<p>@fallenchemist You should read the following abstract from researchers Stephen Hsu and James Schombert, who wrote “Data Mining the University: College GPA Predictions from SAT Scores”:</p>

<p><a href="http://www.researchgate.net/publication/45912251_Data_Mining_the_University_College_GPA_Predictions_from_SAT_Scores">http://www.researchgate.net/publication/45912251_Data_Mining_the_University_College_GPA_Predictions_from_SAT_Scores</a></p>

<p>@SammyxB - Perhaps you would only apply to 3 or 4 schools, but the fact of the matter is that most people today apply to more, at least the higher-achieving students who are looking at having (and needing) many choices due to uncertainty about admissions, uncertainty about financial aid packages, and their own uncertainty at the time as to their preferred school. It is completely infeasible to implement what you suggest.</p>

<p>@SammyxB, 8 is probably toward the lower end of the number of schools most kids apply to. The New York Times reported that the average student applies to 9 schools. Even if you don’t buy that, right now applicants only need to study for two tests at most: the SAT and the ACT. Most pick only one and focus on it. I’ve never met anyone who only applied to two schools. Colleges administering their own tests would mean students would have to study for more tests than they do now. Sure, a few kids would have the ability to perform extremely well; I’m not at all saying that’s impossible. But what I’m saying is that forcing kids to study for nine DIFFERENT standardized tests at one time would diminish their ability to prepare as thoroughly as they now can for just the SAT or ACT. </p>

<p>On the topic of superscoring, I see nothing wrong with it. Why not give kids the benefit of the doubt? As for test-optional schools, it is simply a ploy to attract more applications and to bolster rankings by reporting only the standardized test scores of those who chose to submit them (which will obviously be higher than those of the entire pool). Standardized tests definitely are not perfect, and they can be studied for meticulously; however, there needs to be a way for schools to understand kids from different backgrounds. High schools are very different, and grades are much easier or much harder to come by depending on the school. For those who claim that standardized tests discriminate against kids of different socioeconomic status, the answer is simple: just view a kid’s scores in the context of his or her background, which schools already do!</p>
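<p>For anyone unfamiliar with the mechanics, superscoring simply means taking a student’s best score on each section across all test dates and summing those maxima into a composite. A tiny sketch with made-up numbers:</p>

<pre><code>
# Superscoring: best score per section across sittings, then sum the maxima.
# All scores below are made up for illustration.
sittings = [
    {"reading": 650, "math": 700, "writing": 640},  # first test date
    {"reading": 700, "math": 680, "writing": 660},  # second test date
]

superscore = {section: max(s[section] for s in sittings) for section in sittings[0]}
print(superscore)                # {'reading': 700, 'math': 700, 'writing': 660}
print(sum(superscore.values()))  # 2060 composite vs. 1990 or 2040 from a single sitting
</code></pre>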

<p>@voiceofreason66‌ - So? That is at one school only, so I could question the ability to generalize the result, but it really doesn’t matter. I see nothing there that contradicts what I said or supports what you said. Just the opposite. They don’t say a word about the magnitude of the cognitive effect in those few majors, at least in the abstract. And besides, they don’t separate test submitters from test non-submitters (because there were no non-submitters) so they didn’t study the same thing.</p>

<p>@voiceofreason66 Instead of relying on secondary sources and arguing that “since X article (who died and made Times king?) doesn’t say anything about the penalty, it must mean that it doesn’t exist”, why not look at this:</p>

<p><a href="http://www.usnews.com/education/best-colleges/articles/2013/09/09/frequently-asked-questions-2014-best-colleges-rankings#1">http://www.usnews.com/education/best-colleges/articles/2013/09/09/frequently-asked-questions-2014-best-colleges-rankings#1</a></p>

<p>“This practice is not new; since the 1997 rankings, we have discounted the value of such schools’ reported scores in the ranking model, since the effect of leaving students out could be that lower scores are omitted.”</p>

<p>Of course there are schools desperate for attention, from both applicants and US News, whether or not they are test optional. </p>

<p>The point is there is apparent equity in college performance among submitters and non-submitters at the TO schools studied. The argument that TO is, across the board, all about US News is old and dried up. It suggests a misunderstanding of how colleges (the better privates, not those trying whatever they can to keep beds filled) identify their self-images and values, establish and grow their in-house strengths, and review apps for the kids who will fit and thrive. And how they measure their success at that. Plus how they make significant decisions about their operations. </p>

<p>Completely agree, @lookingforward. People who always take that line basically exclude the possibility that a school can make any decision regarding admissions, whether it be going test optional, superscoring, marketing to increase awareness, or wait-listing students, because it thinks it is the right thing for the school rather than as some gimmick for its USNWR ranking. I could make a snide remark about conspiracy theorists, but suffice it to say that while we know some shenanigans do happen in trying to manipulate the rankings, they are most often the work of a misguided individual, such as the Clemson administrator who gave terrible peer assessments to competing schools, or the people who falsified test score reports or other data, as at Emory.</p>

<p>I think the fact that Bates has been TO since 1985 speaks volumes. Bowdoin is also test optional. Neither are slacker schools nor are they trying to fill beds. No one has mentioned that the SAT favors kids from wealthy families and also kids whose parents are better educated. It stands to reason that a kid from a wealthier family could afford an expensive SAT prep course. Yes, there’s the argument that the student could study on their own. One of my kids did that - he’s very disciplined. My other son did not and would not. So much of that depends on the teen and what motivates them. No way is perfect for every person or institution.</p>

<p>Apologizing in advance for a long post, but this has been on my mind for a long time.</p>

<p>I often read here on CC that test-optional policies are simply transparent efforts by schools to boost their USNWR rankings. The thinking goes that by allowing applicants not to report weak scores schools are able to report a higher average SAT/ACT score, thus improving their ratings on that subset of the USNWR rankings, and as a result, their overall rankings.</p>

<p>Let’s test that assumption. </p>

<p>If test-optional policies do not affect the quality of the incoming class, then it’s hard to quibble with them, because it means the test-optional schools are right: they are able to choose a high-quality class without the use of standardized test scores. But let’s assume that schools are simply trying to boost their USNWR rankings. What would we expect to suffer?</p>

<p>Certainly graduation rates. After all, if the non-submitters are less capable they should be doing more poorly in their classes. Considering that low-income and minority candidates tend to be over-represented in the ranks of non-submitters you would expect this to be a particularly stark contrast as these groups tend to have lower graduation rates to begin with.</p>

<p>USNWR weights the following factors (in percentages):</p>

<p>graduation rate 18</p>

<p>graduation rate performance (how much a school over or underperforms relative to expected grad rate) 7.5</p>

<p>freshman retention 4.5</p>

<p>total: 30 percent of overall ranking</p>

<p>One would also expect a school’s peer rankings to suffer, although perhaps to a lesser extent. The CC sages can’t be the only ones to recognize the test-optional schools’ Machiavellian scheme to boost their ratings, right? So that’s:</p>

<p>peer rating 15 </p>

<p>high school counselor 7.5</p>

<p>total 22.5 percent of overall ranking</p>

<p>Conversely, the rankings that could conceivably be boosted by allowing applicants not to submit scores:</p>

<p>SAT/ACT 8.125.</p>

<p>acceptance rate (assuming an increase in applicants) 2.25 </p>

<p>total: 10.375 percent of overall ranking</p>

<p>So in order to improve their performance on 10.375 percent of the weighted ranking we’re positing that schools are willing to risk between 30 and 52.5 percent of their ranking. Does this make sense?</p>
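<p>For anyone who wants to double-check that arithmetic, the weights quoted above add up as follows (a trivial tally, nothing more):</p>

<pre><code>
# USNWR weights cited above, grouped by whether a cynical test-optional policy
# would put them at risk or could plausibly boost them.
at_risk = {
    "graduation rate": 18,
    "graduation rate performance": 7.5,
    "freshman retention": 4.5,
    "peer rating": 15,
    "high school counselor": 7.5,
}
could_be_boosted = {
    "SAT/ACT": 8.125,
    "acceptance rate": 2.25,
}

print(sum(at_risk.values()))           # 52.5 (30.0 counting only the outcome measures)
print(sum(could_be_boosted.values()))  # 10.375
</code></pre>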

<p><a href="http://www.usnews.com/education/best-colleges/articles/2013/09/09/best-colleges-ranking-criteria-and-weights">http://www.usnews.com/education/best-colleges/articles/2013/09/09/best-colleges-ranking-criteria-and-weights</a>
*Note. The table is a bit hard to read, at least on my computer, as the columns are shifted. The numbers do add up but aren’t properly aligned.</p>

<p>Aside from the USNWR rankings, we would expect to see an impact on other outcome measures, such as Ph.D. production and graduate school acceptance rates.</p>

<p>Frankly, if a school wanted to boost its rankings it would be a whole lot easier to loosen its graduation requirements. Has Bates done this? Let’s compare the GPA requirements for graduation at Bates with those for Harvard (not picking on Harvard, it’s just the first school I checked).</p>

<p>Bates
Each candidate for graduation must complete the following requirements:</p>

<ol>
<li>Either (a) thirty-four course credits, two of which must be Short Term course credits, and sixty-eight quality points. No more than two Short Term courses may be applied toward the thirty-four course credit requirements; or, (b) thirty-three course credits, three of which must be Short Term courses, and sixty-six quality points. No more than three Short Term courses may be applied toward the thirty-three course credit requirement. Option (b) is available only to students who graduate in the three-year program. The following values are used in the computation of quality points:</li>
</ol>

<p>GPA Table
A+ = 4.0 B+ = 3.3 C+ = 2.3 D+ = 1.3 F = 0 ON = 0
A = 4.0 B = 3.0 C = 2.0 D = 1.0 F# = 0 W = 0
A- = 3.7 B- = 2.7 C- = 1.7 D- = 0.7 DEF = 0 P = 2</p>

<p>Harvard
All candidates for the Bachelor of Arts or the Bachelor of Science degree must pass 16.0 full courses and receive letter grades of C– or higher in at least 10.5 of them (at least 12.0 to be eligible for a degree with honors). The only non-letter grade that counts toward the requirement of 10.5 satisfactory letter-graded courses is Satisfactory (SAT); only one full senior tutorial course graded Satisfactory may be so counted. SAT grades are given to Freshman Seminars and certain tutorial courses.</p>

<p>So it seems that Bates requires a 2.0 average (68 quality points over 34 course credits works out to a 2.0), while Harvard requires that roughly 2/3 of grades (10.5 of 16 courses) be C-’s or better. Bates requires every student to complete a senior thesis and, like Harvard, has gen ed requirements, including lab sciences, writing, and math.</p>

<p>But perhaps Bates is simply getting away with giving everyone A’s to make sure they pass? Not according to UC Berkeley Law.<br>
<a href="http://talk.collegeconfidential.com/oberlin-college/934935-toughest-schools-to-get-an-a.html">Toughest Schools to Get an "A" - Oberlin College - College Confidential Forums</a></p>

<p>Bates was rated as hard a school to get an A at as MIT, and harder than Stanford, Yale, Brown, Columbia, Georgetown, Emory, Wash U., Northwestern, CMC, Berkeley, Vassar, Tufts, Reed, and a host of other schools.
The study is from 1997, hardly recent, but more than 10 years after Bates went test-optional. If anyone has more recent comparative stats on grades and grade inflation at various schools, I’d welcome them.</p>

<p>Fire away. ;)</p>

<p>@LAMuniv,</p>


<p>Small differences in standardized test scores are indeed meaningless, but there is a big difference in ability between a 25th percentile and 75th percentile test scorer.</p>


<p>And wealthy families can pay for a kid to take the SAT multiple times. This can particularly impact their scores at schools that superscore.</p>

<p>And poor kids can take the test for free. The College Board gives up to 2 fee waivers to take the SAT and up to 2 fee waivers to take the SAT subject tests.
<a href="http://professionals.collegeboard.com/testing/waivers/guidelines/sat?affiliateId=rdr&bannerId=satfeewaivers">K–12 Educators: SAT Fee Waivers – SAT Suite | College Board</a></p>

<p>For a poor kid, a high SAT score can be a big equalizer for competing against rich kids who go to $50k prep schools. Test prep books can be borrowed for free at libraries.</p>