NYT: Tripped up by New Writing Section on SAT

<p>Xiggi notes, "Also, you are absolutely incorrect that the test should NOT be about who can think 'quickest.' The test is ALL about reasoning under time constraints. Extending the 30 minutes to answer a section of 25 questions to 35, 40, or 45 minutes would simply render the test trivial, and make it an absolutely worthless measurement for students who do not suffer from a handicap."</p>

<p>Response: Ok Xiggi, let's not call it the SAT. Let's call it the Fairer Better Predictor Test (FBPT). Do you like that name better? Tests that require quick reactions of mind or body, in my opinion, do not correlate well with college success. I have seen the College Board statistics on the SAT vs. GPA correlation, and the correlation is usually somewhere between .5 and .6, which isn't much of a correlation. Check out the College Board stats for yourself.</p>
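<p>One way to read a correlation in the .5 to .6 range is through the share of variance it explains (r squared). The sketch below is purely illustrative arithmetic, not College Board data:</p>

```python
# Illustration only: a correlation r between SAT scores and GPA
# explains roughly r**2 of the variance in GPA. The values below
# are the range quoted in the post, not actual College Board figures.
for r in (0.5, 0.55, 0.6):
    print(f"r = {r:.2f} -> variance explained = {r**2:.0%}")
```

<p>So even at the top of the quoted range, roughly two-thirds of the variation in GPA is left unexplained, which is the point being made.</p>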

<p>Taxguy, feel free to reinvent the wheel ... or start a debate about what the SAT should be or not be. All I know is that the SAT is still the standard, and until someone develops a better mousetrap, we'll have to deal with it. For the record, regarding your request for more time, please note that the only competition that has sprung up to "improve" the SAT is the ACT. Were you to research the mechanics of both tests, you'd discover that TIME is an even LARGER constraint on the ACT. The questions are more basic, but you have less time to answer them. </p>

<p>Like it or not, the tests are CALIBRATED according to several elements and adequately balance difficulty and time. You simply cannot change one without having an effect on the other. </p>

<p>As far as the correlations, why don't we let the various factions draw their own conclusions? On one side, you have the "experts" at organizations such as FairTest and the schools that have decided to drop the SAT or ACT. On the other side, you have the schools that continue to use the tests as a COMPONENT of the admission decisions. The numbers speak for themselves. </p>

<p>In the meantime, there is not much--if anything--you or I can do, except try to find ways to beat the test by maximizing the possible scores. There are plenty of ways to improve the quickness of thoughtful students, and the improvement will most definitely pay dividends in college; something that is far more important than the SAT in isolation.</p>

<p>
[quote]
All I know is that the SAT is still the standard, and until someone develops a better mousetrap, we'll have to deal with it.

[/quote]
</p>

<p>ziggi, this thread started (oh so long ago) re the WRITING component of the SAT, which as you know is brand new and experimental. As some have pointed out, most colleges are taking a wait-and-see attitude. One complaint I've heard over and over from faculty and admins is that a timed writing test is irrelevant in the context of their curriculum. Professors of English in particular encourage prewriting, writing, reader input, reflection, and rewriting -- and regard the 25-minute SAT writing component as pretty absurd. Its only redeeming virtue is that it provides a proctored (okay, I think the British "moderated" sounds less like an embarrassing physical exam, but I bow to you Yanks) writing sample with all its faults.</p>

<p>No way can you legitimately call this writing component
"standard."</p>

<p>Perhaps a college might look at the writing section as exactly what it is: a 25-minute sample of writing. Maybe that’s meaningful for some; for others, not. Colleges can do what they want with that data. What I’d like to know is how they view an outstanding application essay from a student who gets, say, two 3’s on his writing sample, a 480 on the overall writing test. Wouldn’t something like that cast suspicion?</p>

<p>
[quote]
What I’d like to know is how they view an outstanding application essay from a student who gets, say, two 3’s on his writing sample, a 480 on the overall writing test. Wouldn’t something like that cast suspicion?

[/quote]
</p>

<p>Probably. I'd hope in that case the colleges would look deeper, at GPA, perhaps a graded essay from an English class. It is perfectly possible for a fine writer to flub a timed essay test, though I'd wonder about such a low score on the multiple-choice grammar section.</p>

<p>It happens that my D did well on the writing component of the SAT (740) and that she is also an excellent and thoughtful writer (IMHO, but I have been a JC writing teacher so I've seen the spectrum). The two top colleges that accepted her had both requested a HS graded essay, and both said they had read her timed SAT essay -- not just looked at her scores. Both colleges have expressed doubts about the value of the SAT Writing component, but look at it as an extra bit of info.</p>

<p>Like many others, I'm leery of placing too much weight on the SAT. In the UC system, which is very numbers-driven, we do give it far too much weight. I have no problem with privates considering SAT scores as one small part of the picture.</p>

<p>The instructions to readers, to look for "clear and consistent mastery" in areas like critical thinking while ignoring misspellings and other writing flaws, encourage subjective grading. They also reinforce what many see as weaknesses of the current approach to writing, which de-emphasizes attention to basics in favor of self-expression. </p>

<p>I hope colleges do get to read the essays. They may have totally different reactions to them; I certainly reacted differently to the three supposedly exemplary excerpts published with the article.
I'd like to know how one gets from scientists to Andrew Lloyd Webber...</p>

<p>Zelloguy, there is no need to play games of semantics. For the record, I am no fan of the writing component of the new SAT, for a number of reasons. Does it make a difference to label the SAT Writing a standardized test or the "new" standard? In the meantime, the SAT remains the standard, as schools decide how best to utilize the new section. On this issue, it is best to direct expressions of disappointment over this boondoggle to the "genius" at the UC system, who was masterfully outfoxed by a very smart Gaston Caperton. </p>

<p>Lastly, it is obvious that I addressed the issue of time on the NON-WRITING sections of the SAT. Again, for the essay, it makes no difference if more time were given ... the expected quality of the writing would simply be elevated, and the grading would be tougher. The points I made about the other sections remain valid.</p>

<p>
[quote]
Again, for the essay, it makes no difference if more time would be given ... the expected quality of the writing would simply be elevated, and the grading would be tougher.

[/quote]

You may be right that the grading would be tougher. I just cringe at the thought that the excerpts published in the NYT are considered worthy of a perfect score.</p>

<p>Ziggi, games of semantics aren't my style. What you call "standard" and I call "experimental" nicely illustrates our two different approaches to the problem. You accept the SAT as a given and are concerned with how to "beat it," as a student/consumer, right? Very practical of you. When my 9th-grader's turn comes, I'll likely shift my focus too. But right now I'm taking the English Dept. POV, assessing the validity and usefulness of the Writing component in particular. I've heard/read faculty opinions. Here on LJ I'm listening carefully to parents and students.</p>

<p>Marite, while the essays that earned a perfect score might make you cringe, the essays that earn an average score tell an even more disturbing story. One, alas, that simply confirms that the average high schooler is incapable of writing an intelligible essay, let alone a cogent one. One can imagine the disaster that tougher grading would create. The correct assessment of the average writer graduating from high school was obscured when the test was a Subject Test, as the students who took it were a self-selecting cohort. Now that most everyone has to take a writing test, there is no escaping or hiding. </p>

<p>As unpleasant as it may seem, the fact that students do not write enough in high school--and that their writing is not graded enough--is the main cause of the problem highlighted by the SAT Writing. That, and the insistence of many, many parents and teachers on excusing this poor performance. </p>

<p>There aren't enough John Stossels to point accusatory fingers! :D</p>

<p>"I just cringe at the thought that the excerpts published in the NYT are considered worthy of a perfect score."</p>

<p>From the NYT:
[quote]
Last week, when the board released 20 top-scoring essays, all on the topic of whether memories are a help or a hindrance, it was impossible not to notice that many were — what’s the right word? — awkward:</p>

<p>“Memory is often the deciding factor between humans and animals,” one started.</p>

<p>“It is a commonly cited and often clichéd adage that people learn from their mistakes,” wrote another.</p>

<p>“We reason only with information, that is, reason is the mortar that arranges & connects pieces of information into the palace of understanding,” said a third.

[/quote]
</p>

<p>Cringing here, too. My DC, whose writing has been published and recognized with several awards, got a 10 essay score (and still an 800 on the Writing part of the SAT). DC's essay had a sophisticated analysis for a two page essay and was not awkward. By rewarding awkward or overblown prose with the highest scores, the CB is not advancing the cause of clear, effective writing and critical thinking.</p>

<p>Again, you may be right about the poor quality of the other essays. I worry, however, about the emphasis on critical thinking at the expense of proper spelling, punctuation, and other basic aspects of writing, especially since critical thinking is not the predominant impression the excerpts convey. Most of all, I worry about what message these supposedly perfect essays convey to prospective essay writers.</p>

<p>Zelloguy, we have debated the validity of the writing section ad nauseam, and I have posted my opinions about it many, many times. Also, one can be pragmatic when facing a new situation and still research it with great concern. One can pay close attention to the writing of Les Perelman, read the ASU listserv to gauge the opinions of English department members, and simply evaluate what needs to be done to earn a good score on the new test, including learning in great detail how the test is scored. </p>

<p>One can also evaluate the depth of the chasm between what a high school believes good writing should be and what the faculty in college think it should be. The fact that anyone might expect a 25-minute exercise in futility to bridge that gap is simply mind-boggling. Like the other components of the SAT, the test helps ascertain the symptoms that plague our education system, but it was never meant to offer solutions.</p>

<p>
[quote]
Marite, while the essays that obtained a perfect might make you cringe, the essays that earn an average score tell an even more disturbing story.

[/quote]
</p>

<p>My contention is that some of the essays that earn less than perfect scores are in fact better essays, and that the scoring is shaping students toward poorer writing.</p>

<p>
[quote]
I worry, however, about the emphasis on critical thinking at the expense proper spelling, punctuation, and other basic aspects of writing, especially since critical thinking is not the predominant impression the excerpts convey. Most of all, I worry about what message these supposedly perfect essays convey to prospective essay writers.

[/quote]
</p>

<p>Marite, this is what I wrote in post number 20:</p>

<p>"This is not very different from spelling, which I also believe should be part of the grading process. In my opinion, a handwritten test should be graded on all its components, and that should include spelling, presentation, correct argumentation, as well as ... the mechanics of writing itself."</p>

<p>Unfortunately, I believe that the SAT test writers had two options: the first, to develop a test based on the level expected in colleges; the second, based mostly on what happens in high schools. It is obvious that they opted for the high school version. The fact that so few perfect scores were earned is a testament to the growing lack of interest in the subject in our high schools, as well as to the level of competence of our teachers.</p>

<p>Ziggi, I agree with everything you just said -- edited to add everything you said in #53. No argument here. Your response sounds argumentative and I'm not sure why. I think we're all in favour of improving the teaching of writing in high school (and grade school for that matter). As a writing teacher and editor myself, I've always read my children's writing and tried to provide helpful feedback when asked for it. The three oldest are all excellent writers, but my youngest (just entering boarding school) is the most independent and has always resisted any help from mom. He has far to go, and his lack of skill in writing (despite phenomenal standardized test scores) is the main reason he is not attending our public high school. His new private school English teacher is a graduate of Bread Loaf and a published writer, as well as an experienced high school teacher, so I have great hopes.</p>

<p>I haven't read the whole thread but I am dubious about the basic premise that you can measure a student's quality of writing in a relatively brief, one-shot test scored against some standardized rubric. An instance where pursuit of "fairness" in multiple dimensions leads to worthless results, imo.</p>

<p>Xiggi:</p>

<p>I don't know what the readers' instructions are, but the message these excerpts send is wrong. It is likely to reinforce the perceived weaknesses of the high school curricula.</p>

<p>I agree with replies 50 and 54, but continue to believe that writing quality can be determined in the following way:</p>

<p>CB instructs Engl. teachers to keep some "best" copies of <em>in-class</em> student essays for each student. (Could be one or more examples during the academic year.) Upon request -- or at a scheduled time -- teacher sends photocopy of the best example of writing directly to CB, who files said copy for sending to colleges upon college/applicant request.</p>

<p>Teacher & student could perhaps consult with each other ahead of time & agree which essay to consider the official sample.</p>

<p>
[quote]
CB instructs Engl. teachers to keep some "best" copies of <em>in-class</em> student essays for each student. (Could be one or more examples during the academic year.) Upon request -- or at a scheduled time -- teacher sends photocopy of the best example of writing directly to CB, who files said copy for sending to colleges upon college/applicant request.

[/quote]
</p>

<p>"instructs"? Whoah! The College Board doesn't <em>instruct</em> anybody to do anything. It's a corporation, not a government agency :)</p>

<p>I guess the point of the SAT Writing component is to enforce standardization. Theoretically, everybody produces an essay under similarly ghastly conditions.</p>

<p>The current practice of submitting a graded HS writing sample doesn't meet that criterion, but seems good enough to be useful. It worked well for us. Yes, some students might submit work that had been heavily edited; that's a clear drawback. But college admissions people generally have good access to students' English teachers (most write recs), so it would be risky to cheat.</p>

<p>One drawback of your suggestion (aside from politics) is that in-class essays happen under very different conditions at different schools, so attempts at standardization might not be too successful. English teachers are a rebellious sort, you know. The advantages, of course, are that more time could be allotted than the 25-minute SAT and students could write many essays and choose their best.</p>

<p>I'm curious to see how admins would respond to your suggestion.</p>