NYT: Revisiting the SAT: The Writing Section? Relax

<p>I've seen the SAT essay scoring guidelines that bluebayou posted, but they are so vague that I can't believe that is all that they give their graders. Surely they use some objective criteria.</p>

<p>Here's an idea:</p>

<p>How about you master what the SAT wants you to master, and then you can get a 12 on the essay and STOP complaining.</p>

<p>What a bag of sissies.</p>

<p>If the UCs take the essay score out of writing, what do they end up with? How do they calculate one's total score under the 2400 system once writing is added in?</p>

<p>sparetire, as bluebayou pointed out, the UCs do look at the Writing score; they just don't look at the essay itself.</p>

<p>The following is taken from the UC admissions site and states the policy outlined as of 2006 regarding the new SAT and the ACT. Of course, there is the caveat that current policy is subject to change "pending further research on the predictive validity of the tests":</p>

<p>
[quote]
The UC Academic Senate’s Board of Admissions and Relations with Schools (BOARS) has recommended that, pending future research on the predictive validity of the different exams, the three components of the new SAT I and the two additional SAT II Subject Tests be weighted equally in the eligibility index. UC will use a concordance table to equate the new SAT I with the ACT Assessment plus the new ACT Writing Test. A new eligibility index is anticipated in spring 2005 for students entering in fall 2006.</p>

<p>If a student takes the ACT or SAT more than once, will the University use the highest score?
Yes. The University uses the highest scores from a single testing administration.</p>

<p>Will UC see the SAT I essay? How will campuses use the new writing score?
UC has no plans to view the essays, only to use the SAT I scores.

[/quote]
</p>
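<p>For what it's worth, the "equally weighted" index described in the quote above is easy to picture. The sketch below is only an illustration of what equal weighting of the five test scores would mean - the function name and the 200-800 ranges are mine, and the real eligibility index involves more than just test scores (GPA, for one), so don't take this as UC's actual formula:</p>

[code]
# Illustrative sketch only: each of the three new SAT I sections (CR, Math,
# Writing) and the two SAT II Subject Tests counts the same. Names and score
# ranges are assumptions for this example, not UC's actual formula.

def eligibility_index(cr, math, writing, subject1, subject2):
    """Sum of five 200-800 section scores, each weighted equally."""
    scores = [cr, math, writing, subject1, subject2]
    for s in scores:
        if not 200 <= s <= 800:
            raise ValueError("SAT section scores range from 200 to 800")
    return sum(scores)

# Example: 650 CR, 700 Math, 680 Writing, 720 and 690 on two Subject Tests
print(eligibility_index(650, 700, 680, 720, 690))  # 3440 out of a possible 4000
[/code]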

<p>12 essay
490 mc</p>

<p>"some kind of checklist that graders are given to help them arrive at a score"</p>

<p>Yes, there is a checklist. Students have to address certain key points, and if they don't, no matter how articulate and wonderful and outside-the-box the essay may be, they won't get the max score.</p>

<p>There is some room for subjectivity as far as quality of writing goes, but the checklist is the top criterion.</p>

<p>"by following the instructions to a tee. I downloaded his first SAT essay and said, huh? I didn't think it was very good, but it followed the standard 5-par format exactly,"</p>

<p>That's it exactly. They're not looking for poignancy - they're looking for adherence to the format.</p>

<p>So if there's a checklist why don't we get to see it? The info they give the kids is pretty vague. It certainly doesn't say you need a five paragraph essay.</p>

<p>Wow, do colleges really question a student's integrity if his/her well-written college essay is accompanied by a low grade on the SAT essay? I got an 8 on the SAT essay and planned to "prove" to colleges that I am a good writer. All three English teachers who read my college essay were impressed. Hopefully, the admission committee will look beyond my essay score.</p>

<p>I haven't read this whole thread, but...my daughter got 800s on the writing both times she took the SAT 1. She thought it was far and away the easiest part of the test.
She has always been an excellent writer, with good grammar, etc., but where she shines is that she is a very quick, creative and clever writer. Her essays were not just clear and well written - they were highly amusing. I am sorry colleges didn't take this score into consideration because it really highlighted her strength. Her CR was excellent, too. Her math was not as good - all three scores were a real reflection of her abilities as a student (math is her weakest subject). Anyway, she got into her first choice, but I do wonder if some of the people saying how "terrible" the writing test is just are not naturally good creative writers, just as my daughter is not naturally a great math student. English was always easy for her, for example, but she struggled in calculus.</p>

<p>Got the vice versa here: DS got 800 on the math SAT 1 both times he took it; CR 690 the first time, 680 the second time. This is exactly what one would predict given his grades and test scores all the way from when he was a little guy. He doesn't like to "read critically", doesn't read much for pleasure, and math comes easy.... He writes well, though, and got a 750 on the Writing - which only one of his schools will even look at. The SAT has been a very accurate measure of my son's strengths, although I know this is not the case for all kids.</p>

<p>My understanding of the new writing component of the SAT is that it's nearly identical to the old Writing SAT II. That's why the Writing SAT II was eliminated when the new SAT was implemented. So why is it suddenly now an unreliable test? (I agree with Bethievt.)</p>

<p>I would assume that the colleges that used to require the old Writing SAT II would now be using the writing component in the same way. For the vast majority of colleges that didn't use it, the writing component provides one other way to show an applicant's strength, as in the case of Catherine's student.</p>

<p>It seems to me that the Writing component of the SAT provides a better idea of the writing ability of a college applicant than an over-edited college essay, especially in this day and age of highly paid college consultants and over-involved parents. (And I should know, being one of the latter, LOL.)</p>

<p>What does this say about me:</p>

<p>550 CR, 690/12 W
600 CR, 640/12 W?</p>

<p>I hope colleges see the essay we produced, even though I kind of doubt it.</p>

<p>:)</p>

<p>"So if there's a checklist why don't we get to see it?"</p>

<p>I work for a testing company, and that is the standard way to grade open-ended items (questions other than multiple choice). There is a checklist pertaining to the specific question, with certain points that need to be covered; covering a given number of those points corresponds to a certain score. I am assuming that the SAT essay is graded the same way, as this seems to be the industry standard.</p>

<p>glealdragon, according to all information available through the CB (including the educator guidebook "ScoreWrite"), as well as advice given by what one would hope are savvy essay readers, there is no checklist - other than the published rubric - used to grade the SAT writing section essay. OTOH, there is a plethora of prep books and websites out there that provide standard checklists for students who feel more comfortable with that type of test preparation.</p>

<p>On the SAT writing section:</p>

<p>
[quote]
The multiple-choice section will test grammar and usage. The questions will include three types: identifying sentence errors; improving sentences; and improving paragraphs. It will take 35 minutes and comprise about 70 percent of the writing score. Some educators liken this part of the writing section to an editing test that demonstrates an individual's ability to rewrite poorly written English.</p>

<p>Then the essay:</p>

<p>The essay, which will comprise about 30 percent of the writing score, will put far less emphasis on grammar and mechanics, and more on the student's ability to develop a point of view on an issue with appropriate reasons and examples. It will be similar to the kind of on-demand writing that is typically done in college classes and is read as a first draft. The student will be given 25 minutes to respond to an essay topic, also known as a prompt. The topic will be general enough to respond to without requiring advanced knowledge on any specific subject, but specific enough to make "canned" preparation impossible...</p>

<p>The best advice to students I have ever heard on the subject of the new SAT essay comes from Illinois high school English teacher Bernard Phelan, who was also a College Board Trustee, and is an SAT test developer and essay reader: "You determine the actual content of the essay. Skip any and all advice about the formulaic. Readers of your essay are not working with a checklist of traits, or a rubric. They are grading the essays holistically, that is, taking into account all aspects of the composition process together, rather than singling out traits in a checklist."

[/quote]
</p>

<p><a href="http://www.cobbk12.org/schools/hs/schcampbellhs/Announcements/new_sat_test.htm%5B/url%5D"&gt;http://www.cobbk12.org/schools/hs/schcampbellhs/Announcements/new_sat_test.htm&lt;/a&gt;&lt;/p>

<p>mathmom, you are right (and I would take a look at tokenadult's posts on the K-mart calculator thread): there is nothing in the rubric that bluebayou posted above to indicate that the 5-paragraph essay format is "the" one and only way to go.</p>

<p>The following NPR piece on "Grading the New SAT Essays" takes about 12 minutes and is worth a listen. It is an interview with Noreen Duncan, English professor, Mercer County Community College in West Windsor, N.J. She was a reader of College Board exam essays for 15 years and helped develop the SAT essay grading rubric.</p>

<p><a href="http://www.npr.org/templates/story/story.php?storyId=4529987%5B/url%5D"&gt;http://www.npr.org/templates/story/story.php?storyId=4529987&lt;/a&gt;&lt;/p>

<p>librarymom: I do agree that for all intents and purposes, the new SAT writing section is basically a re-working of the old SAT writing subject test (which, of course, is no longer administered for that reason). The old test was an hour long with a 20-minute essay; the new test is divided into two sections - the mc section is 35 minutes and the essay 25 minutes. The CB advised students to take the writing subject test, as well as other language subject tests, strategically, meaning that the test was most likely taken early in the senior year by those students who were applying to elite, highly selective schools. So one of the biggest changes is simply the fact that the writing test is no longer optional and is no longer aimed at those students whom we may assume were confident that they had a significant degree of competence, and even mastery, of written English.</p>

<p>That most colleges will not take the time to read all the essays is not surprising. The basic message of the OP article is to relax, because many colleges are not using the new writing section score as a "make or break" point. This does not necessarily signify that these colleges view the new writing section as unreliable, but rather that it is just that - new. The SAT essay will generate a great deal of data for the CB and colleges to deal with (or not). All the same, a healthy dose of skepticism regarding the impact of the test this early on is not out of order.</p>

<p>The following quote is taken from the "Taking Issue" piece "SAT: Writing, Yes; Test Essay" by Bruce J. Poch, written in March 2005.</p>

<p>
[quote]
It would be unfortunate indeed if this test should prove to be a disincentive for the nation's most selective and financially well-endowed institutions to enroll those students who are most underrepresented -- students from lower-income families.</p>

<p>At Pomona, we plan to do a couple of things with the new test. One will be to download the actual essay so that we may read it with a mind to applying our own standards to the work. Only after that reading will we look at the score assigned by the College Board. If their view is consistent with ours, confidence will be built. If not… well, I may write another editorial.</p>

<p>We also will have the chance to lay it side by side with the presumably more polished application essay. We expect there should be a similar voice. Where there is no apparent connection, we may question the ultimate authorship of the application essays.</p>

<p>For all my concerns about the rollout of the new exam, in the end, I am glad writing is being encouraged. If schools take this as a carrot to work with students to develop critical writing skills, the new SAT will be a success. But if it turns out to be a stick to further beat down underrepresented students, and those from underfunded, overpopulated schools, it will be a serious problem.

[/quote]
</p>

<p><a href="http://www.npr.org/templates/story/story.php?storyId=4527164%5B/url%5D"&gt;http://www.npr.org/templates/story/story.php?storyId=4527164&lt;/a&gt;&lt;/p>

<p>anxiousmom: "The SAT has been a very accurate measure of my son's strengths, although I know this is not the case for all kids." Kudos to you, because you and your son (and, deep down I hope, the SAT/ETS) must be doing something right.</p>

<p>I came across the following January 2005 article, which takes a look at a team of English professors and psychometricians as they labor over sample essays "to determine what kind of writing should be rewarded and what penalized":</p>

<p>"In an attempt to demystify the process, the College Board allowed a reporter to sit in on a pilot scoring session last week that is like the ones that will be used to train thousands of test scorers around the country. The reporter agreed not to divulge questions that could be used in future tests but was otherwise free to describe what took place at the meeting.</p>

<p>The behind-the-scenes look at the making of the new SAT suggests that there is no single formula for achieving a high score on the writing portion of the test, and that formulaic writing can result in a lower score. At the same time, it is legitimate to wonder whether the eccentric spark of genius will be rewarded when thousands of test-graders across the country try to implement the guidelines established by the experts.</p>

<p>Unlike traditional multiple-choice questions, some of which will also be on the writing portion of the SAT and are scored by computer, the new essay portion represents a logistical challenge akin to a military operation. Each essay will be scanned into computers and read by at least two scorers. A force of 3,000 scorers, mainly moonlighting teachers, is being deployed at 15 regional centers. Scorers must read an average of 220 essays in eight to 10 hours.</p>

<p>Scoring sessions</p>

<p>The preparatory range-finding session is a cross between a debate among art critics on public television and the judging of an ice-skating competition. Bursts of passionate discussion are followed by the grading of the essays, with scores from 0, for a blank sheet of paper or an essay that has nothing to do with the topic, to 6. If two scorers differ by more than one point, a supervisor is summoned to adjudicate. The cumulative score can fall anywhere between 0 and 12.</p>

<p>To guide scorers, the team has already approved a sample set of answers to a question about the benefits and drawbacks of secrecy. The "prompt," as an essay question is called in education parlance, consists of two quotations, one justifying secrecy as an indispensable part of human life, the other attacking it. Students are then asked to develop a point of view on secrecy, with examples to support their argument.</p>

<p>An essay that does little more than restate the question gets a 1. An essay that compares humans to squirrels — if a squirrel told other squirrels about its food store, it would die, therefore secrecy is necessary for survival — merits a 5. Brian Bremen, an English professor at the University of Texas, Austin, notes that the writer provides only one real example. Nevertheless, he says, the writer displays "a clear chain of thought" and should be rewarded.</p>

<p>The panel overlooks a few grammatical errors and misspelled words. "F. Scott Fitzgerald once handed in a manuscript with seven consecutive misspelled words," Bremen says. "If you can write like F. Scott Fitzgerald, you will be OK."</p>

<p>"We rewarded [the squirrel paper] because it was unique, and the student came up with it in 25 minutes," says Noreen Duncan, who teaches African-American literature at Mercer County College in New Jersey. Some mistakes are permissible, she says, because anything that can be written in that time is, by definition, "a first draft."</p>

<p>"Holistic scoring"</p>

<p>The team uses a technique known as "holistic scoring," a euphemism for reading an essay very quickly (a minute or so per paper) and making a snap judgment. This is not like grading a school essay, in which points are deducted for uncapitalized letters or an insufficient number of paragraphs. The scoring technique puts a premium on a student's ability to develop a logical chain of reasoning over the mechanics of writing.</p>

<p>Most scorers end up within a point of each other on most essays. The discussions at the range-finding sessions are designed to establish guidelines for dealing with the difficult-to-categorize essays, many of which will probably be kicked up to a supervisor.</p>

<p>At the sample-scoring session, an Illinois high school teacher named Bernard Phelan is keeping everyone entertained with his pointed comments. "This essay has the ring of empty assertion," he says of one effort, which ends up with a 2. "The student is telling us, 'I don't have an example, and I'm not about to provide one any time soon.' "</p>

<p>He awards another essay a 5. "Some kids write but don't think," he explains. "This kid thinks as she writes. There is some awkwardness here, but she moves fluently from topic to topic."</p>

<p>By 5 p.m., after eight hours of scoring essays, the eyes of the 13 panelists have begun to assume a glazed look. "The lighting in here is terrible," Vickers suddenly notices.</p>

<p>The panelists decide to tackle one final essay, which has received scores ranging from 2 (seriously limited) to 5 (reasonably consistent mastery). The essay is two pages long and is sprinkled with academic-sounding words such as "commodity" and "value."</p>

<p>Ed Hardin, an expert with the College Board, makes a stab at reading the essay out loud. He had awarded it a 5 on the basis of his first impression and the sophisticated vocabulary but changes his mind as he tries to make sense of the stilted prose.</p>

<p>"Somebody is going to have to buy me a drink," he groans halfway through the reading."</p>

<p><a href="http://seattletimes.nwsource.com/html/nationworld/2002153350_sat18.html%5B/url%5D"&gt;http://seattletimes.nwsource.com/html/nationworld/2002153350_sat18.html&lt;/a&gt;&lt;/p>

<p>I think the biggest problem with the SAT writing is consistency. My daughter received a 12 on the essay her first time out, while she got only a 9 on the second try. Her multiple choice scores on the writing section remained consistent, differing by (if I remember correctly) only 20 points.</p>

<p>I've read some of the threads on CC questioning possible scoring errors on the October SAT writing section due to this kind of unexplained drop. I think it's just a different set of graders, for a different question. Even with a scoring rubric, there's still room for interpretation and therefore subjectivity. As the poster above indicates, <em>when</em> the essay comes up for review may also be a factor, as the graders cannot help but suffer from mental fatigue after reading essay after essay on the same topic. </p>

<p>Maybe the old SAT II Writing test had the same problems, or maybe, because fewer students took it, the graders were not quite as numbed by it all.</p>

<p>momwaitingfornew, you make an exceptionally good point about the consistency of scores and quality control. I wonder just how much a different prompt can elicit a different approach, perhaps a different type of introduction or conclusion, and how subtle and even less subtle changes might influence the "holistic" impression and score. This is interesting since now more students will take and then re-take the SAT, whether as sophomores, juniors, or seniors; after all, most students took the old subject test only once (as seniors) and then were done with it.</p>

<p>In any case, several of the previous posts about the scoring rubric and the existence of writing "guidelines" do make me wonder just how the new SAT writing section will put pressure on schools to teach writing and what seems to be the current buzzword "content literacy" (the "Reluctant R").</p>

<p>
[quote]
Finding the Formula</p>

<p>While experts generally agree that the writing exercise will raise the profile for writing instruction in the high school curriculum, some worry that it will lead more teachers to adapt or reduce their instruction to a formula for success on the exams.</p>

<p>“If more writing of any kind is better than not writing, then the changes are a good thing,” said Randy Bomer, the president of the National Council of Teachers of English, which is based in Urbana, Ill. “The danger, of course, is that, as with most tests, people will come to believe rightly or wrongly that there is a single formula for getting a top score” and spend the bulk of their instruction on that genre.</p>

<p>Such a formula is likely to tend toward the “five-paragraph essay” favored by many state writing tests. Such essays include an introductory paragraph with a topical or thesis statement, three supporting paragraphs, and a concluding statement.</p>

<p>That format, contends Mr. Bomer, an assistant professor of language and literacy studies at the University of Texas at Austin, fails to foster creativity or innovation in writing.</p>

<p>But College Board officials say that teaching structured writing can help students become more creative writers.</p>

<p>“To teach a certain kind of structured writing can be true instruction,” said Chiara Coletti, a spokeswoman for the New York City-based board. “If you look at the scoring guide for the new SAT, you see that style is rewarded, logic is rewarded.”</p>

<p>While Ms. Coletti says that students cannot necessarily prepare for the writing portion of the test, teachers are building exercises into their lessons to familiarize students with the format and help them complete the task within the time constraints.

[/quote]
</p>

<p><a href="http://www.edweek.org/ew/articles/2005/02/02/21satwriting.h24.html%5B/url%5D"&gt;http://www.edweek.org/ew/articles/2005/02/02/21satwriting.h24.html&lt;/a&gt;&lt;/p>

<p>Well, I'm concerned by someone's comment (I think Calmom?) that CB's aim was for the writing score to be more closely aligned with the CR, and that since girls do better on CR and writing mechanics, they should score higher on essays than boys do. Maybe I've misinterpreted, but that sounds like the tail wagging the dog. Just because someone is good at reading critically (as CB measures it) and at writing mechanics, this does not mean said person will write well - grammatically correctly, yes, but not necessarily more than that. So why gear a test, or the scoring of it, to come out with the results you want? Isn't testing supposed to be about finding what is true, rather than deciding ahead of time what should be true and then manipulating the test or scoring to come up with the desired result?</p>