@hebegebe I didn’t take those posts as complaints. I see lots of posts on CC asking the question “which score should I submit?” When the comparison is with the new SAT, the answer is almost always the same.
@ncmama34 that’s interesting: my D3 took the March test and missed one additional on Reading but received a 37/40 subscore. June Reading must have been more harshly curved.
@Mamelot Is there any way that the score could somehow be wrong? Or is it relatively common for some tests to have extremely harsh scales? My son is my oldest, so I have not been through this before.
I did check an older post somewhere and it gave a pic of an Old SAT Reading section score report. There were 67 questions. You could miss 10 and get a 700. I calculated it and you could miss 15% and still get a 96%ile.
On his June 2016 SAT exam, there were 51 questions. You could miss 4, which is equivalent to missing 8%, and have a raw score of 36. This concorded to an old 680 or 94%ile.
It seems like such a big change, especially given that he has a record of scoring in the 99%ile on reading comprehension questions. It is just a bit difficult to understand.
Maybe the concordance is correct and he had a day that was not his best. But if so, wow. The test has really gotten easier if you used to miss 15% of questions and get a 96%ile and now miss 8% of questions and get a 94%ile!
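For what it’s worth, the arithmetic in the two comparisons above works out like this (question counts and cutoffs are taken as quoted in the posts; this is just a back-of-the-envelope check, not official data):

```python
# Miss-rate comparison from the posts above.
old_total, old_missed = 67, 10   # old SAT Reading: miss 10 of 67 -> 700, ~96%ile
new_total, new_missed = 51, 4    # June 2016 Reading: miss 4 of 51 -> old 680, ~94%ile

old_miss_rate = 100 * old_missed / old_total
new_miss_rate = 100 * new_missed / new_total

print(f"Old SAT miss rate: {old_miss_rate:.0f}%")  # ~15% missed for 96%ile
print(f"New SAT miss rate: {new_miss_rate:.0f}%")  # ~8% missed for 94%ile
```

So the post’s point holds: you now have to miss roughly half as many questions to land at a slightly *lower* percentile.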
I think it’s indisputable that a 34 / superscored 35 is better than a 1500 (new) SAT. There’s no reason at all to send in the 1500, IMO. (Not that I think the 1500 will hurt.)
Personally, I’m having DD take the ACT, despite having a superscored 1500 on the new SAT. If she gets a 34, she’s only going to send the ACT. With a 33, she’ll probably send both…
[QUOTE=""]
I did check an older post somewhere and it gave a pic of an Old SAT Reading section score report. There were 67 questions. You could miss 10 and get a 700. I calculated it and you could miss 15% and still get a 96%ile.
[/QUOTE]
The old SAT was known for dropping 50 points when you missed a single question.
[QUOTE=""]
On his June 2016 SAT exam, there were 51 questions. You could miss 4, which is equivalent to missing 8%, and have a raw score of 36. This concorded to an old 680 or 94%ile.
[/QUOTE]
D3 took the March 2016 exam and missed 5 out of 52 on the Reading, which is about 90% correct. She received a 37, which concords to a 700 on the old test. So you can see right away that the raw score doesn’t translate into the same scaled value even on the same underlying test. Different administrations of the test vary in difficulty, and the College Board adjusts for this by applying harsher or more lenient scales. As you can imagine, this gets a lot more complicated when you try to equate two very different tests!
And that’s the key thing to remember about the rSAT. It’s a NEW test (so kids and prep experts haven’t learned the tricks yet), and it’s a DIFFERENT test that assesses different skills. Those two factors combined might well explain some of the harsh concordances on the reading/writing side.

But there is another factor to consider: if you look at the concordance tables for the 10-40 scales (reading and writing), you’ll notice that a one-point drop in the scaled score produces a big drop in the concorded value. Why? Probably because the current test isn’t well constructed to resolve the highest part of the distribution, where you need to “stretch out” the scaling in order to distinguish the fairly high performers from the high, the very high, and the uber-high. There were several comments about this on the National Merit Prediction thread a few months ago. In short, any translation at that part of the curve may carry a much larger margin of error than one closer to the median score.
Colleges will see this if they use the concordance tables. For the Reading, a lot of high-performing kids will come in at 680, 700, 720 . . . or 760, 790, 800. There won’t be any 730s, 740s, or 750s, which is pretty weird. Writing is very similar. This is precisely why several posters think some colleges will look closely at these rSAT scores to see whether they make sense relative to the ACT and old SAT scores of similar-quality applications.
In looking at percentiles, don’t even bother with the “national representative sample” (or whatever it’s called). That’s not relevant for college-bound kids who want to compare their scores to other testers. Look at the User percentiles instead. But keep in mind that these might well be inaccurate: at this point they are based on research studies rather than real administrations of the SAT (eventually the standard will again be previous administrations; for this first year, that’s obviously impossible).
Finally, given your son’s ACT, submitting the SAT in addition won’t hurt him in the least. I wouldn’t worry too much about this. First of all, a 1500 is still a great score overall! Second of all, his math is very strong and schools don’t usually ding you for having high math SAT/ACT’s. Icing: If he is looking at National Merit, he is most likely fine with a 1500 as a confirming score.
Edit/addendum: @ncmama34
@Mamelot Thank you so much for the time you took to explain that. It really gives me a better grasp on what is going on. He is looking at NMSF (222 SI) and he was hoping the 1500 would count as a confirming score. Thanks again!
@ncmama34 - we are hoping the same thing! My D3 might be NMSF and her SAT is also 1500.
Just came upon something interesting. UMich converted its students’ scores to the new SAT range for the Class of 2016, and it seems to me it was done to inflate its stats. I doubt their mid-50 range this year for the Class of 2017 will match up. I wonder if other schools see the opportunity to do the same.
thanks @itsgettingreal17 Guessing that it’s more to assist those applying this fall in understanding where their scores lie. They provide both old and new ranges. I wish more schools did this.
If you want to get an idea of how to earn certain scaled scores on the SAT, then check out the College Board’s score conversion tables for Tests #5 and #6 (the Saturday and Sunday versions of the May SAT).
Every SAT is curved slightly differently depending on overall student performance, but Tests #5 and #6 give a much more reliable picture of a realistic “raw score to composite score” scale than Tests #1-4 do. Tests #5 and #6 were nationally administered, so they provide a great deal of raw data on test-taker performance, while Tests #1-4 were never administered to students as actual SATs; they were written just for the Official SAT Study Guide (2016 Edition).
There is a lot of confusing new subscore mumbo-jumbo on these scoring sheets, but the most important page is page 7, where you calculate your overall composite scores in (1) Reading and Writing (200-800) and (2) Math (200-800).
Test 5: https://collegereadiness.collegeboard.org/pdf/scoring-sat-practice-test-5.pdf
Test 6: https://collegereadiness.collegeboard.org/pdf/scoring-sat-practice-test-6.pdf
One useful metric, in my experience? Grading the SAT as you would a test in high school. In other words, simply find the overall percentage of questions that you answered correctly:
1450 = 90% correct (A minus)
1500 = 95% correct (A)
1550+ = 97%+ correct (A plus)
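As a quick sketch of this heuristic in code (the 154-question total is my assumption based on the new SAT’s 52 Reading, 44 Writing, and 58 Math questions; the percentage cutoffs and rough score equivalents are just the ones listed above):

```python
# "Grade the SAT like a high-school test" heuristic from the post above.
# TOTAL_QUESTIONS = 154 assumes 52 Reading + 44 Writing + 58 Math (new SAT).
TOTAL_QUESTIONS = 154

def percent_correct(missed):
    """Percentage of questions answered correctly after missing `missed`."""
    return 100 * (TOTAL_QUESTIONS - missed) / TOTAL_QUESTIONS

def letter_grade(pct):
    """Map percent correct to the rough letter grades quoted above."""
    if pct >= 97:
        return "A+ (~1550+)"
    if pct >= 95:
        return "A (~1500)"
    if pct >= 90:
        return "A- (~1450)"
    return "below A- range"

# Missing 5 questions out of 154 is about 96.8% correct.
print(letter_grade(percent_correct(5)))
```

Obviously each administration’s curve differs, so treat this as a ballpark gut-check, not a predictor.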
Will colleges view a 1520 on the new SAT as equal to a 34 ACT, which is what the concordance tables say it equates to?
Yes, I think so, @buffalo11. I believe a 35 = 1550.
Hi,
I was wondering if I have to pay each time I want to send my SAT scores? I took the SAT in March and paid to send the scores to a few schools. I took the test again in October and got a higher score, so I want those same colleges to have that score as well. I paid to send them again, but I am wondering if that was unnecessary since I already paid to send my earlier scores. Does that fee include sending subsequent scores to the same college?
You should not send scores until you are finished testing, @rgriff117. You could have used Score Choice to select which scores to send. They charge by the number of colleges you send to, not by the number of scores, so you would pay the same price to send 4 SAT sittings and 8 Subject Tests to 5 schools as you would to send 1 SAT and 1 Subject Test to 5 schools. But the way you did it, you have to pay again, because you didn’t wait until testing was complete.
Also, @rgriff117 stop posting the same question in several random threads!
@rgriff117 No, you will need to send them again.
From Williams College:
Applicants to the Class of 2021 had the opportunity to submit standardized test scores for the ACT, the redesigned SAT, or the old SAT, and their test score averages are in line with previous Early Decision cohorts: an ACT average of 33; old SAT averages of 731 in critical reading, 727 in math, and 725 in writing; and redesigned SAT averages of 724 in evidence-based reading and writing and 720 in math.
Scores from the redesigned SAT are lower than the old SAT?
Quite a number of colleges have added the following note to their 2016-2017 Common Data Set:
“Do convert New SAT scores (2016) to Old SAT scores
using the College Board’s concordance tools and tables (Score Comparisons – SAT Suite | College Board).”
E.G. Dartmouth (http://www.dartmouth.edu/~oir/pdfs/cds_2016-2017.pdf)
Screw the CB concordance table - the percentiles for the New SAT (https://collegereadiness.collegeboard.org/pdf/understanding-sat-scores-2016.pdf) and the Old SAT (https://secure-media.collegeboard.org/digitalServices/pdf/sat/sat-percentile-ranks-crit-reading-math-writing-2015.pdf) don’t even match up.
E.g. A 96% percentile in Math equates to a 740 for both the New SAT and Old SAT based on the above PDFs
However, when you convert the 740 in the New SAT to the Old SAT (using the CB converter), it gives you a 710.
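Using just the numbers quoted above, here’s a quick sketch of the discrepancy (all figures are as reported in this post from the two CB percentile PDFs and the CB converter; I haven’t independently verified them):

```python
# Percentile-vs-concordance mismatch from the post above.
new_sat_740_percentile = 96   # per the new-SAT percentile PDF
old_sat_740_percentile = 96   # per the old-SAT percentile PDF
concorded_old_score = 710     # CB converter output for a new-SAT Math 740

# If the percentile tables agreed with the concordance, a new 740
# (96%ile) would concord to the old test's 96%ile score, i.e. an old 740.
percentile_implied_old_score = 740

gap = percentile_implied_old_score - concorded_old_score
print(f"Concordance vs. percentile gap: {gap} points")
```

That 30-point gap is the whole complaint: the two College Board documents imply different old-SAT equivalents for the same new-SAT score.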
I agree. I think the concordance tables are off.
I am not surprised, and I predicted that the new scores would be lower because:
- fewer prep materials and cram courses were ready for the 2016 new SAT;
- there were fewer chances to take the new test multiple times for superscoring;
- there were fewer opportunities for cheating by obtaining recycled or actual old tests early.
Hopefully the private universities will drop the concordance tables and just use their own judgement!