SAT concordance table - compare old and new SAT scores

@Mamelot I don't know what they will do long-term with the PSAT. They've got themselves stuck in a difficult position by designing it to be easier than the SAT (thus the 1520 score cap). When I heard they had chopped off the highest range of scores, I assumed they had decades of evidence that, even at the highest ranges, students tended to increase their scores between October and spring of their junior year. It doesn't make any sense to me that scores would change that much in such a short amount of time. If they don't, and if the PSAT scores really are designed to mimic the score you would have gotten on the SAT that day, then they are going to continue to have problems with kids crowding the very top of the score range. I think that is part of what happened this year, and that is why I am still nervous about my kids' 223 SIs in DC.

I do think you are right that there was some intentional correction in the SAT after the PSAT experience. I would imagine that they will continue to make changes in the SAT too until the scores match what they distributed in the concordance spreadsheets. We won’t see any data until a year from now, at which point the scores averaged over the year will look close to what they published. I think the March - May kids had it the hardest.

@candjsdad I'm not sure what you mean… forgive me if I get this wrong. When you say "I think the March - May kids had it the hardest", do you think that a given score on the March test will correspond to a lower percentile among the October testers? For example, if a 1500 was 99th percentile among the March testers, it would be 98th percentile for the October testers?

@bucketDad I fear that will be the case, that since they already released the concordance tables, they will try to make sure the test conforms to them. Anecdotally it seems that the March test, at least, was likely harder than the concordance tables predicted. Since the College Board will only report full-year statistics, I think they will try to adjust the test so that they can report that it lined up with the score conversion tools they released. The truth is, though, that we’ll never know. So much for transparency.

When we compare the new SAT scores to old, do we need to keep in mind that the old percentiles reflected scores that are “below” while the new ones reflect scores that are “at or below”?

Example: D3 gets a 1500. Her score converts to a 2170 (old), a 1460 (old, CR+M) and a 33 ACT. The "converted" ACT score is in the 99th percentile, meaning that 99% of testers scored at or below a 33 (the definition is from ACT). The "converted" 2170 and 1460 are in the 98th percentile, meaning that 98% of SAT testers scored lower than 2170 and 1460.

So assuming that 1) the Score Converter is correct (a big IF); 2) the mapping between the ACT and the old SAT is accurate (it always appears to have been); and 3) I am remembering my 7th-grade pre-algebra correctly (another big IF), then I think this means that D3 scored higher than the 98th percentile but no higher than the 99th. Is that a correct conclusion?
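To make the "below" vs. "at or below" distinction concrete, here is a tiny sketch in Python (the score pool is invented purely for illustration - it is not real CB data):

```python
# Tiny sketch of the "below" vs. "at or below" percentile definitions.
# The score pool below is made up purely for illustration -- not real CB data.

def pct_below(scores, target):
    """Old-SAT-style rank: percent of testers scoring strictly below the target."""
    return 100 * sum(s < target for s in scores) / len(scores)

def pct_at_or_below(scores, target):
    """ACT / new-SAT-style rank: percent of testers scoring at or below the target."""
    return 100 * sum(s <= target for s in scores) / len(scores)

# Hypothetical pool of 100 old-scale composite scores.
pool = [2300] + [2170] + [2100] * 48 + [1900] * 50

print(pct_below(pool, 2170))        # 98.0 -> "98% scored lower than 2170"
print(pct_at_or_below(pool, 2170))  # 99.0 -> "99% scored at or below 2170"
```

Since "at or below" can never be smaller than "below", the two definitions bracket the same score between consecutive percentiles, which is how I get "above the 98th but no higher than the 99th" for D3.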

Just for reference, her User percentile is 98th (meaning that 98% of the User group would have scores at or below her 1500). A tad on the low side but not out of the ballpark, especially given that the Converter provides just estimates.

One would expect the Score Converter to be more accurate around the median result, rather than the high end. That’s in part what makes the above analysis so interesting to me. Obviously there is a difference between 98th and 99th percentiles, but at least I see a ballpark estimate that seems to hang together.

I'm curious to know whether this works for anyone else. Once someone confirms that this sounds reasonable, I'll be happy to gather scores (current and converted) so that we can begin to piece together a "de facto" percentile table for the March SAT. I'd also be glad to take charge of that data gathering and post the table as it takes shape :-B

Of course, if my analysis is just off base, please let me know that too so that I can move on to laundry or dishes or something ;))

@bucketDad @candjsdad - According to CompassPrep, everyone should be looking at the percentiles falling out of the concorded scores, rather than anything reported on the current score report. (That probably makes sense given the fiasco with the PSAT percentiles.) So, regardless of what "eventual" percentile we can arrive at for a current score of, say, 1500, it will ALWAYS concord to a 2170. And that will ALWAYS mean whatever percentile 2170 pertains to - and THAT percentage won't change. Stick with the concordance tables - they were derived for college admissions committees, so they are going to be the best estimates for understanding current scores.
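In other words, the workflow is always the same: concord the new score first, then read the percentile off the old tables. Here is a minimal sketch using only the handful of values quoted in this thread (the real lookups would come from the CB concordance spreadsheets and the 2015 percentile tables):

```python
# Partial, illustrative lookups only -- just the values mentioned in this thread.
NEW_TO_OLD_2400 = {1500: 2170, 1540: 2260}   # new SAT -> concorded old SAT (2400 scale)
OLD_2400_PERCENTILE = {2170: 98, 2260: 99}   # old SAT -> percentile per the 2015 tables

def percentile_via_concordance(new_sat: int) -> int:
    """Concord a new-SAT score to the old 2400 scale, then look up that old score's percentile."""
    old_score = NEW_TO_OLD_2400[new_sat]
    return OLD_2400_PERCENTILE[old_score]

print(percentile_via_concordance(1500))  # 98 -- a 1500 always concords to 2170
```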

I'm not yet convinced scores were lower on the March SAT. DS improved 100 points (to 1540) with no studying. Of course, I was also a "true believer" in the percentiles for the PSAT, so I have a bad track record!

@VryCnfsd what does your DS’s score convert to for old SAT and ACT?

ACT 34, Old SAT 2260, Old SAT CR+M 1510 if I am reading the concordance table right.

@Mamelot Do you think a 99th percentile score on the March test could concord with a 98th percentile score on the old SAT?

OK, thanks. And sorry - I actually do have the Excel spreadsheets on my computer, so I could have found that out myself. His old 1600-scale score would have been 1510, just to add to the data. All obviously within the 99th. Does he have any real previous ACTs or SATs to refute the possibility of being in the 99th, or was the PSAT his first standardized score?

If @VryCnfsd's son's 1540 is truly below the 99th percentile, the score converter would need to show that, because otherwise it'll be quite a joke on college adcoms.

@bucketDad, the best estimates for the March test percentiles are what comes out of the concordance tables. One can only use that info to verify the "user group" percentiles or refute them and come up with a new table. Hope that makes sense.

Here are percentile tables from the previous SAT and ACT:

https://secure-media.collegeboard.org/digitalServices/pdf/sat/sat-percentile-ranks-composite-crit-reading-math-writing-2015.pdf

https://secure-media.collegeboard.org/digitalServices/pdf/sat/sat-percentile-ranks-composite-crit-reading-math-2015.pdf

http://blog.prepscholar.com/act-percentiles-and-score-rankings

And for extra fun:

http://www.studypoint.com/ed/sat-to-act-conversion/

DS has taken the ACT, and his score concords with his PSAT score, but he also knew coming out that he hadn't performed up to his abilities. His recent retake will be a better gauge.

@VryCnfsd OK, I can see why you are now a bit skeptical, although you also seem to think the recent retake will be higher (am I reading that correctly?). So... do you think your son is capable of a 99th percentile score or not? The concordance table is what it is and isn't going to change. (And it's moot in 15 months anyway, so why would it?)

Yes, we'll be surprised if the ACT retake isn't higher. If the ACT comes back 33-34, which is what he expects, then that will be right in the ballpark of the concordance table. Maybe others are seeing scores that indicate the concordant ACTs and old SATs should be higher, but I don't see it for DS. They look pretty accurate, or perhaps a bit too high. It would be interesting to get more data.

Here is a data table that I made from the percentile tables I posted earlier and the Concordance Tables. I only went to the 90th percentile but would be happy to expand later on. Comments and feedback welcome!

Hopefully the titles are self-explanatory but in case not, in order: Percentile, Old SAT Score (2400 basis), Old SAT Score (1600 basis), ACT score, Concordance to New SAT (from the old 2400 score), and finally, concordance to the New SAT (from the old 1600 score).

The last two columns should - and largely do - replicate each other. The only column I have not included is a conversion of old SAT to ACT at the given percentile. When I do that, it looks like the CB's concordance is a tad low on the ACT side (33 should be 34, etc.), but maybe I goofed something up, so I'll take another look at that later.

The goal here is to map percentiles to New SAT scores.

Note: I took the LOWEST score for the given percentile (e.g., 2310 begins the 99+ range, 2220 begins the 99th, and so forth).

Keep in mind that the percentiles have slightly different meanings as I noted earlier (for ACT and new SAT, it’s “at or below”, whereas for old SAT it’s “below”).

Percentile   Old SAT (2400)   Old SAT (1600)   ACT   New SAT (from old 2400)   New SAT (from old 1600)
99+          2310             1560             --    1560 - 1570               1570
99           2220             1500             33    1520 - 1530               1530
98           2160             1460             32    1490 - 1500               1500
97           2110             1430             31    1470                      1480
96           2080             1400             --    1450                      1450
95           2050             1380             30    1430 - 1440               1430
94           2020             1360             --    1420                      1410 - 1420
93           1990             1340             29    1400                      1400
92           1970             1330             --    1390                      1390
91           1950             1320             --    1380                      1380
90           1930             1310             28    1370                      1370
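For anyone who wants to double-check or extend the table, here is a rough sketch of the lookup logic, using just the rows above from the 99th down to the 90th percentile (the full inputs would be the 2015 percentile PDFs and the CB concordance spreadsheets):

```python
# Rough sketch of how the rows above were assembled. All numbers are copied from the
# table; they are not a complete reproduction of the underlying CB documents.

# Lowest old-SAT (2400-scale) score that opens each percentile band.
OLD_2400_FLOOR = {
    99: 2220, 98: 2160, 97: 2110, 96: 2080, 95: 2050,
    94: 2020, 93: 1990, 92: 1970, 91: 1950, 90: 1930,
}

# Concordance of those old scores to the new SAT as a (low, high) range, per the table rows.
CONCORD_2400_TO_NEW = {
    2220: (1520, 1530), 2160: (1490, 1500), 2110: (1470, 1470), 2080: (1450, 1450),
    2050: (1430, 1440), 2020: (1420, 1420), 1990: (1400, 1400), 1970: (1390, 1390),
    1950: (1380, 1380), 1930: (1370, 1370),
}

for pct in sorted(OLD_2400_FLOOR, reverse=True):
    old = OLD_2400_FLOOR[pct]
    lo, hi = CONCORD_2400_TO_NEW[old]
    new_sat = str(lo) if lo == hi else f"{lo} - {hi}"
    print(f"percentile {pct}: old SAT {old} -> new SAT {new_sat}")
```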


@Mamelot You said: “the best estimates for the March test percentiles are what’s coming out of the concordance tables”

I see what you are saying. But it's my understanding that the concordance table is a product of the CB research study and was not influenced by the actual March test results. I'm hesitant to apply the percentiles inferred from the concordance to the March test… but I guess that's where we are. In any case, colleges will learn the true percentiles for the new SAT as the applications start rolling in. The relative strength of a particular score should become apparent.

@bucketDad do we actually know how the concordance tables were constructed? With the PSAT there were significant differences between the concordance tables and the User Group results, and yet it turns out that the concordance tables predicted the Commended cutoff accurately. It's pretty obvious that the CB made some (major?) adjustments before releasing those scores.

For the SAT, all we know is that the tables have been prepared for use by colleges during this transition year (note: this won't be an issue for the class of 2018) and that it's probably safe to say that colleges WILL be using them. What other method is out there for converting scores en masse? Sure, a college MAY devise some other method, but that's speculative thinking for the most part. And how could it do so without relying on something that may well be pretty subjective? At least with the concordance tables you are using the same tool for every applicant - and the same tool that other colleges are using. That seemed to be the purpose of the tables, regardless of how they were constructed.

We do know that the concordance tables were developed from a research study. I posted a quote from the College Board materials on one of these threads. While I do think schools will use the concordance tables, especially to compare to the ACT, I also think they will have a better idea than we ever will of what scores are coming in. Highly selective schools will notice if the scores they get from the new SAT are lower than the scores they typically expect. In the end, these exams are only a way for them to get a relative ranking. If the test scores seem lower than the concordance tables would suggest, colleges will adjust relative to the population that is applying.