SAT concordance table - compare old and new SAT scores

Of course, CB is publishing two percentiles, “user” and “national”. Presumably “user” is what corresponds to the previous SAT percentiles, like here:
https://secure-media.collegeboard.org/digitalServices/pdf/sat/sat-percentile-ranks-crit-reading-math-writing-2014.pdf

Here’s the thing I don’t get. For example, DD got a 760 on the new math, which was 99% (national) / 98% (user). It supposedly concords to a 740 on the old SAT.

But those percentiles (from the 2014 SAT) show that even the 98th percentile on the old SAT is a 770! That doesn’t make sense… so shouldn’t the 760 (new) concord to a 770 (old)?

Actually, @thshadow - CB is making available three sets of percentiles. “National”, “User” and “Concorded”.

Your daughter’s 760 is exactly what my D3 got: 98th “User” but 96th “Concorded - 2015” (a 740 is in the 96th percentile according to that table). Since the concorded numbers are supposedly based on “equi-percentiles,” that 96th percentile is just as valid - probably even more valid - than the “user” percentile, which is based on a research study.

What bugs me is that CB concorded using something other than the “User” percentiles. What percentiles did they use? Actual? And why not disclose that?

What seems to be going on with the new SAT is the same situation that Appleroth commented on regarding the PSAT. The User percentiles are “inflated” relative to the concorded values.

The important thing for selective schools is that they have a measure to show performance relative to other students who are applying. In that sense they could easily create their own percentile chart through a simple field in their database – “Score X is at the 17th percentile of scores we have received.” That would allow them to compare all applicants to each other. Also, it would be fairly easy to create their own concordance table by assigning percentiles to all of the scores they received from certain test dates in 2015 and assigning percentiles to the scores they have received from corresponding dates in 2016. The scores at matching percentiles are in concordance. The scores apparently do tend to differ at different test dates, but if you can compare apples to apples, you should be able to come up with a concordance. That would be much more accurate than what we have from the College Board.
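The equi-percentile matching described above is simple enough to sketch. Everything below is illustrative only: the score pools are tiny and made up, and the function names are mine, not any office’s actual method. The idea is just “find the percentile rank of a new-SAT score among new-SAT submissions, then return the old-SAT submission sitting at that same percentile.”

```python
from bisect import bisect_right

def percentile_rank(pool, score):
    """Fraction of scores in the sorted `pool` at or below `score`."""
    return bisect_right(pool, score) / len(pool)

def score_at_percentile(pool, pct):
    """Score at percentile `pct` (0..1) of a sorted pool, nearest-rank style."""
    idx = max(min(round(pct * len(pool)) - 1, len(pool) - 1), 0)
    return pool[idx]

def equipercentile_concordance(old_pool, new_pool, new_score):
    """Map a new-SAT score to the old-SAT score at the same percentile rank."""
    return score_at_percentile(old_pool, percentile_rank(new_pool, new_score))

# Made-up 2015 (old SAT) and 2016 (new SAT) submission pools, 10 scores each.
old_pool = sorted([1600, 1710, 1810, 1900, 1980, 2060, 2150, 2250, 2320, 2380])
new_pool = sorted([1150, 1210, 1290, 1340, 1380, 1440, 1480, 1520, 1560, 1590])

print(equipercentile_concordance(old_pool, new_pool, 1440))  # → 2060
```

With real applicant data the pools would have thousands of scores and you would interpolate between ranks, but the mechanics are the same.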

@candjsdad that makes sense, but how do we know it’s more accurate? It actually sounds like what CB claims to have done.

@candjsdad edit/addendum: actually, for colleges that get a LOT of scores up at the high end of the distribution, it definitely makes sense for them to “stretch out” that portion of the curve the way you suggest. And there is evidence that all colleges do this anyway for the Common Data Set and similar (that mid-50 range comes from somewhere!). Even if CB’s concordance tables are generally accurate, they may be less so (much less so?) at the high end.

It’ll be interesting to see how colleges choose to report this transitional year. Will they have two sets of SAT scores on the CDS? Will they concord and just report one set? I know that UChicago said at a recent info session that it will continue to accept old scores up to the five-year expiration (or whatever the expiration date is). So there might be old scores floating around out there for a few more years!

The reason you cannot validly compare score percentiles of students who took each SAT is that there can be a skew among test dates. In particular this year, many of the well-informed students took either the ACT or the old SAT just because of all the uncertainty with the new SAT. I expect that the well-informed students tended to be higher performing students.

To create a valid concordance table, you really need to compare how a student who took both the old and new SAT did on each one. Or, prior to March, you would need to see how a student did on the valid part of the exam vs. the experimental part that represented the new SAT.
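That single-group design (same students, both tests) can be sketched directly. The pairs below are invented for illustration, and a real table would need thousands of them plus smoothing; this just shows the shape of the computation.

```python
from collections import defaultdict
from statistics import median

def single_group_concordance(paired_scores):
    """paired_scores: (new_sat, old_sat) tuples, one per student who took BOTH.
    Returns the median old-SAT score observed at each new-SAT score."""
    by_new = defaultdict(list)
    for new_score, old_score in paired_scores:
        by_new[new_score].append(old_score)
    return {new_score: median(olds) for new_score, olds in sorted(by_new.items())}

# Invented (new SAT, old SAT) score pairs for five students.
pairs = [(1440, 2060), (1440, 2100), (1500, 2200), (1500, 2180), (1560, 2300)]
print(single_group_concordance(pairs))  # → {1440: 2080.0, 1500: 2190.0, 1560: 2300}
```

Because every row comes from one student, the population-skew problem above disappears: you are comparing each student to themselves.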

Now it is entirely possible that the College Board screwed this up, but it is just silly to think that a typical admissions department has the skill to do this properly.

@hebegebe It would be pretty easy for colleges to test your hypothesis about well-informed students. A school could take all the old SAT and ACT scores from this year and compare them to the distributions of their applicants from previous years. If it skews higher, you are correct.
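That check is easy to sketch: compare this year’s old-SAT/ACT submitter pool to last year’s. The score lists below are invented and the function name is mine; a real analysis would use the school’s full applicant data and a proper significance test.

```python
from statistics import mean, quantiles

def skew_shift(prior_year, this_year):
    """Shift in mean and median between two score pools. Positive values mean
    this year's old-test submitters skew higher, supporting the hypothesis
    that well-informed students avoided the new SAT."""
    mean_shift = mean(this_year) - mean(prior_year)
    median_shift = quantiles(this_year, n=4)[1] - quantiles(prior_year, n=4)[1]
    return mean_shift, median_shift

# Invented old-SAT pools from submitted applications, two cycles.
prior = [1800, 1850, 1900, 1950, 2000, 2050, 2100, 2150]
this_year = [1850, 1900, 1950, 2000, 2050, 2100, 2150, 2200]
print(skew_shift(prior, this_year))
```

Here both shifts come out to +50 points, which is what the hypothesis predicts.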

@bucketDad,

Good point. I should have thought of that.

@candjsdad - a school making up its own percentiles might not help in comparing ACT to SAT. Presumably some kids will submit both, but probably a small minority. So if you figure out that the 90th percentile of received (new) SAT scores is X, and the 90th percentile of received ACT scores is Y, you shouldn’t just assume that a student with an X on the SAT did about as well as a student with a Y on the ACT.

U of IA uses standardized test minimums for direct admission to certain colleges (Business, Engineering, Ed. . . . ) and those #'s for the revised test are definitely from the concordance tables. That information is available on their admissions website.

Spoke to another large flagship this morning and they weren’t so direct. Very general responses about using in-house peeps who are experienced with this stuff and working alongside the testing agencies . . . nothing about the revised SAT in particular. They really wanted to make it sound like the revisions were no big deal. The overwhelming number of submissions are ACT for this school so maybe it really IS no big deal to them.

@theshadow I don’t know. The real problem would be if one population was consistently lower in quality than the other. I would tend to think that the populations are similar. That may not be true if the coasts still take the SAT at much higher rates.

We went to an open house at a school over the weekend where they showed the average and middle 50% for the current incoming class. I found it interesting that they converted the old SAT numbers of the incoming class of 2016 to rSAT numbers using the CB-supplied concordance table. Also interesting: their top 25% rSAT started at 1440, which concords to a 31 (a high 31, if that is actually a thing), but their top 25% ACT score started at 32.

These were just numbers to show potential applicants, but it does not look like they are trying to concord the two tests together yet. Also, this school put standardized test scores fifth on its importance list, so it may not be an issue for them.

@CaucAsianDad what were last year’s numbers for that school?

@itsgettingreal17

Freshman profile on school website:

Middle 50% SAT 1810 - 2060

Freshman profile as presented at open house:

Middle 50% SAT 1290 - 1440*

*rSAT


Straight off the CB concordance tables. I recognized the numbers because I converted them for my D.

I have been perplexed about the correspondence between my son’s June SAT scores and the concorded score. He made a 1500 with a 750 CR / 750 M. He missed 4 questions on Reading (36 subscore) and 1 on Writing (39 subscore). Though the percentile chart says his June 750 is 99th percentile, it becomes a 680 R and 760 W on the concordance calculator. Is it really the case that missing 4 questions on Reading gives you an (old) score of 680, which is 94th percentile according to the 2015 chart? It seems incredibly steep that missing 4 questions drops you from the 99th to the 94th percentile. Perhaps I am wrong about that…? In addition, my son has taken two ACTs and made a 35 and a 36 on the Reading portion, and he made a 740 CR out of 760 on the PSAT last year. I know sometimes kids just have a bad day, but with all the confusion over SAT scores, conversions, and percentiles, I can’t help but wonder if the June score of 750 is actually better than it concords. I would appreciate any thoughts or explanations. Thanks!

@ncmama34 My personal opinion is that the CB’s concordance is totally hosed. Anecdotal evidence gathered from this board (by looking at scores posted by students who took old SAT and new SAT, or ACT and new SAT) has me leaning in this direction. In the overwhelming majority of cases, it seems that students who took both tests have better scores for old SAT and ACT based on concordance. To me, this suggests the concordance is too harsh at the upper range.

Oh dear @bucketDad, I guess I need to read through this whole thread. If the concordance is not 100% accurate, the CB has put these kids in such a difficult position. Several counselors have suggested my son send his 34 ACT or 35 superscored ACT as well as the 1500 SAT. They feel that old-school counselors look at scores of 750 and above as highly competitive. But if the college does an immediate concordance calculation, then this score will be a weak mark on his application. But imagine we don’t send the SAT and then learn in a year or so that it actually would have been a good score and we withheld it. Ughh.

I have another question. Historically at my son’s school, high-scoring students usually take the SAT over the ACT. This year it appears a huge percentage of these high-scoring kids are turning in their 34s, 35s, and 36s on the ACT. Is it possible that with more high-scoring students turning to the ACT, a 34 (traditionally top 1%) could be attained by more than 1% of test takers? I guess the ACT would not adjust its percentiles in time for this application period. But I wonder if it could be the case.

@ncmama34 I’d send the superscored 35. Your son’s scores are yet another example of what we are seeing…the concordance is messed up. I would like to think that the more selective colleges will see this when they analyze the distribution of new SAT scores rolling in with the applications. If I’m right, then the 1500 will look as it should. That said, send the ACT scores.

For your second question…I wouldn’t worry. Those ACT scores should have the same meaning year after year, regardless of how many students choose to submit them vs. SAT scores for 2017.

You could very well be right, but realize that people who post represent a biased sample. People post much more often to complain than when everything is fine.