<p>In another example of the benefits of these boards, I just learned that while the raw score to percentile information offered on practice tests is aggregate information, actual SSAT percentiles provided to a test taker are against a much more focused population, taking into account grade level, geography and gender. This leads me to a question...</p>
<p>Has anyone ever seen data on the percentile spreads by grade level (or geography for that matter)? Seems like the data for this exists and has been gathered for some time. It may not ever have been published. For example, if rising 8th grader A gets a raw quant score of 45 and rising 10th grader B gets the same raw score, has anyone seen any actual data around what the resulting difference between percentiles tends to be? I know this would vary test to test by actual population, but am more curious if there is any generalized sense of the range of the adjustment - e.g. each grade lower tends to be about a 5% or 10% percentile uptick at the same raw score.</p>
<p>I suppose someone with kids in two different grades who took the same test might have some idea of the relative impact. I'd be curious to know if anyone had any intel...</p>
<p>I seem to recall the practice booklet provided by the SSAT included a conversion chart, so that one could convert the raw score to the percentile ranking, by grade and gender. However, the scores will vary from one sitting to another, as each iteration of the test will be slightly easier or harder. The scores are adjusted to account for differing tests. It is possible to write a perfect test, but not receive a perfect score.</p>
<p>And then, I gather that tests like the SAT (and I presume the SSAT is similar) have questions of differing difficulty. So the absolute number may not matter as much as the relative difficulty of the questions–i.e., everyone may get 3/4 of the questions right, but the difference between test takers may be how they fare on a relative handful of questions.</p>
<p>As to the difference between grade levels, I assume teachers would have a good sense of what you’d expect an 8th grader to have mastered, vs. a 9th grader. That’s often covered by curriculum, isn’t it? But, as private schools are often at least a year ahead in curriculum subjects covered (or at least the small sample of private schools I know are–YMMV), you’d need the data from the SSAT organization to know what they’d expect a private school 8th grader to know vs. a public school 8th grader, vs. respective 9th graders. I assume the Common Core has/will throw everything up in the air.</p>
<p>@Periwinkle I understand your points about the differences in material by grade level, and the fact that different tests have differing levels of difficulty, as well as different profiles of test takers. The conversion chart you mentioned sorting by grade and gender is exactly the information I am after, but I have not yet run across it. If you know where to find the conversion table online, please post it. In reality, there isn’t a terribly important practical purpose to the question as it does not change anything in terms of how to prepare for the test, or what the outcome will be. Still, as a matter of curiosity, I am intrigued by how much of a difference grade levels in particular make.</p>
<p>I think I heard that the SSAT revamped the test, and/or scoring some time ago, so any info I would have on hand is now out of date. The best thing to do is order the test booklet from the SSAT. $35 well spent.</p>
<p>I think you are approaching this (assuming you have children who might apply) as if the people receiving the test scores are as data-oriented as you. That’s not necessarily the case. The admissions officers might be more influenced by a well-written essay than a few percentiles of difference in scores. While I believe the essay is not scored, the schools receive it. </p>
<p>I don’t think decisions come down to a percentile point either way. The interview, extracurriculars, GPA and recommendations probably count much more than whether a kid scores in the 95th percentile or the 93rd.</p>
<p>@Periwinkle I doubt very much that 1% (or perhaps even as much as 10%) makes any real difference with all the other factors in the consideration process. I also am completely certain that no AO would approach anything in as data-oriented a manner as this line of questioning. I was mostly curious to know whether there were very large differences in raw score to percentile conversions across grades, which at least according to the info posted above, there aren’t. As noted earlier, the scores will be what the scores will be. If anything, the benefit in having some clarity would just be in giving context to practice test percentile results.</p>
<p>One more item to nerd out on… Found an old data set from a post converting total score to percentile for help in estimating practice test results. You can use a polynomial function for the fit as follows: </p>
<p>For the original data set provided, use the quadratic y = ax<sup>2</sup> + bx + c, where </p>
<p>a = -0.000387197<br>
b = 1.826581635
c = -2054.543713</p>
<p>Results in an R<sup>2</sup> of 0.9995</p>
<p>Table is below where x = total score, y = percentile from original data set and fit = calculated value based on formula:</p>
<p>If anyone has a different or more recent data set correlating scores and percentiles, I’d be happy to run the new numbers. If you want to do it yourself, Excel makes short work of the problem with the LINEST function…</p>
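<p>For anyone who would rather script this than use Excel, here is a minimal Python/NumPy sketch using the coefficients quoted above. The sample scores below are illustrative points only, not real SSAT data, and the whole exercise assumes the quadratic fit from the old data set still roughly holds:</p>

```python
import numpy as np

# Fitted coefficients from the thread's quadratic (y = a*x^2 + b*x + c)
a, b, c = -0.000387197, 1.826581635, -2054.543713

def estimated_percentile(total_score: float) -> float:
    """Estimate a percentile from an SSAT total score via the quadratic fit."""
    return a * total_score**2 + b * total_score + c

# numpy.polyfit does the same job as Excel's LINEST for polynomial fits.
# As a sanity check, fit degree-2 coefficients to points generated by the
# formula itself; the original a, b, c should come back (up to rounding).
scores = np.array([1800.0, 1950.0, 2100.0, 2250.0])  # illustrative scores
pcts = estimated_percentile(scores)
coeffs = np.polyfit(scores, pcts, 2)  # returns [a, b, c], highest power first

print(estimated_percentile(2200))  # roughly the high-80s/low-90s range
print(coeffs)                      # approximately the original a, b, c
```

<p>With a real (score, percentile) table, you would replace the generated points with the actual pairs and read the refit coefficients off `coeffs`.</p>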
<p>If the official SSAT book is too much of a pain to order (it’s only available through them, I think), I’m pretty sure one of the third-party test prep books (I think we used McGraw-Hill?) has a chart where you can read -iles vs. grades for a given raw/scaled score.</p>
<p>SevenDad, I can find no statement that the SSAT organization shares data with third parties. They state on their website that they don’t release old tests.</p>
<p>@Periwinkle: I’ve since recycled the books, but do recall them having some chart of -iles per grade. Perhaps accompanied by some disclaimer. I just remember that when 7D1 took her first practice tests using one of the third party books, her score was in the low 90-iles. I don’t know how I’d remember this if I hadn’t had a raw/scaled/-iles chart to consult. </p>
<p>I think a lot of the books have percentile data in the back. It is less clear where it comes from, but in all cases I have seen, it is aggregate data not broken down by grade level, or international/domestic. The only grade level breakdowns I have seen are for the median/mean scores.</p>
<p>@blackbeard: It’s been a few years now, but I think the chart I referenced did have grade and gender…but not international/domestic. So Peri could be right in that the old charts are useless in the wake of SSAT’s change in how they do the -iles.</p>
<p>I have found SSAT to be receptive to questions, btw. I wrote to them years ago on the “is it possible to write a perfect test but not get a 2400” question and they wrote back promptly with a clear explanation.</p>
<p>“Overall, the SSAT measures appear to explain only about 15% of Choate GPA, with SSAT Quantitative as the only variable making a statistically significant contribution to this prediction.”</p>