I’m getting the feeling that the CB is making some of this up as they go…
Seems to me that the employee(s) who put together the PSAT sample score report are probably the same ones who put together the JUNE SAT where the timing was wrong. :-S
I’ve heard that some employers are now asking prospective employees for their past SAT scores…
Is College Board one of these?
Sorry, forgot the [sarcasm] tags. Perhaps they need to be.
Does anyone know how this year’s PSAT is graded? Is it safe to answer every question, or is there still a guessing penalty like in previous years?
“I think it’s interesting that complete scoring information isn’t included in the printed materials they gave the kids. Probably that wasn’t ready in time either. Maybe that is why there is so much confusion on how to calculate the scores.”
@mathyone doesn’t Khan Academy explain how to score the PSAT? I’m not sure what all was included in the Khan tutorials, but I just went off the “Scoring Your Practice Test” PDFs from the CB website, which were available for both the practice PSAT and the practice SATs (D3 practiced on two of those in addition to the PSAT). Scoring is pretty simple, and it’s best to sit down and actually score a test to see how it works. I actually found it much simpler than scoring the old SAT, and the cross-sectional scoring really helps you understand where in a subject you might need more work.
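If it helps anyone, the whole flow fits in a few lines of Python. The conversion numbers below are placeholders I made up just to show the shape of it; the real raw-to-test-score tables are in the scoring PDFs:

```python
# Sketch of the flow in CB's "Scoring Your Practice Test" PDFs.
# The conversion tables below are MADE-UP placeholder fragments;
# substitute the real values from the scoring guide.
READING_TABLE = {45: 36, 46: 37, 47: 38}  # raw correct -> test score (8-38)
WRITING_TABLE = {42: 36, 43: 37, 44: 38}
MATH_TABLE    = {46: 37, 47: 38, 48: 38}

def score_psat(reading_raw, writing_raw, math_raw):
    """Raw score is just the number answered correctly -- no guessing
    penalty on the redesigned test."""
    r = READING_TABLE[reading_raw]
    w = WRITING_TABLE[writing_raw]
    m = MATH_TABLE[math_raw]
    ebrw = (r + w) * 10   # Evidence-Based Reading & Writing section, 160-760
    math = m * 20         # Math section, 160-760
    return ebrw, math, ebrw + math  # total runs 320-1520

print(score_psat(47, 44, 48))  # (760, 760, 1520) with these placeholder tables
```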
A few people are hung up on a sample Score Report that was never included with the answer materials for the PSAT practice test. They are not even in the same section of the website, fer cryin’ out loud. Maybe the confusion is that people mistook the “Sample Score Report” for the “Scoring Your PSAT Practice Test” document? Not sure.
But just to clarify - the materials WERE made available in plenty of time - it was not a rush job but something that was rolled out months ago. I’m no fan of the College Board, and there are PLENTY of reasons to be critical of them, but this ain’t one of them. Sorry to spoil the fun, all.
There are several threads on this page with people asking about the scoring, and even some posts where people explain it incorrectly. The test booklet doesn’t include the information you need to calculate your score. You don’t find that odd? See the poster immediately prior to your post - that student doesn’t know how it’s scored.
The score report is on CB’s main page for explaining PSAT scores. https://collegereadiness.collegeboard.org/psat-nmsqt-psat-10/scores/student-score-reports
I am not “hung up on it.” I am, however, befuddled that they published an obviously flawed document. I expected their score report to mirror the information in their released test. Publishing a document they describe as “Explains the sections of the paper student score report. Can be used as a guide for reviewing the score report with students,” when it in no way matches their own published scoring criteria, makes the document itself rather unhelpful.
Would it not be more helpful if the documented scoring and resulting analysis of student academic ability were actually related? (Ironic, a little??)
@mathyone you sound like you suspect a conspiracy. This will be more interesting than the discussions about test content . . .
@Mom2aphysicsgeek the only way they could really make the “Sample Score Report” relevant for each student would be to have the website include an interactive “Score Report Generator” that spits out what your student’s practice test results would look like in “Score Report” format. Not a bad idea . . .
@kaitefoxfire, if you want to know how the test is scored you should start with this page:
https://collegereadiness.collegeboard.org/psat-nmsqt-psat-10/scores/understanding-scores
I used the Wayback machine, and the Sample PSAT Score report was originally posted to the web on September 12. I don’t think 1 month and 2 days qualifies as “months”. It appears from the internal PDF document properties that the Adobe Illustrator document that the PDF came from was originally created on July 31.
The page that announced the 320-1520 score range for the PSAT has an internal HTML date of 8/27, modified 9/9. It is not archived in the Wayback machine, so I don’t know what changes were made on 9/9.
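For anyone who wants to double-check, the internal dates are easy to pull out of the PDF yourself. A quick sketch (the filename is hypothetical - use whatever you saved it as; needs pip install pypdf):

```python
# Read the PDF's internal metadata, including the creation date
# and the application that produced it.
from pypdf import PdfReader

reader = PdfReader("psat-sample-score-report.pdf")  # hypothetical filename
info = reader.metadata
print(info.creation_date)  # when the source document was created
print(info.creator)        # e.g. the Adobe Illustrator version used
```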
All the information about scores prior to early September (going back to last spring) indicated that math and reading/writing would be weighted 50/50 for NMSF. That made me suspicious at the time that they hadn’t worked out the details, but it took until September for them to publish anything about a different weighting.
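For reference, the weighting CB finally published is simple to state: double the sum of the three test scores (each on the 8-38 scale), so reading and writing together count for two-thirds of the index and math for one-third, not the 50/50 split from last spring. A minimal sketch:

```python
def selection_index(reading, writing, math):
    """NMSQT Selection Index: 2 x (R + W + M test scores).
    Each test score runs 8-38, so the index runs 48-228.
    Verbal (R + W) gets 2/3 of the weight, Math 1/3."""
    return 2 * (reading + writing + math)

print(selection_index(38, 38, 38))  # 228, the maximum
```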
@Mamelot, I don’t suspect a conspiracy, just some incompetence.
I never expected the sample report to show accurate data, just show what ‘type’ of info would be there. That’s just me, I guess.
False. They published scoring information for their released test, and they must have estimated stats for those scores. Creating tests is based on modeling. Creating a document based on their internal analysis of their released scoring data is absolutely possible. It is a model report for one student, not an analysis for every student.
@2muchquan I could see that if the info on every page was different and just a sampling. But the data matches on every page except the actual score sheet.
ETA: actually, I didn’t do the math. I’m going to do that now. I just assumed the math was correct.
Ok, I had to look back and remind myself how to actually create the scaled score, but yes, even the index score matches. The only part of the report that makes no sense is the analysis based on the graded answers. (A rather humorous discrepancy.)
@Ynotgo the practice tests for the SAT and PSAT were available in June. And while I read about the 50/50 scoring of the selection index on some independent blogs (PrepScholar, for instance, and this has since been updated), I didn’t read that from CB itself. My impression was that the “experts” were guesstimating, and once the actual index calculation info came out they corrected their blog posts.
Are you saying that CB released their selection index calculation, then actually CHANGED it? That seems a bit odd . . . .
@2muchquan - Agreed! It’s silly to expect it to correspond to one particular PSAT practice test. (And this is PSAT Practice Test #1; presumably that means there will be a #2, #3, etc. down the road.) The purpose of the sample is to explain the sections, not to describe the results of a particular practice test. It was never represented as such. Though I’m guessing after this conversation CB will add a disclaimer to the webpage LOL . . . .
Hey guys, so based on the scoring system for the practice PSAT released by CB, one wrong on math would translate to a perfect score (i.e., 47 raw is a 760). Here is the link to the released scale: https://collegereadiness.collegeboard.org/pdf/scoring-psat-nmsqt-practice-test-1.pdf (page 7). How accurate do you think this scale will be for the real test? I know it’s hard to predict, but I’m really disappointed in myself for making a couple of silly mistakes in math. Any ideas?
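No way to know for sure - what you’re really asking about is the shape of the curve near the top, and that changes from test to test. Only the 47 → 760 pair below comes from the released scale (page 7 of that PDF); the other numbers are made-up stand-ins for two possible curves:

```python
# Two hypothetical curves for the top of the math scale.
# Only 47 -> 760 is from the released Practice Test 1 scale;
# everything else here is invented for illustration.
GENEROUS = {48: 760, 47: 760, 46: 740}  # like Practice Test 1
HARSH    = {48: 760, 47: 740, 46: 720}  # a stricter hypothetical curve

raw = 47
print(GENEROUS[raw], HARSH[raw])  # 760 vs. 740 for the same raw score
```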
Also, anyone who wants to know anything and everything about PSAT scoring and the NMSQT qualifying selection index should check out the following CC thread, particularly post #31:
What is silly about it? Based on their own released materials and indexing, approximate scoring can simply be worked out in reverse. A converted R score of 18 means around 11 correct, a W of 20 means around 16 or 17 correct, and a 24 in math means around 17 correct. That is based on their own indexing. Why bother giving students an approximate index score if it is completely invalid?
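To make the “in reverse” step concrete, here’s a tiny sketch that inverts the raw-to-test-score mapping. The table fragments just encode the approximations from this post, nothing more:

```python
# Invert the raw-score -> test-score mapping to recover raw scores.
# These fragments encode only the approximations quoted above.
R_TABLE = {11: 18}           # ~11 correct -> Reading test score 18
W_TABLE = {16: 20, 17: 20}   # ~16-17 correct -> Writing test score 20
M_TABLE = {17: 24}           # ~17 correct -> Math test score 24

def raw_scores_for(test_score, table):
    return [raw for raw, ts in table.items() if ts == test_score]

print(raw_scores_for(20, W_TABLE))  # [16, 17]
```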
I live in Texas and I’m just asking generally, but do you think I could make National Merit Scholar with the following scores?
Math: -0
Writing: -1
Reading: -5
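Nobody can answer that without the real curve and the state cutoff, but here’s the arithmetic you’d do, using the practice-test question counts and placeholder conversions - swap in the real table values once they exist:

```python
# Hypothetical walk-through for the misses above. The test-score
# conversions are MADE-UP placeholders, not from any real table.
QUESTIONS = {"reading": 47, "writing": 44, "math": 48}
misses    = {"reading": 5,  "writing": 1,  "math": 0}

raw = {s: QUESTIONS[s] - misses[s] for s in QUESTIONS}
# raw == {'reading': 42, 'writing': 43, 'math': 48}

# Placeholder test scores (8-38 scale) for those raw scores:
test_scores = {"reading": 34, "writing": 37, "math": 38}

print(2 * sum(test_scores.values()))  # 218 with these made-up numbers
```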
@Mom2aphysicsgeek at this point I’m guessing the College Board made the sample so off-kilter precisely because there are a bunch of scrupulous parents on sites like CC “guesstimating” their child’s PSAT scores and Qualifying Index (and subsequently NM potential) from a bunch of raw scores. That way NO one can complain that it appeared to be even remotely representative, or that somehow everyone was misled when the real test results rolled around.
But if they had shown Ima B. to be an “A” student by scoring her up in the top percentiles, someone would probably complain that she’s not representative of most of the students who take the test.
You can’t win for trying, College Board. Sorry.