Does Anomalous Standardized Scoring Information Indicate a Significant Lack of Diligence by U.S. News?

If it breaks down that way in practice, my guess is that it’s the latter. If, in fact, US News gets its data through school participation (rather than, as we sometimes suppose, a US News data gatherer plucking numbers from wherever they can be found), I would guess Wesleyan would report the scores of admitted and enrolled students who submitted them for their admission decision, and not the other scores. If one wanted to imagine a nefarious intent behind reporting the lower aggregate range in their CDS reports, it might be to encourage more kids to apply. The optimistic supposition would be that they do it for transparency. Who knows.

But if Wes and Bowdoin are lowering their score ranges for rankings and CDS purposes, it would seem something ought to be done about that so the ranking inputs are standardized. If the goal of any ranking is at least in part to assess “who goes to school and where,” it would make sense for the ranking services to require schools to report all of their scores, as Wes and Bowdoin do, and to provide the percentage of students who submitted scores for admissions purposes.

In any event, there were several other changes to the ranking methodology that plausibly could have factored into Wesleyan’s rise from 17 or whatever it has been to where it is ranked now.

You can see an example questionnaire USNWR sends to colleges at https://oir.uga.edu/_resources/files/usnews/2022USNews.pdf. USNWR requests scores for “enrolled, first-time, first-year (freshman) degree-seeking students – full or part-time – who submitted test scores.” USNWR only requests scores for enrolled students – not admitted students. They also ask, “What percent of first-time, first-year (freshman) degree-seeking students who enrolled submitted SAT scores?”

As you noted, colleges have different motivations for choosing which enrolled students’ scores to include in the USNWR questionnaire and the CDS. For the highest ranking, it’s advantageous to report the highest possible scores to USNWR, so it’s preferable not to include scores from test-optional matriculating students who submitted them in the summer before attending at the college’s request (for statistical and class-placement purposes). However, it may be advantageous to include these students in the CDS listing, since having extremely high 25th-percentile scores can discourage kids from applying.

However, for Wesleyan and Bowdoin, it may be more for historical reasons. Wesleyan and Bowdoin were test optional prior to COVID, when nearly all attending students had taken the SAT/ACT at some point but test-optional kids chose not to submit. Prior to COVID, most peers were test required. When test-required peers are reporting scores for ~100% of students, it’s more meaningful for test-optional colleges to also report scores for ~100% of students in the CDS, rather than artificially boost scores by reporting only test submitters. Wesleyan and Bowdoin started reporting scores for test-optional kids (when available) prior to COVID, and did not change this policy after COVID, even though peers were no longer test required.

That works well in a test-required system. It does not work well in a test-optional system. Many enrolled students do not submit scores, and the proportion of kids who submit varies from college to college. Most colleges do not request that test-optional matriculating students submit scores in the summer before attending; Bowdoin and Wesleyan are the exceptions, rather than the rule. Even if colleges do request scores from test-optional kids, a good portion have never taken the SAT/ACT, so they have nothing to submit.
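To make that concrete, here is a minimal sketch with made-up numbers (not any college’s actual data) showing how the choice of which enrolled students to count shifts the reported 25th–75th range: including the (on average lower-scoring) test-optional students who supply scores over the summer pulls the range down relative to counting application submitters only.

```python
# Minimal sketch with made-up numbers (not real data for any college).
# Compares the reported SAT range when counting only application submitters
# vs. all enrolled students with a score on file (Wes/Bowdoin-style CDS).
import numpy as np

rng = np.random.default_rng(0)

# Students who submitted scores with their applications (self-selected, higher on average).
submitters = rng.normal(1480, 60, size=600).clip(400, 1600)

# Test-optional enrollees who had scores but withheld them from admissions,
# later supplying them at the college's request for placement/statistics.
optional_with_scores = rng.normal(1380, 80, size=300).clip(400, 1600)

def cds_range(scores):
    """25th-75th percentile range, rounded to the nearest 10, CDS-style."""
    return tuple(int(round(x, -1)) for x in np.percentile(scores, [25, 75]))

print("Submitters only:", cds_range(submitters))
print("All enrolled with scores on file:",
      cds_range(np.concatenate([submitters, optional_with_scores])))
```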

Very helpful. I didn’t realize the questionnaire was available. With that much specificity, it’s hard to imagine that Wesleyan did anything other than submit scores according to the stated parameters. I can’t fathom a Wes staffer saying, “Not entirely sure what ‘enrolled’ means. Let’s provide them the ‘admitted’ numbers and call it a day.” So it seems the culprit is both Wes’s practice of including total-enrolled-population numbers in its CDS and the still other cuts at the data on its website, certainly lending to the confusion if not causing it. As you point out, the fact that Wes and Bowdoin are outliers in their CDS practices doesn’t help the situation at all. Amplifying here, though, what @circuitrider wrote: nobody bothered to look at Bowdoin … until now, that is.

Totally agreed. It’s a mess. It was more of an aspirational thought on my part … if this TO world is to continue, then we need some way of lining everybody up so that good comparisons can be made, if we are to include test scores in those comparisons. You have varying percentages among schools of kids who are admitted without scores, which is flaw #1. Then you have two schools that typically show up in the T20 (and maybe others) that burden their CDS enrolled numbers with scores of students who purposefully and strategically did not include them in their admissions applications. Flaw #2. And now, as you point out, with all of these new TO schools and the lack of clarity to applicants that they ever have to take a standardized test (i.e., kids planning to go TO from the get-go), requiring all scores (whether used in admission or not) will still get you a mixed bag. Flaw #3.

I’m not sure how you get to that conclusion with a pretty big piece of the puzzle missing. The data questionnaire US News sends to schools is fairly specific about which data to submit. That Wes adds other data presentations on its website doesn’t necessarily mean it submitted that other data to US News. I think the only way we could know that it did would be to see the submission. I for one find it hard to believe that Wes would confuse admitted vs. enrolled, particularly since their CDS reports show they are more than aware of the difference. Of course, if they had done this with intent, it would not only be cheating, but unbelievably stupid cheating, since they serve up admitted numbers on their website and total enrolled scores on their CDS.

This got lost somewhere upthread, but it seems that the school profile - the bundle of information you get when you click on the school from the list - includes test score information from some other data set … admitted, or enrolled sans those scores not used for admissions decisions. Either way, it’s possible that those profile scores are not the scores used in the ranking, which I presume are sourced from the questionnaire.

Don’t mind me, I’m just a voice crying in the wilderness.

Touché!


To me, the real question is why USNews would resort to such markedly different language for its webpage (“applicants admitted to this school”) when the terminology on its methodology questionnaire offered so many more common usages (“enrolled, first-time, first-year (freshman)”).

Is it possible that this is their way of making the pretty significant disparity between Wesleyan University’s and Bowdoin College’s rates of reported test-takers (which are historical) and those of the other T10 LACs (which have been TO only since COVID) a little less confounding?

In other words, is it just possible that Wesleyan and Bowdoin’s USNews profile pages are correct after all?

I also listed this discrepancy in my initial post. The USNWR website says “admitted” students, but the listed scores match enrolled students. The questionnaire sent to colleges only asks about “enrolled” students, so USNWR should not have information on admitted students.

My guess is it relates to different people at USNWR having different goals and motivations. Whoever created the text on that particular page of the USNWR website may have thought it sounds better, from a marketing and subscription-rate perspective, to say “admitted” on the USNWR “admissions” page rather than enrolled/matriculated. The group that handles the questionnaires sent to colleges uses more precise language that is not as marketing-friendly or casual-reader-friendly. They don’t casually interchange admitted vs. enrolled, and they spell out full-time vs. part-time, transfer vs. first-year freshman, degree-seeking vs. non-degree-seeking, submitted vs. non-submitted, etc.


Well, to be more precise, they match the scores of enrolled students who have submitted their aptitude test results. Isn’t that the point you are trying to make?


I mean, aside from the discussed issue with the few colleges that ask test-optional kids to submit scores prior to matriculation (Bowdoin and Wesleyan), the USNWR stats match the enrolled-student CDS stats, implying USNWR is listing enrolled stats, not admitted stats.

For example, continuing with the #11-ranked LACs I listed earlier, the 2022 CDS “enrolled” score ranges and the USNWR score ranges match exactly. Note that I am listing the CDS 25th-percentile math + 25th-percentile EBRW, rather than the CDS 25th-percentile composite; the USNWR survey requests the former, not the latter. This is consistent with the lack-of-precise-language explanation above – the USNWR text implies students’ combined scores fall within the listed range, when the low end is actually the sum of the 25th math + 25th EBRW percentiles. This lack of precise language sounds better from a marketing/subscription perspective. (A short sketch after the table below illustrates the difference.)

CDS and USNWR Score Range for LACs ranked #11
Barnard – CDS = 1440 to 1550, USNWR = 1440 to 1550
CMC – CDS = 1440 to 1550, USNWR = 1440 to 1550
Middlebury – CDS = 1410 to 1540, USNWR = 1410 to 1540
Grinnell – CDS = 1380 to 1530, USNWR = 1380 to 1530
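As a rough illustration of that last point, here is a small sketch with simulated (not real) section scores showing that the sum of the section 25th percentiles is generally lower than the 25th percentile of students’ actual composite scores, which is the figure the USNWR wording seems to suggest.

```python
# Hypothetical simulation (not real data): sum of section 25th percentiles
# vs. the 25th percentile of actual composite (math + EBRW) scores.
import numpy as np

rng = np.random.default_rng(1)
n = 500
math_scores = rng.normal(740, 40, size=n).clip(200, 800)
ebrw_scores = rng.normal(720, 40, size=n).clip(200, 800)

# What the CDS/USNWR low endpoint is built from:
sum_of_section_25ths = np.percentile(math_scores, 25) + np.percentile(ebrw_scores, 25)

# What a casual reader might assume the low endpoint means:
composite_25th = np.percentile(math_scores + ebrw_scores, 25)

print("Sum of section 25th percentiles:", round(sum_of_section_25ths))
print("25th percentile of composite scores:", round(composite_25th))
# Unless section strengths are perfectly correlated across students,
# the first number comes out lower than the second.
```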


And none of them require test optional kids to submit scores prior to matriculation as far as you know. Isn’t that correct?


Presumably, we can put this thread to rest:

"After publication of the 2024 Best Colleges, an anomaly in the code used to output those rankings was discovered. As a result of correcting the anomaly, the rankings of 213 schools have changed from what was published on September 18, 2023. We therefore published updated rankings for those schools on October 27, 2023.

“Below is a listing of each school impacted by the anomaly in the code and a notice of what that school’s rank was upon publication and what its rank is after the correction. The schools are sorted by their category and then by their corrected rank.”

Corrections to the 2024 Best Colleges Rankings | Morse Code: Inside the College Rankings | U.S. News (usnews.com)

Looking at National Universities and LACs, it seems like most were moved to lower rankings.

Since I don’t see a proportional rise in rankings in their linked charts, I assume they didn’t re-report the now-elevated rankings of other schools, which might automatically have risen one spot.

E.g., if someone moves from 1st to 4th (in the regionals), then someone else is now on the podium?
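For what it’s worth, here’s a toy example (hypothetical schools and scores) of that mechanical point: when a correction knocks one school down, re-sorting necessarily moves the schools immediately below it up a spot, whether or not those moves get re-published.

```python
# Toy example with hypothetical schools and scores: a correction that drops one
# school's score shuffles everyone below it up a spot when ranks are recomputed.
scores_before = {"School A": 95, "School B": 93, "School C": 92, "School D": 91}
scores_after = dict(scores_before, **{"School A": 90})  # correction lowers School A

def ranks(scores):
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {school: i + 1 for i, school in enumerate(ordered)}

print("Before correction:", ranks(scores_before))  # A=1, B=2, C=3, D=4
print("After correction: ", ranks(scores_after))   # B=1, C=2, D=3, A=4
```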