In the case of Wesleyan, U.S. News reported standardized test score information for accepted students, which does not comport with the information in Wesleyan’s Common Data Set. For the other colleges I have checked, the profiles represent enrolled students and comport with those colleges’ CDS information.
These are the U.S. News profiles for Wesleyan:
SAT middle range: 1440–1550
ACT middle range: 33–35
These are the CDS profiles for Wesleyan:
SAT middle range: 1320–1510
ACT middle range: 30.5–34
Of the questions this discrepancy raises, the most concerning may be whether this information was also used in determining Wesleyan’s rank (directly or indirectly) and, if so, whether the rank for Wesleyan, as well as those for other colleges, can be trusted. Even if one does not accept, or fully accept, the principles of the U.S. News rankings, discrepancies such as this seem problematic.
I pointed out this possible misinformation with a fair degree of hesitation. Nonetheless, I posted this topic to get additional sets of eyes on anything I might have missed or misinterpreted. The topic is intended for CC members who have an interest in analysis. The reliability of U.S. News is the topic here, not Wesleyan. (Although, if Wesleyan has, likely unintentionally, provided misinformation to U.S. News, then Wesleyan would be part of the problem. I have no idea.)
No need to explain. I’ve been reading your posts for a very long time and it never crossed my mind that your motives were nefarious. I know you to be a ranking enthusiast and truth seeker.
I myself have to apologize and need to better recognize that my sarcasm doesn’t always come across clearly to everybody. But just to be clear, almost all my posts today about the rankings and movements were entirely tongue-in-cheek, which is consistent with my previously stated view of the relative importance of numerical rankings of schools. With that said, I hope you will change your mind about my apparent comfort level with misinformation. I have no such comfort level.
On that point, while I, too, have no real idea about the discrepancy you were able to uncover in your investigations, it might be worth considering that, at least to my best knowledge, Wesleyan is one of a pretty small number of TO schools that require all matriculating students to submit their scores before they enroll. So, all TO kids at Wes have to report the scores that they didn’t want to report. And it is those numbers, I believe, that Wesleyan reports to the world in its CDS. The numbers on their website under “Profile of the Class of ____” are the numbers they have in their possession well before the TO students submit their scores.
If we’re concerned about transparency and accuracy, then I would think we would be curious about the fact that Wes’ apples have historically been compared to most other schools’ oranges. Those missing numbers would no doubt bring peer composites down. Or at least that’s what I would expect.
I took a quick look at the methodology write-up. I’m not going to be able to figure this out, but I note that Wesleyan’s rise could also be explained by several of the changes (increases and decreases) in the weighting of particular factors. These include: an increased weight on student debt, alongside Wes’ new “no loan” policy; an increased weight on faculty pay, where Wes has always done well; an increased weight on student/teacher ratio, where Wes again does well; a decreased weight on spending per student, where Wes pays a price for being a little larger than its peers; and I’m sure others. A toy sketch of the re-weighting point follows below.
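Not that anyone should put too much stock in it, but here is a minimal sketch of that re-weighting effect with entirely made-up factor scores and weights (nothing below reflects USNWR’s actual model or Wes’ actual data): when the weights shift toward factors a school is strong on, its composite rises even though none of its underlying numbers changed.

```python
# Toy illustration only -- hypothetical factor scores for one school (0-1 scale)
# and hypothetical old/new weights; not USNWR's actual model or Wesleyan's data.
school = {"grad_debt": 0.90, "faculty_pay": 0.85, "s_f_ratio": 0.80, "spend_per_student": 0.55}

old_weights = {"grad_debt": 0.03, "faculty_pay": 0.07, "s_f_ratio": 0.01, "spend_per_student": 0.10}
new_weights = {"grad_debt": 0.05, "faculty_pay": 0.08, "s_f_ratio": 0.03, "spend_per_student": 0.08}

def weighted_score(scores, weights):
    # Weighted average over just these four factors, normalized by total weight.
    return sum(scores[f] * weights[f] for f in scores) / sum(weights.values())

print(round(weighted_score(school, old_weights), 3))  # about 0.712 under the old weights
print(round(weighted_score(school, new_weights), 3))  # about 0.754 under the new weights
# Same underlying data, higher composite, simply because the up-weighted
# factors are ones this hypothetical school already does well on.
```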
I don’t know what US News does with Wes’ test numbers. Maybe they used the CDS numbers in the model and published the others in the profile. IDK. Consider calling or emailing US News and pointing it out. I doubt anybody here is going to do anything about it.
There is bad data everywhere -- colleges with satellite campuses; colleges which offer three different enrollment programs (August, which is what they often report; January, which is for wait-listed students and “almost but not quites”; and June, which can include “you need a summer of quasi-remediation before the actual coursework starts”).
Which is why the “rankings” should be taken with a bucket of salt. Not to mention that colleges with large transfer populations can pretty much ignore test scores altogether.
If your question is “Are the US News rankings reliable” the answer is no. If the bigger question is “given the unreliability of the US News rankings, is there still value in using them” my answer would be yes. There are differences between a college ranked #12 and #112, no matter what the “all colleges are pretty much the same” crowd says.
But are the actual, individual numbers meaningful? Not in my mind. Too much noise in the data; too many places where professional judgement needs to be applied, not a broad brush.
There is no category for “accepted students” in Section C of the standard Common Data Set questionnaire (it refers only to enrolled students and/or “freshmen”). The errant score was probably lifted by some USNWR proofreader from the Wesleyan Admissions Office webpage.
Agree with @blossom that you can drive yourself crazy trying to control for all the statistical noise out there.
The US News pages for Wesleyan and a handful of other colleges I glanced at all say:
These are the average scores of applications admitted to this school. Ranges represent admitted applicants who fell within the 25th and 75th percentile.
Edit: never mind, I skimmed earlier and now see that you all noticed that this is a difference between admitted student scores and enrolled student scores. (For anyone not aware, a college’s admitted student score range is virtually always higher than the range for enrolled students.)
I had long assumed that US News pulls from IPEDS, but I guess not! The only way US News gets admitted score data is directly from Wesleyan.
Could this be a change in the US News process or methodology? Or was it always this way? I have never used US News for finding score data, assuming it to be unreliable due to potential time lag.
Every college profile I have looked at on US News shows admitted student test scores. Obviously that’s going to be different than the enrolled student scores in CDS.
I just looked at last year’s US News page for Wesleyan via WayBack. The footnote about admitted student scores is there. However, the range listed is 1300-1510, which would fit a composite calculated by adding section scores from the 2021-22 CDS. Perhaps a clue.
I randomly checked another school, this time Northwestern. US News currently says 1490-1570. Even though this is supposedly admitted student data, the range reported on the current US News page matches a composite calculated by adding the section scores in the 2022-23 CDS. Last year’s US News page has 1460-1560, which also matches a composite calculated from the section scores reported in the 2021-22 CDS.
I still can’t see where this year’s Wesleyan score range in US News (1440-1550) is coming from.
Just an observation: I’m not sure which CDS this comes from. The 2022-23 CDS for Wesleyan has a reported composite range of 1310-1505. One would only get 1320-1510 by adding the section scores.
Yes, I’m aware that’s inaccurate. I’m just trying to figure out what US News did.
Another random school, Virginia Tech. The composite range reported on the current US News matches a range calculated by adding the section scores from the 2022-23 CDS. And last year’s range reported by US News matches a range calculated by adding the section scores from the 2021-22 CDS.
We have a pattern.
This wouldn’t be the first time a website tried to calculate a composite range from reported section score ranges, without realizing that’s incorrect. (The composite data field showed up first in the 2019-20 CDS, if I recall, and some colleges still don’t report that field.) But why does US News state that these score ranges refer to admitted students? I suspect incompetence.
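For anyone who wants to see why the “add the section ranges” shortcut is incorrect, here is a minimal sketch with synthetic, made-up scores (nothing below comes from Wesleyan’s or any college’s CDS): the 25th/75th percentiles of students’ actual composites generally do not equal the sums of the section percentiles.

```python
# Synthetic example: adding the 25th/75th percentiles of the two SAT sections
# does not, in general, reproduce the 25th/75th percentiles of students'
# actual composite scores. All numbers here are simulated, not real CDS data.
import numpy as np

rng = np.random.default_rng(0)

# Simulate 1,000 enrolled students with correlated but not identical sections.
sat_ebrw = np.clip(rng.normal(720, 50, 1000), 200, 800).round(-1)
sat_math = np.clip(sat_ebrw + rng.normal(0, 60, 1000), 200, 800).round(-1)
composite = sat_ebrw + sat_math

def mid50(x):
    """Return the 25th and 75th percentiles (the 'middle 50%' range)."""
    return np.percentile(x, [25, 75]).round()

print("Section ranges added together:", mid50(sat_ebrw) + mid50(sat_math))
print("True composite range:         ", mid50(composite))
# The two ranges usually differ: the section-sum range tends to be wider,
# because it implicitly pairs a student's weak EBRW with a weak Math (and
# strong with strong), which few real students do on the same sitting.
```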
In doing this (adding the section scores), I followed the common practice of U.S. News (even though true combined middle ranges may be available).
With respect to other posts, note that USN apparently uses “admitted” to refer to enrolled students.
The discrepancy arises from Wesleyan showing standardized test score profiles for accepted applicants, while other colleges (that I’ve checked) show profiles for enrolled (matriculated) students.
That’s because you’re committing the same speed-reading error USNews apparently did by not scrolling down the page. The page is divided between stats useful for gauging admissions chances (“Students Offered Admission”) and, further down, the official Class Profile, which is broken into brackets covering matriculated students only.
Wesleyan needs to do itself a favor and not include the TO scores in its CDS data until 99% of its peers do the same. It brings its scores down, and it is getting no positive PR juju for the transparency. Just do what everyone else does: report the scores submitted for admission among enrolled students, and withhold the test scores of those who submitted after admission per the school’s policy. The reported scores will be higher. And just drop the voluntary “admitted” disclosure on the website, especially if it’s causing confusion. In fact, in a “no good deed goes unpunished” sort of way, they’re open to criticism for providing the extra information.
One thing I don’t know about TO is whether a kid who knows they’re not a great test taker can just skip the SAT / ACT altogether so that they never have anything to submit, even after admission. I’ll bet there are plenty who fall into that category.
I thought most schools helped USN by cooperatively sending data. Are we sure Wes didn’t give them the wrong data (that is, they used enrolled instead of admitted)?
This is apparently USN’s term for students who walk through a college’s gates to start classes (i.e., they have been “admitted,” in a sense). It’s an ambiguous usage, but if the ambiguity is thought through (using context), USN’s profiles can be understood clearly.
It would be helpful if you could name just one, other than Wesleyan. I have yet to find another example.
I’ve long been well familiar with Wesleyan’s site, top to bottom. The topic here pertains to USN’s information for Wesleyan, and why (or whether) it differs qualitatively from that shown for other colleges.