Pointing to the admitted-student data rather than the enrolled-student data artificially inflates the stats and doesn’t really pass the smell test for me. I don’t recall seeing this at other schools.
Which set of stats are students using to decide whether or not to send their scores?
The Common Data Set uses enrolled students and shows a 1310/1490 spread, with 650/750 on reading and 650/760 on math. It’s for 2021/22, so I’m not sure where the other data is from. Maybe 2022/23?
I’d use the enrolled numbers because those are the students who actually came. Interestingly, their percentage submitting test scores came way up from the year prior, into the high 70s.
I agree, it’s squirrelly. There should be one set. I think the higher numbers make them feel they come across as more elite.
On the flip side, someone could say you’re enrolling weaker students than you admit, i.e. the higher scorers are turning you down for other schools.
Far from being “squirrelly” or failing the “smell test,” I think Wesleyan should actually be given credit for complete transparency on this issue. Giving more information than is required, i.e. both “admitted” and “enrolled” stats, should be celebrated, not condemned. As the original poster said, I don’t recall seeing this at other schools.
I would look at the middle spread. I can’t imagine that the only students enrolling are the bottom outliers. If you’re in that 710 to 750 range, I doubt submitting will hurt you, even if it doesn’t help you.
However, do we know if these are all scores that were submitted with the application? Some schools request them upon enrollment for data gathering and analysis. In other words, are these numbers being heavily influenced by low scorers who were admitted without a test score?
Those numbers (750/750) are actually listed under “Profile of First Year Students,” which suggests they describe enrolled students; that in turn contradicts the other numbers (710/700) listed for the same class under “Wesleyan at a Glance.” Worse than a failed smell test, this means one spot or the other is showing false information. If I were in the admissions office, I’d want to fix it pronto.
The gap is hardly surprising in a test-optional (TO) setting. The admitted-student data reflects only the scores applicants chose to submit. At Wes, at least, everybody who enrolls has to report their score, so the enrolled data set presumably includes lower scores that the admitted data set leaves out; a student admitted test-optional, for example, shows up in the enrolled numbers but never in the admitted ones.
If there is a breakdown in communication elsewhere, I am confident it was unintentional, based on my familiarity with the school and their MO. Someone should just call and point it out. Perhaps I will.
I don’t understand why providing more data rather than less is viewed negatively. Assuming both sets are accurate, I don’t see the issue.