Admitted students yield at record rate

<p>Could the difference between the avg of students who submit SATs and the avg of all enrolled students really be as disparate as 1430 vs 1349? What % of accepted applicants submit SAT scores? I know at Bowdoin they've reported that 82% submit SATs, which I would guess increases their reported SAT avg by ~30 points. I've gotten the impression that Middlebury is lower than that, but I'm not sure by how much. If the spread of 81 points is really accurate, that would mean only ~55% of students submit SATs, which is lower than I would have guessed.</p>
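<p>(One way to see where a figure like ~55% comes from: the implied submission rate depends entirely on what the non-submitters are assumed to have scored. The snippet below is just a hypothetical back-of-envelope, using the 1430/1349 figures from this thread and an assumed non-submitter average of 1250.)</p>

```python
# Back-of-envelope sketch: solve p*submit_avg + (1-p)*nonsubmit_avg = overall_avg for p.
# 1430 and 1349 are the figures discussed above; the non-submitter average of 1250
# is purely an assumption for illustration.

def implied_submit_rate(submit_avg, overall_avg, nonsubmit_avg):
    """Fraction of enrollees who submitted, implied by a weighted overall average."""
    return (overall_avg - nonsubmit_avg) / (submit_avg - nonsubmit_avg)

p = implied_submit_rate(submit_avg=1430, overall_avg=1349, nonsubmit_avg=1250)
print(f"Implied share submitting: {p:.0%}")  # ~55%
```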

<p>The number of enrolled students who submitted test scores has increased dramatically over the past few years. For the class of 2010, 87% submitted SAT I scores. Just two years earlier (class of 2008), only 50% submitted scores.</p>

<p>gellino....the change in SAT reporting occurred between the Class of '08 and the Class of '09. The 2004 CDS, representing the incoming Class of '08 September enrollees ( <a href="http://www.middlebury.edu/NR/rdonlyres/42FAF252-1252-402F-85A4-9DAD72D92B57/0/04cds_c.pdf">http://www.middlebury.edu/NR/rdonlyres/42FAF252-1252-402F-85A4-9DAD72D92B57/0/04cds_c.pdf</a> ), reports a 25-75% SAT range of 1380-1500; the midpoint of those numbers is 1440, close to what you've reported. The next year's CDS ( <a href="http://www.middlebury.edu/NR/rdonlyres/F5D55D7F-AD70-45D3-8068-15E2E28D1B8E/0/CDS2005_2006.pdf">http://www.middlebury.edu/NR/rdonlyres/F5D55D7F-AD70-45D3-8068-15E2E28D1B8E/0/CDS2005_2006.pdf</a> ) reported a range of 1280-1475, with a calculated (by me) midpoint of 1378.</p>

<p>Scroll to Midd's fact book admissions history, which reports only averages, and the respective Class of '08 & '09 SAT averages (again, Sep enrollees only) were 1356 and 1349 [I can't explain the fairly large difference between 1378 and 1349 other than a skewed distribution]. The fact book notes that the averages are derived from all students who submitted scores, not just those submitted for admissions purposes...as arcadia mentioned. But the big difference, I think, between Midd & Bowdoin is that pre-Class-of-'09, Midd had this extra category of exclusion for their CDS SAT numbers (the numbers that get to US News): students who did not submit scores for admissions evaluation purposes, but eventually submitted them post-admission.</p>
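<p>(On that 1378-vs-1349 gap: the midpoint of a 25-75% range only matches the mean when the score distribution is roughly symmetric. A toy illustration with made-up scores, not Middlebury data, shows how a class with a long lower tail can have a quartile midpoint well above its mean.)</p>

```python
# Toy example (fabricated scores): a left-skewed distribution where the midpoint
# of the 25th-75th percentile range overstates the mean by roughly 30 points.
import statistics

scores = [1000, 1150, 1290, 1300, 1330, 1360, 1400, 1450, 1480, 1500]

q1, _, q3 = statistics.quantiles(scores, n=4)  # quartiles
midpoint = (q1 + q3) / 2
mean = statistics.mean(scores)

print(f"25-75% midpoint: {midpoint:.0f}")  # ~1356
print(f"mean:            {mean:.0f}")      # ~1326
```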

<p>So, bottom line, the big external reporting change started with the Class of '09....coincidentally, about the same time the new admissions dean started.</p>

<p>From the fact book table, here's a list by year (fall of freshman enrollment) of the % of Sep enrollees who submitted scores, and the SAT averages (M + V averages summed).</p>

<p>Year - % submitting - SAT avg
1996 - 66% - 1287
1997 - 66% - 1338
1998 - 73% - 1314
1999 - 73% - 1328
2000 - 73% - 1330
2001 - 72% - 1332
2002 - 74% - 1358
2003 - 80% - 1338
2004 - 77% - 1356
2005 - 84% - 1349
2006 - 82% - 1364 (W681)</p>

<p>For reference, 2006 above is Class of '10......[arcadia, I can't explain why your numbers vary from these fact book numbers...where did yours come from?]</p>

<p>To your question, gellino, on the disparity between avg scores before and after this change: given these numbers, one can only surmise that the students from the Class of '08 and prior who did not submit scores for admissions evaluation, but did eventually submit them to be included in these fact book averages, must have had SAT scores significantly below those of the students who submitted for admissions evaluation. If only half, say, of the scores were counted in the 1430/1440 average, and only the top half of scorers submitted them, then the bottom half would have an average around 1270. The latest 50% mid-range is 1280-1475, and those quartile values aren't too far off those theoretical averages....so this all kinda makes sense to me. Just happy, for the sake of clarity and uniform comparisons, that Midd has changed their SAT reporting practice....now if they'd just add in the Febs, I'd be happier!</p>
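<p>(The "around 1270" is just the weighted-average arithmetic run backwards; the 50/50 split between submitters and non-submitters is the assumption, and 1440 and 1356 are the figures quoted earlier in the thread.)</p>

```python
# Sketch of the back-of-envelope above. The 50/50 split is an assumption;
# 1440 (submitters' average) and 1356 (fact book overall average) are quoted earlier.

def other_group_avg(overall_avg, top_avg, top_share=0.5):
    """Average of the remaining group implied by a weighted overall average."""
    return (overall_avg - top_share * top_avg) / (1 - top_share)

bottom = other_group_avg(overall_avg=1356, top_avg=1440)
print(f"Implied average of non-submitters: {bottom:.0f}")  # ~1272, i.e. 'around 1270'
```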

<p>ps.....gellino, I'm on a one-man mission for Colgate to actually start posting their CDS's, as well as to report enrollee averages & ranges rather than those stats for accepted students only, a practice which may be innocent in intent, but is potentially misinterpreted by unsophisticated consumers, since US News has many of us thinking of SAT ranges as applying to enrollees only....Tufts & others do the same thing.</p>

<p>My numbers came from the CDS for fall 2004 and fall 2006.</p>

<p>Papa, how did your son fare in admissions this year? Where is he headed this fall?</p>

<p>arcadia, thanks for asking....admitted to Tufts, Colgate & several others....WL offers from Midd & 2 others.....after visiting Colgate in April & loving it, a school he hadn't given too much consideration before, he opted to decline getting on all the WLs & accept Colgate. Honestly, although these schools all have their differences, he would have fit fine at any of them, including Tufts, Colgate & Midd. He's very happy, & having Midd as his #1 choice a few months back seems like a distant memory. Lessons learned: seemingly passionate preferences for a 17-year-old have a way of morphing fairly easily, & real happiness can be found with many choices, not just the "one." One down, soon-to-be-11th-grade D is next in line!</p>

<p>I haven't looked at USNWR in a while to know what they report, but it seems that Princeton Review, collegedata and College Board all report enrolled SAT stats for Colgate. I'm not sure how they get the numbers.</p>

<p>My understanding is that Princeton R, US News & the others get their info from the CDS, which is originally entered by each school. The CDS initiative has listed 3 leading cooperative members, two of which are US News & College Board. From my understanding, the college also enters many of the same fields of survey info (including enrollment & SAT range) with the federal Dept of Education, & is required to do so under certain regulations & definitions if it receives Pell Grant money. One thing I can't quite figure out is how Princeton R & others get average SAT scores....only 25-75% ranges are reported on the CDS and by the feds, so either that range info is somehow transformed into averages (although this is not statistically proper in my book), or the secondary purveyors of the info are supplied that data by the schools outside the CDS and fed systems.</p>

<p>Timing-wise, US News is always out about a year from the current info....CDS' are usually published by January or so by the colleges, after the latest US News Best Colleges edition hits the stands, so, for instance, the 2007 edition (most recent) uses the CDS info from the incoming Fall 2005 class. The 2006 CDS is now posted on many schools' web sites, & that base info is more current than US News. Unlike Midd, Colgate does not post a CDS or a fact book, so we are always delayed a year to see what US News (or others) report for Colgate. IMO, Colgate's reporting of "accepted" scores doesn't help much, as they are usually 30+ points higher than what ends up being representative of the incoming class.</p>

<p>arcadia-- thanks, I had not thought about the CDS C8 % submitting section in regard to our discussion at hand...I think looking at those data compared to the factbook data might help show something....</p>

<p>Year - CDS reporting % - Factbook reporting %
2004 - 50 - 77
2005 - 78 - 84
2006 - 87 - 82</p>

<p>While 2005 & 2006 show some differences which I can't explain, the big difference is between the CDS & factbook for 2004, the last year Midd reported SATs for just the folks who turned them in for admissions review. The 27% differential, I bet, is a rough measure of the share of enrolled students who turned in SAT scores AFTER they were accepted, that is, they did not turn in SAT 1 scores for admissions review but did turn them in later.</p>

<p>The difference between accepted and enrolled SATs is certainly something to take into account, and Colgate should post both on their website. At Harvard, there is probably no difference. I know from what I've seen in the data for Colgate and Bucknell that the difference is ~40 points.</p>

<p>So Gellino, are you saying that Colgate's SAT scores, as reported by US News and World Report, are inflated? How many other schools do you know that do this?</p>

<p>maof4-- Colgate's SAT scores as reported by US News are not inflated. Those scores, like those reported for every other college, represent the mid-50% range of ENROLLED freshmen. They are what they are.</p>

<p>ENROLLED freshmen are a subset of ACCEPTED freshmen. Without exception in my experience, the academic credentials of ACCEPTED freshmen are greater than those of ENROLLED freshmen, as the high-flying accepted students tend to enroll elsewhere more often than the lesser-credentialed ones. As gellino noted, this phenomenon lessens as the selectivity of the school increases, i.e., the SAT scores for accepted & enrolling Harvard freshmen should be about even.</p>

<p>Colgate, Tufts & many other schools report a similar-in-format, but different-in-meaning, range on their admissions web sites, namely the mid-50% range of ACCEPTED students. ACCEPTED students' SAT scores tend to be higher than those of the students who eventually matriculate. Purely speaking, these accepted students' scores are not "inflated" either...they are what they are. My point was that I feel the use of accepted SATs can be deceptive to unsophisticated consumers, because most folks are used to reading US News and seeing enrolled SATs....apples & oranges though....schools that report accepted SATs look much more selective to readers who are used to looking at enrolled SATs.</p>

<p>Now, there may be some reason why schools have reported accepted scores...perhaps this gives a prospective applicant an idea about where they might fit in the accepted group, I don't know. Personally, I prefer seeing the SAT score ranges & averages for the enrolled freshmen. Plus, it disturbs me to imagine that a school might be manipulating this stuff to look better, as I suspect is the case for a few, but perhaps not all.</p>

<p>ps...relative to how other schools were reporting, I WOULD say that Midd's SAT scores were inflated in the Class of 2008 and prior US News & equivalent reports...for different reasons, though.</p>

<p>Thanks for the thoughtful response, Papa. Given that Colgate is very protective of their yield numbers--they accept a large proportion of their freshman class through the early decision program, etc.--it would make sense that the discrepancy between the scores of those who are admitted and decide to attend and those who pick other, more prestigious schools should be narrower than at schools that admit fewer applicants through ED, since those admitted through ED commit to matriculate.</p>

<p>Out of curiosity, how does Middlebury manipulate their SAT figures?</p>

<p>What I was saying is that on their website Colgate was only reporting accepted students' SATs, but that USNWR and other sources (Princeton Review, collegedata, Petersons) were reporting enrolled students' SATs.</p>

<p>Middlebury doesn't seem to manipulate their SATs anymore, but in the past (because of their SAT-optional policy) they would only report the SATs of those enrolled students who submitted them, and so effectively ignored the SATs of the bottom 40-50% of their class when reporting the enrolled SAT avg.</p>

<p>If I understand you correctly, gellino, Colgate only reports the scores of accepted students on their website, while other schools report the scores of enrolled students. Princeton Review, Fiske, US News, etc. report enrolled students' stats, so the only inconsistency is in what the various colleges decide to put on their websites. Am I right about this? This practice shouldn't really be a problem as long as the school is clear about what they are reporting. I suppose it is a good thing to have qualified applicants applying even if many of these students are using the school as a fallback position.</p>

<p>maof4...on the "manipulation" question, read posts 27 & 43. Bottom line, I believe, is that Midd, for whatever reason that I am certainly not privy to, counted only a subset of SAT scores for their official reporting purposes for the Class of '08 and prior years...those counted scores happened to be the higher scores, making their reported scores look exceptionally high relative to other schools. Other institutions counted a larger population of enrollees, making comparisons to Midd's numbers impractical.</p>

<p>I think the word "manipulate" might be a bit strong, as it implies intent of which I have no direct indication, nor have I heard of any credible conspiracy theories that convincingly prove manipulation. It is very conceivable that the way they did it was just the way they had always done it, because they only formally collected the scores used for admissions purposes, being a quasi-SAT-optional school perhaps. Likely the same legacy reporting issue explains their lack of including Febs in SAT reporting...just the way they've always done it, I guess....'course they do appear to count Febs in their yield numbers, which makes those numbers look slightly more favorable. As a consumer, I'm into consistency in how the numbers from various institutions are compiled, to enable apples-to-apples comparisons...IMHO Middlebury still has some work to do here to be best-of-class (& others like Colgate & Tufts have a longer way to go)....so I'll continue being a broken record. This critical discussion should not take away from the fact that Middlebury is a top school with high selectivity (low acceptance rate) and high appeal (high yield rate).</p>

<p>I don't think there's any ambiguity regarding Middlebury's reason for not counting SAT I scores for all students through the Class of '08. The former admissions director clearly thought that if SAT I scores weren't used to evaluate those candidates who chose not to submit them, then they shouldn't be lumped together with the SAT I scores of students who did submit them. Right or wrong, this policy did have a beneficial effect on the college's reported SAT I averages.</p>

<p>The new admissions director decided that all scores should be listed, even if the admissions office used other standardized tests (SAT IIs, ACT, IBs) to evaluate the applicant (per their wishes).</p>

<p>There was one benefit to applicants in doing things the old way--they had a better idea of what would be considered a good enough score to submit for evaluation. Take, for example, a student who scored a 1300 on the old SAT I. Should that student submit a score of 1300, or use another test to satisfy the testing requirement? If the average SAT I score of kids who choose to submit them for evaluation is 1410 (as is most likely the case considering the college's older SAT I profiles), then a 1300 might not seem like the best option. If the college publishes the SAT I scores of all enrolling students, regardless of whether they were used for admissions or not, and that average is closer to 1350, then a 1300 doesn't sound so bad. This can be misleading to the student who doesn't realize that the college doesn't consider the lower SAT scores of kids who choose other standardized test options. </p>

<p>Does that make sense?</p>

<p>arcadia, I certainly see the logic of your last paragraph. I just think it's ridiculous for Middlebury candidates to be forced into being so strategic about what to submit with their application. Either require the scores or don't use them. Also, reporting in the avg only those who submitted scores, while easier for gauging one's chances of getting into Midd, makes comparisons with schools like Colgate and Tufts more difficult.</p>

<p>Alas, I think both views have merit as there are two purposes here....(1) comparing academic credentials of enrollees at various colleges & (2) judging one's own credentials amongst prior accepted students at one school to assess "chances." </p>

<p>As the CDS and US News and their information fields appear here to stay, I have personally focused on the metric they use: enrolled students' SAT mid-50% range. That metric is fine, in my opinion, for the comparison-shopping purpose. The range for accepted students who submitted scores for admissions evaluation would also be nice to know, as long as consumers had the other comparable SAT data and knew the difference.....that last one is asking a lot, and I think there would always be some confusion if 2 SAT metrics were used. Asking for the world in Midd's case, it would be great if they added SAT averages or ranges breaking down Sep admits (submitted for evaluation) and enrollees, as well as Feb admits (submitted for evaluation) & enrollees....I'm sure they have the data, & there's plenty of room in their fact book!</p>

<p>maof4...check out these 2 dated threads on SAT 1 Writing scores (Class of '10) as they were being reported by colleges.....some discussion on the merits of applied-accepted-enrolled stat groups & some examples of various colleges' reports:
<a href="http://talk.collegeconfidential.com/showthread.php?t=235657">http://talk.collegeconfidential.com/showthread.php?t=235657</a>
<a href="http://talk.collegeconfidential.com/showthread.php?t=237689">http://talk.collegeconfidential.com/showthread.php?t=237689</a></p>

<p>Personally, I think the most forthright reporting starts with the enrolled stats, then includes accepted stats clearly qualified as such, and the best also report the applied group, whose scores tend to be lower than the other groups'....I don't get too much meaning out of the applied numbers, but it's just a more complete package, & with 3 sets of numbers it amplifies the fact that the enrolled & admitted groups are different. Amherst, USC, Providence & Princeton are the poster children for complete disclosure here, with Providence even breaking down all 3 sets by ED populations too.</p>

<p>Regarding the reporting of accepted student SAT scores vs. enrolled student scores, it's pretty clear that only enrolled student data should really count. "It doesn't matter who you invite to your party--it matters who shows up."</p>