Acceptance Rates and Yield Rates of top USNWR Nat’l Unis and LACs

<p>I wouldn't say they happen at "every school" but 45 percenter is right; it's certainly possible to have some changes in the numbers because enrollment is never static. </p>

<p>Some schools want the numbers to be ironclad, so they'll wait and not release any data until everything is official. Other schools may be willing to tolerate a little shift so they'll go ahead and post preliminaries rather than making people wait. I happen to work at the first type of school and we spend the first three weeks of school fielding requests for data (which we have to postpone)--I think in many cases people would be happy to have preliminaries even if they changed later.</p>

<p>It's true that numbers may differ due to dishonesty, but why would we assume it in the Penn example given? This kind of discrepancy involves enrollment, where officials know that the numbers are essentially checkable and will come up again and again. It doesn't make sense to cheat on numbers like those. One gains nothing from that. Reporting 20 more students in the class gets them what? </p>

<p>More likely it is a case of using preliminary numbers in one place and final numbers in another, and being willing to live with the discrepancy because the essential representation of the numbers (approximate selectivity, yield, and size of class) is unchanged.</p>

<p>Found it... my mistake, those admit numbers were just CA freshmen:
<a href="http://www.ucop.edu/news/factsheets/2007/fall_2007_admissions_table_c.pdf%5B/url%5D"&gt;http://www.ucop.edu/news/factsheets/2007/fall_2007_admissions_table_c.pdf&lt;/a&gt;&lt;/p>

<p>So, to correct the figures:
Berkeley: 22.9% (10,087 admitted out of 44,077 applicants)
UCLA: 23.3% (11,819 admitted out of 50,667 applicants)</p>

<p>Both campuses' admit rates are around 23%; note that this data does not include the couple hundred admits deferred to the spring.</p>
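
<p>As a quick sanity check on the arithmetic, here is a minimal Python sketch (the variable names are mine, not from the UC fact sheet) that reproduces those two admit rates from the applicant and admit counts above:</p>

<pre><code>
# Minimal sketch: reproduce the Berkeley and UCLA admit rates quoted above
# (admit rate = admits / applicants). Figures are the CA-freshman totals
# from the linked fact sheet; the variable names are my own.
campuses = {
    "Berkeley": (44_077, 10_087),   # (applicants, admits)
    "UCLA":     (50_667, 11_819),
}

for name, (applicants, admits) in campuses.items():
    print(f"{name}: {admits / applicants:.1%} admit rate")

# Output:
# Berkeley: 22.9% admit rate
# UCLA: 23.3% admit rate
</code></pre>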

<p>UVA likely has >50% yield because it is the only "flagship" public campus in the state.</p>

<p>California has Berkeley and UCLA cannibalizing each other for students. If UCLA didn't have a similar reputation to Berkeley (arguably), the yield would be much higher for Berkeley.</p>

<p>hoedown,
Most top colleges will publish their CDS, but I am trying to understand why some don't and how to interpret their presentations. I have heard about and seen data presentations on various schools' websites that sometimes differ in significant ways from what is actually happening at a university. For example, sometimes schools will present their ADMITTED student data for things like GPA, class rank, and standardized test scores rather than their ENROLLED student data. Or some schools will present data for their main undergraduate colleges, but exclude groups of students in other, separate undergraduate colleges that might have lower statistical profiles. Some schools will even exclude non-US students from their calculation of the institutional averages. </p>

<p>This has prompted a number of questions, e.g.,
1. What obligation (regulatory?) does the college have to accurately and completely report this information, and to whom does it get reported?<br>
2. After the CDS, is the IPEDS data the next best thing for getting an accurate picture of a school?<br>
3. Also, what disclosures are the colleges obliged to make to data gatherers like collegeboard.com and USNWR about the completeness of their responses to various inquiries? </p>

<p>If you could provide any clarity on this, it would be appreciated. Thanks.</p>

<p>I don't think I'll provide much clarity, unfortunately.</p>

<p>1) Requests come from various places. It's hard to generalize. I believe colleges are obligated to report numbers to the federal government if their students receive federal aid, and IME you don't mess around with federal reporting. I don't know what the penalties are for inaccurate data, but I would assume it could include loss of that aid. Public colleges may also be obligated to report numbers to their state governments, but what those numbers are varies by state. The consequences for providing inaccurate data to the state can be politically unpleasant (at least here in Michigan) and that can have fiscal impacts that last decades. I don't think that colleges have a "legal" obligation to report accurate numbers to Barron's or USNews or the like--answering at all is a courtesy. But reporting false numbers is unethical and unprofessional--and it also can come back to bite the institution. </p>

<p>2) I don't know; I think it depends on what "picture" you are looking for. I'm not sure I'd even agree that the CDS is the best source. The CDS is a one-size-fits-all document and sometimes that fit is pretty poor. It isn't easy to answer some questions, and sometimes the answer we give (to fit the "standardized" document) isn't as complete and helpful as prospective students probably want or need.</p>

<p>3) I have no idea what "disclosures" are required. If I were those organizations, I'd do some spot-checking against other federally-reported data to see if answers are in line, but that's just what I would do. There are a lot of reasons why colleges would want to play nice with the College Board.</p>

<p>As for your first paragraph, admitted student data may be appealing because colleges have it much earlier. Such reports also give students a truly accurate idea of what it takes to get in. I guess they could make the same argument for why they'd eliminate students/programs for which the admissions requirements are very different (such as for a dance department, or for students whose specific subject tests are the main criteria). But I would guess the answers for why they do things one way or another vary across institutions.</p>

<p>It could also include going to prison and/or paying a criminal fine. Knowingly making a materially false statement or representation to a federal agency or official about a matter within the jurisdiction of that agency or official is a federal crime punishable by up to 5 years' imprisonment and/or a fine:</p>

<p>Title 18, United States Code, Section 1001</p>

<p>hawkette, could you post the latest updated University list ranked by yield? </p>

<p>thanks</p>

<p>hoedown,
Thanks for your response. As you know, the enrolled data is a much more accurate reflection of the true quality of the undergraduate student body, and the statistical data for enrolled students is almost always at a discount to the admitted data. Any school can admit lots of qualified candidates and make itself look more selective than it actually is--the trick is getting them to enroll. </p>

<p>As for the CDS and the IPEDS data, I like them because they make possible better comparisons of various colleges. I have found them to be the most accurate and reliable sources, and they can usually help determine if any student groups have been left out of the USNWR et al reporting. Some institutions will report all of the data on all of their students and disclose it in a CDS; others will not include all undergraduates, and frequently these universities don't provide a CDS and thus can hide these students from their statistical profile. As I have seen it, it is harder to play games with the numbers in a CDS and, if that is taking place, one can usually figure it out. </p>

<p>Your dance example is a good one, and surely there are other departments at some universities that are less dependent on the applicant's academic statistical data. It might be a leap to judge universally the motives that institutions have in excluding these students from their reporting, but I think it is a safe guess that the exclusions probably produce a more attractive student statistical profile than would be the case if all of the students were included.</p>

<p>I'm with Hawkette and xiggi: if a college refuses to publish a CDS, then it is (highly) likely to be hiding something. For example, only recently has USC started publishing numbers on its site in a similar fashion to Penn. And yet USC still does not publish its CDS. Prior to its current data posting, USC's reputation was that it excluded special admits such as recruited athletes, low-stat development admits, etc., from its published data. And, without a published CDS, we can't be sure if they still do (or ever did).</p>

<p>
[quote]
For example, only recently has USC started publishing numbers on its site in a similar fashion to Penn. And yet USC still does not publish its CDS.

[/quote]
</p>

<p>Do we have a new nickname for USC? How about University of Suspect Criteria?</p>

<p>I answered with the assumption that we understand that enrolled stats and admit stats can differ, and why. </p>

<p>
[quote]
Any school can admit lots of qualified candidates and make itself look more selective than it actually is--the trick is getting them to enroll.

[/quote]
</p>

<p>I don't agree, actually--not all schools CAN admit lots of qualified candidates--because not all of them have a deep applicant pool. </p>

<p>It's the intentionality you imply here that doesn't ring true to me.</p>

<p>The premise that institutions admit people with the aim of "looking more selective than they are" seems off. I would bet that those schools admit top candidates because they'd really like to have them enroll. It's not a game to improve their "admitted" scores. </p>

<p>Also, profiles of enrolled students appear all over, including USNWR, so it's not like institutions can plan on hiding that information. </p>

<p>Sure, schools do benefit when a less-careful reader confuses the admitted profile with the enrolled profile. However, I don't think they make admit decisions with an eye to capitalizing on this. </p>

<p>I do agree that yield is an important piece, and schools who somehow attract top candidates to apply but cannot get them to enroll are going to have admitted/enrolled stats that really differ. I am not sure which schools really fall into that category, or how many make a regular practice of reporting the admitted profile.</p>

<p>Some people on CC gripe when schools allegedly reject good candidates whom the school assumes applied as a safety. They are chided for trying to make themselves look more selective than they are. Now, in this thread, we have schools chided for admitting those top students....because they are allegedly trying to make themselves look more selective than they are. We are a cynical group on CC sometimes.</p>

<p>I'm not chiding the schools for admitting top students. They should admit them. But I am saying that those on CC who post about admitted students as being the most accurate representation of the quality of a school's student body are probably overstating it: the admitted profile (at least for most of the highly competitive universe) is almost always stronger than the enrolled profile (yield data also supports this). </p>
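
<p>To make that concrete, here is a small hypothetical Python sketch (the score bands and yields are invented purely for illustration, not taken from any school's data) showing how the enrolled average ends up below the admitted average when the strongest admits enroll at the lowest rates:</p>

<pre><code>
# Hypothetical illustration only -- all numbers are invented.
# If higher-scoring admits enroll at lower rates, the enrolled average
# falls below the admitted average even with perfectly honest reporting.
bands = [
    # (midpoint SAT, number admitted, yield within the band)
    (1500, 2_000, 0.30),
    (1350, 3_000, 0.45),
    (1200, 2_000, 0.60),
]

admitted = sum(n for _, n, _ in bands)
enrolled = sum(n * y for _, n, y in bands)

admitted_avg = sum(s * n for s, n, _ in bands) / admitted
enrolled_avg = sum(s * n * y for s, n, y in bands) / enrolled

print(f"admitted average: {admitted_avg:.0f}")   # 1350
print(f"enrolled average: {enrolled_avg:.0f}")   # 1321
</code></pre>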

<p>I think we probably see this in a similar way--schools should accept who they want based on the schools' criteria for selection, and the numbers will end up being what they end up being. I don't think (hope!!) that schools falsely admit students to boost their admitted student profile. Either way, it would be nice to see the schools being more forthcoming in their disclosures (public CDS) and also not promoting their admitted student data over their enrolled student data.</p>

<p>bluebayou and UCBChemEGrad,
I know that you are not big believers in the rise of USC over the recent past. I am with you in wishing that they published a CDS. By not doing so, they only invite suspicion, as folks like you with longtime knowledge of the school will likely scoff at the recent surge in their statistical data. If they are not including all of their undergraduates and are excluding subgroups (athletes, students in less academically-oriented fields, etc.), then this absolutely needs to be disclosed. I like that USC has come up in the world and is now a solid choice alongside UCB and UCLA, but I don't like it if its numbers are not pure. If that is the case, then its reputation will never equal that of its competition. And it shouldn't.</p>

<p>hawkette:</p>

<p>actually, I am a believer in USC's meteoric rise; chasing NMFs works well for them. And, they have several top-ranked undergrad programs: film (duh), engineering, biz, among others. Due to the Trojan family alone (and the alum connections), I think it's a better bet than UCLA for many kids. But, I would be even more impressed if they came clean with their data. Until then, I can only assume that they have something to hide. :rolleyes:</p>

<p>Huh? Not sure I understand. USNews gets their data from a CDS, or from the college directly, if CDS not completed....It would be easy for any school to provide USNews and CB their "officially" cleansed data.</p>

<p>The practice that I sometimes find frustrating is when the data provided by a school's admissions folks runs counter to the data found in the CDS. The University of Virginia is one such case. They do a superb job of providing lots of institutional data, lots of historical data, and even some great breakdowns of in-state and out-of-state admissions:</p>

<p>IAS Historical Data: First-Time First-Year Applicants by Residency</p>

<p>IS: 7,090 applications, 3,349 acceptances (47% admit rate), 2,244 enrollees (67% yield)
OOS: 10,708 applications, 2,924 acceptances (27% admit rate), 1,004 enrollees (34% yield)
Total: 17,798 applications, 6,273 acceptances (35% admit rate), 3,248 enrollees (52% yield)</p>
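
<p>For anyone who wants to verify those percentages, here is a minimal Python sketch (the labels are mine, not from the UVa site) that recomputes the admit rate as acceptances/applications and the yield as enrollees/acceptances for each pool:</p>

<pre><code>
# Minimal sketch: recompute UVa's admit rates and yields from the
# in-state (IS) and out-of-state (OOS) counts quoted above.
pools = {
    "In-state":     (7_090, 3_349, 2_244),    # (applications, acceptances, enrollees)
    "Out-of-state": (10_708, 2_924, 1_004),
}
# The combined pool is just the column-wise sum of the two residency pools.
pools["Total"] = tuple(sum(col) for col in zip(*pools.values()))

for name, (apps, accepts, enrolls) in pools.items():
    print(f"{name}: admit {accepts / apps:.0%}, yield {enrolls / accepts:.0%}")

# Output:
# In-state: admit 47%, yield 67%
# Out-of-state: admit 27%, yield 34%
# Total: admit 35%, yield 52%
</code></pre>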

<p>But their admissions folks post their admitted student data in a way that seems to indicate it is the enrolled student body. Contrast the presentation in the CDS vs the admissions presentation. </p>

<p>CDS Enrolled Students for 2007-08: 1200-1420</p>

<p>UVa CDS - C. First-time, First-year Admission</p>

<p>Admissions presentation: 1280-1490</p>

<p>U.Va. Office of Admission Profile</p>

<p>Hawkette, </p>

<p>I personally prefer USC to UCLA. There is absolutely nothing original about UCLA.</p>

<p>
[quote]
USNews gets their data from a CDS, or from the college directly, if CDS not completed.

[/quote]
</p>

<p>This doesn't challenge your point (if anything it lends support), but I think the clarification is important: Even if the CDS is completed, USNews does not get the data from the CDS! That survey is a monster and they send it to the school for the school to fill out. USNews doesn't do any of the work for any school--unless the school refuses to respond. </p>

<p>
[quote]
It would be easy for any school to provide USNews and CB their "officially" cleansed data.

[/quote]
</p>

<p>Yes, but they'd have to repeat the numbers over and over: in IPEDS, in any enrollment reporting on websites, in college guides, etc. Individual units would also have to be in on it, so that no one reports those excluded students. I think it's risky. Now, something like SAT averages or GPAs--that's more plausible. That information is centrally maintained and considered confidential, so the number of people reporting on it (and the places they are reporting it) is more limited. You could whittle down the cohort on which you figured the average and not worry about the excluded people being included somewhere else and tipping off a keen-eyed observer. I'm not saying that DOES happen; I'm saying that's where a school might be able to get away with it with less risk. But enrollment is different.</p>

<p>Admittedly, my views may be influenced by the fact that I'm at such a decentralized place. Maybe at a different sort of university you could try to pull something.</p>

<p>hawkette, that example is problematic, to be sure. The numbers don't match, and as you pointed out, there is no text to suggest that those are admitted students, not enrolling students.</p>

<p>Something similar happened here last year but as soon as it was pointed out, the admissions office added a clarification to the misleading part of the website. So in this case, perhaps UVa (DeanJ?) needs to be notified. It could be the case that the numbers/definitions sticklers are in a different office, and the person designing the website is more of a marketing type, and never ran the page by the sticklers.</p>

<p>
[quote]
Now, something like SAT averages or GPAs--that's more plausible.

[/quote]
</p>

<p>But, THAT is the data that drives rankings! And, in particular, the SAT is something upon which many cc posters are fixated (including one on this thread).... Annually, USC trumpets that their SAT scores are higher than Berkeley's -- maybe true, but we only have their PR department as a data source.</p>

<p>I don't understand your clarification: </p>

<p>
[quote]
USNews does not get the data from the CDS!

[/quote]
</p>

<p>Why would USNews not use the CDS (when they say that they do)?</p>

<p>hoedown,
The only time that I really have a problem with a college and its information is when they DON'T make their CDS available to the public. U Virginia or others with admissions departments that want to provide student statistical data can do whatever they want and create different numbers (intentionally or not) that present the college in the best light. But the CDS works as a fact-checker, and I believe that the marketplace will smoke out the differences and corrective action will be taken (e.g., I hope that U Virginia will adjust the numbers on its admissions site to correspond with the CDS numbers). But the problem, of course, is when an admissions department puts out data about GPA or SAT or even enrollment numbers and there is no CDS to substantiate it. If even one error is discovered in an admissions department's presentation of the data, then the credibility of the admissions folks is (perhaps rightfully) undermined. No one benefits from this obfuscation.</p>

<p>Re groups of students, I think you said earlier that the CDS and/or IPEDS guidelines do not allow a college to pick and choose which undergraduate students should be counted in the data, especially for things like GPA and standardized test scores. All are counted, even student groups that might be in less academic fields, e.g., dance students or music students. Is that right? I'm trying to understand how much discretion colleges have in their reporting for a CDS or to IPEDS. I'm guessing that colleges have greater leeway in what they report to USNWR, collegeboard et al, and near-full discretion in what they publish on their admissions websites. </p>

<p>bluebayou,
No one would like to see Southern Cal succeed in developing a strong national reputation with a superb student body more than I would. Having more strong college options in different geographies is good for college applicants and for all of higher education. Nonetheless, I share your concerns about USC's standardized test scores. That school has seen dramatic improvements over the past decade in its student profile, but to be fully accepted (and believed) by some in the national audience, it would be nice to see the supporting data and see whether the reality matches the claims of its admissions/marketing folks.</p>