<p>Some schools report AP weighted GPA on the CDS, and acknowledge this is the reason for such high numbers. The instructions say only to use a 4.0 scale, commonly (not universally) interpreted to mean A = 4 and “AP” A = 5.</p>
<p>Good enough, tk21769.</p>
<p>
</p>
<p>This would be a form of weighted GPA, absolutely. But it doesn’t appear to be a fully weighted GPA; rather, it’s the UC GPA, which caps the number of weighted classes that can be included in one’s 10-11 GPA calculation. </p>
<p>I agree that reporting a form of weighted GPA for a CDS field that clearly states unweighted is like hammering a square peg into a round hole. UCLA is an offender when it reports a 10-11, a-g, uncapped weighted GPA of 4.25, when the unweighted 10-11, a-g GPA would be 3.81. But unweighted GPA isn’t a factor within USN’s rankings, so essentially no harm, no foul.</p>
<p>
</p>
<p>I was just specifically concentrating on the selectivity index within USN’s rankings. The other important factor is obviously SAT medians, with a small importance to acceptance rate. (This is why those who place importance in c’s and u’s generating more apps to improve it’s USN selectivity rank are off base.) </p>
<p>My point is that when USN ranks according to selectivity, it couldn’t and wouldn’t have the order correct from top down. It would indeed give the false impression that, for instance, u “x” is harder to gain admission to than u “y,” based on wildly different statistical reporting of each school’s frosh class, when this wouldn’t necessarily be the case. </p>
<p>Alexandre obviously addressed some of the other stuff within the data manipulation of supposed objective statistics.</p>
<p>“for a CDS that clearly states unweighted”</p>
<p>Do schools routinely clearly state this? The CDS documentation says:
<a href="http://www.commondataset.org/docs/2011-2012/CDS_2011-2012.htm">Common Data Set 2011-2012</a></p>
<p>double correction for the same word,</p>
<p>‘…those who place importance in c’s and u’s generating more apps to improve it’s [sic] [becomes ‘its’ becomes ‘their’] USN selectivity rank…’</p>
<p>Back to my original reply to yours: </p>
<p>All I’m trying to show is that statistical data with many producers of supposedly similar data sets cannot be reliable, especially when something like ascending the rankings, ‘improving one’s lot,’ is involved. </p>
<p>For the data to be more accurate, there needs to be a centralized body that calculates all of it consistently, or someone with the capability to audit it. Neither is obviously going to happen. So all c’s and u’s run free in reporting this data, which is wildly divergent.</p>
<p>Okay, vonlost, so when a u reports C12, it adds the extra notation/verbiage indicating unweighted. (I’ve been having a hard time replying in a timely way since you showed up. ;)) I guess it’s up to the u to specify which when reporting. I thought other CDS forms I’ve seen specifically stated “unweighted.” So UCLA isn’t an offender.</p>
<p>The CDS, section C7, invites colleges to specify the relative importance of admission factors. One factor is “Rigor of secondary school record”. “Academic GPA” is among the others. If GPA were on a 5.0 scale designed to capture course rigor by allowing 5s for AP courses, then there would not be much point in calling out “Rigor” as an admission factor. It would be implicit in the GPA, no? But rigor is called out; the instructions for section C11 explicitly indicate “using 4.0 scale”. So it seems fairly clear that the CDS intent is for colleges to report GPA on a 4.0 scale that is calculated without extra credit (5s) for AP courses.</p>
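<p>To make that C11 reading concrete, here is a minimal sketch of an unweighted GPA computed the way the instructions seem to intend: an AP A earns a 4.0, not a 5.0. The transcript and course names are hypothetical; nothing here comes from the CDS itself.</p>

```python
# Hypothetical transcript entries: (course, letter grade, is_ap).
# Under the CDS "using 4.0 scale" instruction, the is_ap flag is ignored.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def cds_unweighted_gpa(transcript):
    """Simple average on a strict 4.0 scale; no AP/honors bonus."""
    return sum(GRADE_POINTS[grade] for _, grade, _ in transcript) / len(transcript)

courses = [("AP Calculus", "A", True), ("English", "A", False), ("Chemistry", "B", False)]
print(round(cds_unweighted_gpa(courses), 2))  # 3.67 (the AP A still counts as 4.0)
```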
<p>Evidently, that intent is not clear enough to some schools.
Here are the GPAs reported by some of the USNWR top 40:</p>
<p>4.25 UCLA (5.0 scale?)
4.0 UCSD (rounded up from 3.96?)</p>
<p>3.89 Princeton
3.84 Berkeley
3.71 JHU</p>
<p>Not Reported: Amherst Williams Pomona Dartmouth Brown
(some of these add a comment such as, “POMONA DOES NOT RECALCULATE GPA TO A
COMMON SCALE”)</p>
<p>I assume it is not the case that no student at UCLA or UCSD ever received less than an A in HS … but, if they were deliberately spiking their stats, I would think they’d report more plausible numbers (like the 3.96 UCSD reports on its own site). I fault the CDS for not being even clearer in its instructions. It should add statements such as, “Report GPA to two decimal places” and “Report AP course grades on the same 4.0 scale as others.”</p>
<p>
Huh, this is a really strong accusation. Do you have any proof this is occurring at the best private schools and that the seminars are just fillers with no value? In my experience as a recent alum of a well-known private school, seminars are often the best-taught classes, with a lot of student-professor interaction and plenty of reflection on the coursework before and after each session. Class sizes matter: being in a small class forces you to engage more with the material, because it becomes clearly apparent who has done the readings and who has skimped in these small, intimate environments. When the professor is excited to teach the course and is in a classroom with 15 or so like-minded students who signed up because they are actually interested in the content, as opposed to their mandatory lecture courses designed to fulfill requirements, what results is an academically charged Socratic environment that brings out the best in both the instructor and the pupil.</p>
<p>If USNWR is incentivizing private schools to lower class sizes to improve their institutional standing in the annual rankings, then I say hats off to Robert Morse and company. The ends justify the means and whatever the intention, small seminars benefit all involved.</p>
<p>My alma mater had no classes that I can think of that even remotely resemble Berkeley’s infamous “Physics for Poets” or “Physics for Future Presidents” courses.</p>
<p>
Superscoring has been proven to have little effect on SAT/ACT medians, since scores simply don’t tend to fluctuate that much. Most students don’t gain more than 20 or so points no matter how many times they retake. Also, public universities like UCLA may not superscore for admissions purposes, but do you have proof that they don’t report the best subscore for a student for the SAT/ACT for CDS or USNWR reporting purposes? Just because one standard applies for admissions purposes, that doesn’t mean that the same standard applies to data reporting. We all know that universities want to be viewed in the best possible light.</p>
<p>Where has that been proven? Adding just 15 points to each section bumps the total score up quite a bit. A 2040 looks much better than a 1995.</p>
<p>tk21769, #46</p>
<p>
</p>
<p>I’m not sure why this has importance to you. The u’s are just placing checkmarks by the things they consider important and filling out the form. Is it important that the student try to learn in high school? Kinda … Are test scores important? To just about all … obviously, yes. I’m not sure what your point is.</p>
<p>
</p>
<p>I would think that C12 would follow from C11 with respect to the unweighted scale. I’m not sure where vonlost quoted from in his/her snippet. So generally, I would agree with what you’re saying. Although, again, C11 and C12 have no play in USN’s rankings.</p>
<p>
</p>
<p>Again, I understand what you’re saying. You’re only confirming that there’s too wide a play in how a u or c reports its CDS, even if it weren’t trying to game the USN rankings.</p>
<p>The 5.0 scale is a misnomer, btw. UCLA’s 4.25 is not out of a possible five points, because an all-student mean of 5.0 would require all AP courses, with all A’s, for all matriculated students in 10-11th grades, a-g, weighted, uncapped. Some high schools offer as many as 30-32 AP courses, and a student at one of these has on occasion approached a 5.0 final GPA, and even occasionally reflected this GPA in his/her grades (with some bonus A+ grading within a 4-point system, obviously). But there are high schools that offer only a handful of APs, and the best possible weighted GPA at one of these can at best approach 4.2 or so. (In which case, a precocious student at one of these high schools should consider taking courses at the local community college, and maybe even forgo his/her remaining years at the HS by GEDing, etc.)</p>
<p>UCSD’s 3.96 is the capped weighted GPA, a-g, 10-11th grades … the UC GPA. UCLA’s is uncapped, a-g, 10-11th. The various GPAs break down as follows for UCLA, Cal, and UCSD:</p>
<p>UCLA:</p>
<p>Unweighted GPA: 3.81 (10-11th, a-g)</p>
<p>UC GPA: ~ 4.12 (10-11th, a-g, capped at 8 APs, max possible ~ 4.40)</p>
<p>Uncapped Weighted GPA: 4.25</p>
<p>Final 10-12 GPA: ~ 4.40, maybe 4.50 (UCLA will monitor a student’s senior course offerings to make sure rigor is maintained by taking a high proportion of the HS’s APs, this also being in the student’s best interest for obtaining as much AP college credit as possible at the U.)</p>
<p>Cal:</p>
<p>Unweighted GPA: 3.84 </p>
<p>UC GPA: ~ 4.15 </p>
<p>Uncapped Weighted GPA: ~ 4.30</p>
<p>Final 10-12 GPA: ~ 4.45, maybe 4.55</p>
<p>UCSD:</p>
<p>Unweighted GPA: ~ 3.64 </p>
<p>UC GPA: 3.96 </p>
<p>Uncapped Weighted GPA: ~ 4.08</p>
<p>Final 10-12 GPA: ~ 4.20, maybe 4.30</p>
<p>I’m not sure if the above adequately illustrates the different GPAs that UC involves. Admission to Cal and UCLA is based on unweighted GPA, the first listed, along with uncapped weighted GPA, the third, with both schools also looking at specific course grades. UCSD might involve the UC GPA, or it could do similarly to Cal and UCLA. For UCLA and Cal, the UC GPA is mostly just a marker for minimum qualification. Final GPA is important to both because both want to make sure that senioritis hasn’t crept into the students’ mindsets.</p>
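<p>The three GPA variants above can be sketched as follows. This is a toy model, not UC’s actual formula: the 8-course bonus cap follows the “capped at 8 APs” figure mentioned earlier, and the transcript is hypothetical.</p>

```python
# Each transcript entry is (letter grade, is_weighted) for an a-g course.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def uc_style_gpas(transcript, cap=8):
    """Return (unweighted, capped weighted, uncapped weighted) GPAs."""
    points = sum(GRADE_POINTS[grade] for grade, _ in transcript)
    n = len(transcript)
    n_weighted = sum(1 for _, weighted in transcript if weighted)
    unweighted = points / n
    capped = (points + min(n_weighted, cap)) / n    # UC-style bonus cap
    uncapped = (points + n_weighted) / n            # +1 per weighted course
    return unweighted, capped, uncapped

# Straight-A student with 10 weighted courses out of 12:
uw, capped, uncapped = uc_style_gpas([("A", True)] * 10 + [("A", False)] * 2)
print(uw, round(capped, 2), round(uncapped, 2))  # 4.0 4.67 4.83
```

<p>The gap between the capped 4.67 and the uncapped 4.83 here is the same kind of gap as between UCSD’s reported 3.96 and a figure like UCLA’s 4.25.</p>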
<p>I might want to question Princeton’s reported unweighted GPA of 3.89, because it is a legacy-intensive U. Might they be leaving a material subset out of the equation? Do they calculate unweighted GPA differently than the examples I gave above?</p>
<p>
</p>
<p>This is what I meant by some leaving the CDS data point blank.</p>
<p>
</p>
<p>I’m not sure why UCLA feels compelled to probably mis-answer the request. But its unweighted GPA is ~ 3.81, as I stated before. And it’s obviously not meant to misdirect USN’s selectivity index because, again, GPA has no play in it.</p>
<p>A better question is why UCLA reports a 97% top-10% statistic, which has significant weight in USN’s rankings. The actual figure at UCLA is ~ 80%, which would mean it is overstated by ~ 17 points. But 80% is extremely high among all u’s and c’s, and there are some that misreport this stat by ~ 30 points or more. </p>
<p>I don’t think there are many u’s or c’s that have 90% top-10%, except for maybe Cal Tech (a non-legacy-intensive, science-oriented U, etc.).</p>
<p>bustazak #47:</p>
<p>
</p>
<p>Regarding the bold:</p>
<p>Because this would take another calculation by the university to report a superscored score, as scores are line-itemed by date, with higher scores bubbled to the top, and the lower ones possibly blocked by the students. </p>
<p>Also, UCLA counts all SATs and ACTs reported, the combination of the two covering 125-139% of the total frosh class. UCLA is actually trying to downplay its statistical data to cast a wider net in its applicant pool, i.e., to encourage those from poorer backgrounds to apply who may not otherwise because of the daunting nature of higher reported stats. Once this is accomplished, UCLA will holistically take a decent proportion of those with lower scores and even grades and try to come closer to its ideal of “diversity.” This is usually a public’s main concern, not ascending the USN rankings. Btw, I don’t agree with the extent to which UCLA bypasses 3.8/4.7/2100+ students for this ideal.</p>
<p>So, no, your last sentence is not completely right. Those private u’s in the 15-25 USN range might have this motivation, absolutely, to try to improve their USN lots … slots.</p>
<p>Considering that not all students re-take the test, that some scores go down, and that the scores only count for 7.5% of the USNWR rankings, I doubt that superscoring affects the rankings much at all. That’s not to say it does not matter to individual students appraising their chances at individual schools.</p>
<p>tk, according to a study I saw, superscoring adds 20-30 points per section. That’s not significant to us and nobody in the real world would pay attention to such an insignificant margin, but to a high school kid, a range of 1350-1550 is significantly better than a range of 1290-1490.</p>
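<p>The difference between a best-single-sitting score and a superscore is easy to illustrate. The sittings below are hypothetical, on the old 2400-point scale being discussed:</p>

```python
# Each sitting is a (reading, math, writing) tuple of section scores.
def best_sitting(sittings):
    """Highest total achieved in any single sitting."""
    return max(sum(sitting) for sitting in sittings)

def superscore(sittings):
    """Best score in each section across all sittings, summed."""
    return sum(max(section) for section in zip(*sittings))

sittings = [(650, 700, 640), (630, 730, 680)]
print(best_sitting(sittings))  # 2040 (the second sitting)
print(superscore(sittings))    # 2060 = 650 + 730 + 680
```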
<p>Secondly, if superscoring were the only criterion where universities could manipulate data, the 7.5% would indeed not be that significant. But there are several other factors that come into play, and when you gain 2%-4% here and there, eventually they add up to considerably more. Let us not forget that in the USNWR formula, a university can move 20 or so spots with the addition or subtraction of 10-15 points.</p>
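<p>That compounding point can be shown with a toy weighted composite. The weights below are made up for illustration and are not USNWR’s actual formula:</p>

```python
# Hypothetical criterion weights (sum to 1.0), not USNWR's real ones.
WEIGHTS = {"selectivity": 0.15, "faculty": 0.20, "retention": 0.20,
           "reputation": 0.25, "financial": 0.10, "alumni": 0.10}

def composite(subscores):
    """Weighted sum of 0-100 subscores."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)

base = {k: 70.0 for k in WEIGHTS}
boosted = {k: v + 3.0 for k, v in base.items()}  # small gain on every criterion
print(round(composite(base), 2), round(composite(boosted), 2))  # 70.0 73.0
```

<p>A few points gained on each individual criterion move the composite by the same few points, which in a tightly bunched ranking can mean many spots.</p>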
<p>I have said it before and I will say it again: since USNWR is clearly not serious about auditing data for consistency and accuracy, there should be a ranking for private universities, a ranking for public universities, and a ranking for LACs. Mixing private and public universities the way USNWR does makes no sense.</p>
<p>
The USNWR has some flaws for sure but consistency isn’t one of them. Harvard, Yale, Princeton, Stanford, MIT, Caltech and Duke have been in the top 10 ever since the ranking adopted its current methodology in the late 1980s.</p>
<p>Here’s what’s been constant since 1990…
- Harvard, Yale and Princeton have always been in the top 5.
- Stanford, MIT, Caltech and Duke have always been in the top 10.
- The rest of the Ivies, Northwestern and The University of Chicago have always been in the top 15 except for Penn in '94 and Northwestern in '90 and '91.
- CMU, Emory, Rice, Vanderbilt, Berkeley, Georgetown, Notre Dame and UVA have consistently been in the top 25.</p>
<p>The only measurable changing trends since 1990…
- USC has gained almost a whopping 20 spots since its ranking of 44 in '96; it was not ranked prior to that.
- Penn and Wash U have risen about 10 spots over the course of the past two decades, which has enabled Penn to be a perennial top 10 school and Wash U to be a perennial top 15 school in the modern era.</p>
<p>- Michigan is perhaps the only university to lose its relative standing from two decades back, moving from #17 in 1990 to #29 in 2010 and #28 in 2011.</p>
<p>In conclusion, the academic powers of the past are still the academic powers of the present, and USNWR reflects this reality. The aberrations like USC and Michigan can be explained: the former has tremendously improved as an institution, recruiting much stronger faculty, enrolling far more qualified students, and improving its image significantly over the past two decades, justifying its ranking change; the latter has chosen to continuously increase its class sizes, which has dragged down its selectivity and its improvement in SAT medians relative to its peers, and as a result has affected its image among high school counselors (the difference between the PA score and the counselor rating is big in Michigan’s case).</p>
<p><a href="http://web.archive.org/web/20070908192041/http://chronicle.com/stats/usnews/index.php?category=Universities&orgs=&sort=2007">http://web.archive.org/web/20070908192041/http://chronicle.com/stats/usnews/index.php?category=Universities&orgs=&sort=2007</a>
Here is the data if you are all interested.</p>
<p>When researching a college, it might help to study both to find the discrepancies in the rankings and why they exist.
Maybe less popular institutions with excellent value (not $) will pan out.</p>
<p>
</p>
<p>What’s fairly important to me (to the extent any of this is important) is that the CDS issue clear instructions and the schools follow them consistently. You’re right, GPA is not a USNWR ranking criterion, but we’ve already taken the discussion down the rabbit hole of questioning whether CDS statistics are reliable (in the context of questioning whether most of the UNWR criteria are “objective”). Regardless of whether USNWR uses a particular CDS number, I’d like to see all the numbers reported consistently (since they come into play in college comparisons, whether rankings are under discussion or not.) GPA, to my simple mind, should be a simple thing to calculate. The top grade in any course should be an A=4 (whether it’s in basket weaving or AP rocket science); the maximum GPA should be 4.0. </p>
<p>As for legacy admissions, that’s another controversy. Yes, Princeton admits a lot of legacies, at ~4x or more the overall admit rate. Is there any evidence they hold back legacy GPAs from the overall average reported in CDS/C12? I really doubt it. The average legacy applicant at super-selective schools like Princeton may well be more qualified than the average applicant, not less.</p>
<p>Anyway, I think this has been a good discussion. Henceforth I’ll probably be a little more skeptical of the CDS and other admission statistics (though I do think the mere fact we have these documents represents significant progress since I was in school.)</p>
<p>On an adcom, I would sure want to know whether an applicant’s A was in AP or regular, so it seems reasonable to add something to the AP A to more accurately rank that portion of the admission criteria. This matches the CDS documentation’s “Weighting gives students additional points for their grades in advanced or honors courses.” This is vague because the CDS schools do not agree on a single formula; too bad, but what can you do?</p>
<p>So if Pomona does not recalculate, what do they do with the 0-100 scale grades submitted? Leave them out of the CDS calculation?</p>
<p>To me “rigor” does not reflect the difficulty of courses a student took, but rather the ratio of difficult courses taken compared to those offered by the HS.</p>
<p>tk21769, #50:</p>
<p>
</p>
<p>Ascending scores would be dependent on the number of takes, plus the amount of prep added before later takes. (Disregarding those who are naturally high scorers, which the vast majority aren’t.) This would clearly be a function of wealth. Wealthier students can raise their scores more because they have the monetary background to pay for specific tutoring to help them do so, including taking the test more than once.</p>
<p>Let me add this link, a high school in LA county: <a href="http://pvphs.com/pdf/CollegeAcceptance.pdf">PVP High School</a>. This is updated for the 2011 graduating class. It’s a good study because I believe they’re reporting the best component score without regard to sitting, while the 3-component score is per sitting. So maybe sometime, if this thread is still around – I hope not ;) – I’ll try to do some analysis of how much scores ascend by superscoring. This HS wouldn’t be typical because it is a wealthy public school. The first 30 pages are individual students; the next 30+ are university decisions.</p>
<p>Looking around for that Princeton reference of yours…</p>
<p>
</p>
<p>Undoubtedly. But a 3.89 uw gpa is really, really high considering the various admissions within a university’s total undergrad package of admits. I’m just questioning it, that’s all.</p>
<p>
</p>
<p>Good, I’m glad you’re coming around. I tend to question these for-profit publications: how serious are they in trying to report things under uniform standards? As a consequence, how serious would they be in trying to assure that the things they report adhere to those standards? USN is pretty limited in this because, as Alexandre said, it has no audit capability.</p>
<p>
</p>
<p>Guess not …</p>
<p>University of Chicago
Washington U</p>
<p>UC sat down with US News officials to see what they could do to improve rankings. They did what Wash U has been doing for years. Attract and reject more applicants. Monitor other areas and adjust.</p>
<p>Also, many game the donations. An example:</p>
<p><a href="http://www.lawschool.com/failstogame.htm">http://www.lawschool.com/failstogame.htm</a></p>
<p>Here are the allegations made by Alexandre (condensed):</p>
<ol>
<li><p>Addition of hundreds of meaningless seminars to increase the percentage of classes with fewer than 20 students.</p></li>
<li><p>Breaking down large lectures into smaller lectures taught by the same professor.</p></li>
<li><p>Begging alums for tiny donations and breaking those donations into several years in order to increase the percentage of alums who donate annually.</p></li>
<li><p>Removing graduate students from faculty to student ratio calculations.</p></li>
</ol>
<p>The article you cite would be an example of #3, for law schools (which was not the topic of the original discussion); except that, as that article points out,</p>
<p>
Just one problem: The U.S. News law school rankings don’t count alumni participation. … U.S. News factors alumni participation when it ranks national universities and liberal arts colleges. But it’s been at least 15 years since the law rankings used that measure, said Robert Morse, who oversees the rankings for U.S. News.
</p>
<p>“UC sat down with US News officials to see what they could do to improve rankings. They did what Wash U has been doing for years. Attract and reject more applicants. Monitor other areas and adjust.”</p>
<p>Thanks for posting this barrons. I could never see a public school like Michigan or Wisconsin stoop to this level just to get their rankings up.</p>