<p>Not necessarily. If the student is aiming for PhD study in economics, or is interested in mathematical economics, Berkeley’s economics department has courses more aligned to those interests.</p>
<p>I think most universities and colleges make an internal correction when presenting median scores for both tests, counting only the score that helped the student gain admission, i.e., the higher of the two submitted.</p>
<p>Example:</p>
<p>Let’s say, for expediency, that five students can crudely represent the scores of a typical frosh class (those who’ve matriculated) at a college; I’ll go back to presenting the two-part SAT instead of the three-part SAT I:</p>
<p>S1: SAT 1410, ACT N/A, Highest: SAT 1410
S2: SAT 1320, ACT 31, Highest: ACT 31
S3: SAT N/A, ACT 32, Highest: ACT 32
S4: SAT 1430, ACT 30, Highest: SAT 1430
S5: SAT 1370, ACT N/A, Highest: SAT 1370</p>
<p>So for the frosh class, 80% submitted SAT’s and 60% submitted ACT’s. According to what you’re saying, rhg3rd, CMC would report 60% SAT’s and 40% ACT’s in this example.</p>
<p>I. Median scores if every submitted score is counted (no culling of the highest), representing 140% of the class within the medians: SAT 1390, ACT 31.</p>
<p>II. Median scores if only each student’s highest test is counted, representing 100% of the class (one score per student, the one that led to admission): SAT 1410, ACT 32 (31.5 rounded up).</p>
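<p>For anyone who wants to check the arithmetic, here is a minimal sketch of the two presentations in Python. The five students are just the hypothetical example above, and the ACT-to-SAT scaling used to pick each student’s “best” test is a crude assumption for illustration, not any college’s actual concordance.</p>

```python
from statistics import median

# The five hypothetical students from the example above (None = not submitted).
students = [
    {"sat": 1410, "act": None},  # S1
    {"sat": 1320, "act": 31},    # S2
    {"sat": None, "act": 32},    # S3
    {"sat": 1430, "act": 30},    # S4
    {"sat": 1370, "act": None},  # S5
]

# Presentation I: medians over every submitted score (SAT% + ACT% = 140%).
all_sat = [s["sat"] for s in students if s["sat"] is not None]
all_act = [s["act"] for s in students if s["act"] is not None]
print("I.  SAT median:", median(all_sat), "ACT median:", median(all_act))

def act_to_sat(act):
    # Crude hypothetical scaling, used only to decide which test "won".
    return act * 45

# Presentation II: keep only each student's best test (SAT% + ACT% = 100%).
best_sat, best_act = [], []
for s in students:
    sat, act = s["sat"], s["act"]
    if act is None or (sat is not None and sat >= act_to_sat(act)):
        best_sat.append(sat)
    else:
        best_act.append(act)
print("II. SAT median:", median(best_sat), "ACT median:", median(best_act))
```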
<p>This is, I’m guessing, one of the tricks to presenting higher median scores, and I think colleges (perhaps just about all) make this adjustment. The difference between presentations I and II would be much larger if the matriculated student body had bigger gaps between each student’s SAT and ACT scores.</p>
<p>Cal reports both tests at a combined percentage significantly higher than 100%, but I cannot tell whether its median presentation makes this correction to each student’s highest score.</p>
<p>Similarly, UCLA presents both, a mix somewhere in the range of, say, 120-140%, but it apparently does not make the correction that most colleges would, of presenting the median (range) of each student’s best score. </p>
<p>UCLA, in fact, tends to under-report scores, which encourages those with lower scores to apply to the U. UCLA obviously has an anti-self-promotion thing going: it doesn’t put its best foot forward to attract a higher-stat applicant pool (which would dissuade those with lower scores from applying, show how selective its admissions is, and thereby raise its profile among colleges and the public). I did a study of superscoring and other things showing that UCLA under-reports scores in at least three ways. Cal does maybe one of the three things I listed, but I cannot tell whether it does more; the information Cal provides doesn’t allow for such analysis, whereas UCLA is perhaps the most transparent university in the country wrt admissions. </p>
<p>Btw, I think a lot of you are posting as though the OP (or the OP’s son) is still trying to decide. The decision was probably made over a month ago.</p>
<p>And in my example, the scores are a little too uniform. Presenting median scores allows for significantly less uniformity without an appreciable drop in a college’s “standards,” and even lets colleges hide significantly lower-scoring students who don’t fit the profile of their “typical” freshman. For UCLA this would probably be athletes; for less athletically oriented schools, legacies.</p>
<p>Better yet, attend Cal and do both (EECS). :D</p>
<p>^ Even better still, attend Cal and do both EECS and Haas.<br>
It’s been done before…with 16 A+'s:<br>
<a href=“http://www.berkeley.edu/news/media/releases/2003/05/05_ankar.shtml”>05.05.2003 - Innovative engineering and business graduate Ankur Luthra named University Medalist</a></p>
<p>Meh, I heard of a UCLA grad who studied anthro, and is a top-flight programmer. He apparently did programming before he came to UCLA, so the anthro major was just due to interest. </p>
<p>(Btw, UCLA also has a dual major in CS and engineering, a computer engineering pathway within EE, and a pure CS major, all of which are Capstone majors; the capstone project is research-oriented and can go in a portfolio for potential employers, which makes the student more job-marketable.)</p>
<p>Unless they’re revising what Zuckerberg majored in, I thought I read that he was a sociology major. Some of the things I’ve read now seem to imply that he didn’t major in it. </p>
<p>Well, since Zuckerberg quit after his sophomore year, I am not sure you can say he majored in anything. He might have declared a sociology major, I have no idea.</p>
<p>^ Thanks, fallenchemist, this makes sense.</p>
<p>Just a few points of clarification wrt my post about seven posts back in my example of how colleges tweak the scores presentation.</p>
<ol>
<li>Someone asked why I didn’t present the five students as the medians of quintiles, representative of a college’s scores from highest to lowest.<br></li>
</ol>
<p>This wasn’t my intention, to show tiers of scores within a college’s incoming class. It was just to show how a college could manipulate the presentation of its scores. The combined percentage of those who submit SAT’s and ACT’s should always be > 100%. Perhaps for a college like Harvard this total might be lower, closer to 100%, because its students are probably the most naturally high-scoring in the country, so they would probably need to take either the SAT or the ACT only once (and probably with less prep because of their more natural intelligence). If they do take both, it might be more to challenge themselves.</p>
<p>The further problem with proposing that the five students represent the various quintiles is that I would still have needed the number of students in each tier who took both tests, and which ones took only the SAT or only the ACT. </p>
<ol>
<li>Someone I know asked why I didn’t present 75th- and 25th-percentile scores to be more authentic.</li>
</ol>
<p>This also wasn’t my intention, and it would have taken more thought to compile a list of students and their scores; a good number would probably have been ~20 students or so, because one needs about that many to say that a particular student represents the 75th or 25th percentile without having to average between the two students where those cutoffs would fall. </p>
<p>Note: The CDS presentation starts from the number of students who submitted each test, and again, this total should be > 100%. What follows is the score presentation at the 75th and 25th percentiles for both SAT and ACT. I’m thinking these percentile figures were intended to cover everyone who submitted each test. However, I believe most colleges probably restrict them to the score that helped the student gain admission, i.e., whichever of the two tests placed the student at the higher percentile, so the 75th and 25th for the two tests together amount to one test per matriculated student. </p>
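<p>A minimal sketch of what that restriction would look like in practice, again in Python. Both the student data and the crude ACT-to-SAT scaling used to pick a “best” test per student are hypothetical assumptions for illustration; this is just my reading of how the C9 figures could be compiled, not any college’s documented method.</p>

```python
import numpy as np

def cds_c9(students, best_only=False):
    """Hypothetical CDS C9-style summary: the percent counted for each test and
    the 25th/75th percentile scores, computed either over all submitted scores
    or restricted to each student's best test (crude ACT->SAT scaling assumed)."""
    sat_pool, act_pool = [], []
    for sat, act in students:
        # Under best_only, keep the SAT only if it "beats" the scaled ACT.
        use_sat = sat is not None and (not best_only or act is None or sat >= act * 45)
        use_act = act is not None and (not best_only or not use_sat)
        if use_sat:
            sat_pool.append(sat)
        if use_act:
            act_pool.append(act)
    n = len(students)
    return {
        "pct_sat": 100 * len(sat_pool) / n,
        "pct_act": 100 * len(act_pool) / n,
        "sat_25_75": tuple(np.percentile(sat_pool, [25, 75])) if sat_pool else None,
        "act_25_75": tuple(np.percentile(act_pool, [25, 75])) if act_pool else None,
    }

# Reusing the five hypothetical students from the earlier example:
students = [(1410, None), (1320, 31), (None, 32), (1430, 30), (1370, None)]
print(cds_c9(students))                  # submission percentages sum to 140%
print(cds_c9(students, best_only=True))  # percentages sum to exactly 100%
```

<p>Under the “best only” policy the SAT and ACT percentages sum to exactly 100%, which is the pattern discussed for CMC later in this thread.</p>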
<p>@drax12 - That’s a good question, whether schools select only one of the two scores when reporting their numbers if a student took and turned in both. I suspect you are right, that the college selects the one that makes it look better, which is also the one it used to make the admission decision. So I suppose that is fair.</p>
<p>^ This is still lying. There is no selecting which of SAT or ACT to report.</p>
<p>CDS §C9 asks:</p>
<p>“Percent and number of first-time, first-year (freshman) students enrolled in Fall 2013 who submitted national standardized (SAT/ACT) test scores. Include information for ALL enrolled, degree-seeking, first-time, first-year (freshman) students who submitted test scores.”</p>
<p><a href=“http://www.cmc.edu/ir/CDS_2013-14.pdf”>http://www.cmc.edu/ir/CDS_2013-14.pdf</a></p>
<p>ALL means ALL. What CMC is falsely representing is that no one submitted both. This is an obvious lie. There are only a few LAC’s that pull this stunt: CMC, Grinnell and Washington & Lee. No university tries it. Plainly it is not a good idea after the earlier publicized misrepresentations by CMC.</p>
<p>Of course, USNWR could just drop colleges that want to cheat.</p>
<p>Plus, Scripps, Pomona, HMC don’t fillet scores. </p>
<p>@rhg3rd - Actually, if you read carefully, it says include information for all freshmen, not all information for all freshmen. I don’t know how you can categorically state that there is no selecting by other schools when they report their 25-75% ranges. Just because they report how many submitted both doesn’t mean they used both in the other calculations. You really don’t know. And since the instructions are not 100% clear, I think CMC is well within their rights to select the more favorable one if that is all they considered for admission. In fact, most schools throw out the less favorable standardized test when making the final admission decision if a student submitted both, and for that matter submitted multiple results from the same test. Do you really think they use all the times a student took the SAT when making their calculations? Of course not, but if they don’t then they aren’t using all the information, are they?</p>
<p>Want an example of how schools look at the instructions? Here is the University of Virginia CDS. <a href=“http://avillage.web.virginia.edu/iaas/cds/cds1314all.shtm”>http://avillage.web.virginia.edu/iaas/cds/cds1314all.shtm</a> They report the average GPA in C11 as 4.22 for the latest class. Hmmm, don’t the instructions say to report this on a 4.0 scale? They are definitely not the only ones. UC Berkeley used to report 4.1+ average GPA until they realized their mistake.</p>
<p>No. CMC is also lying on the USNWR survey by giving the same answers. Here’s what the survey looks like at questions #158-161.</p>
<p>Test Score Submission (CDS C9) : In the following questions, please provide the percent and number of first-time, first-year students enrolled in fall 2012 who submitted national standardized (SAT/ACT) test scores. Include information for ALL enrolled, first-time, first-year (freshman) degree-seeking students – full, or part-time-- who submitted test scores, including students who began studies during summer, international students / nonresident aliens, and students admitted under special arrangements. Do not include partial test scores (e.g., mathematics scores but not critical reading for a category of students) or combine other standardized test results (such as TOEFL) in these items. Do not convert SAT scores to ACT scores and vice versa.
158 . How many first-time, first-year (freshman) degree-seeking students who enrolled in fall 2012 submitted SAT scores?
159 . What percent of first-time, first-year (freshman) degree-seeking students who enrolled in fall 2012 submitted SAT scores?
160 . How many first-time, first-year (freshman) degree-seeking students who enrolled in fall 2012 submitted ACT scores?
161 . What percent of first-time, first-year (freshman) degree-seeking students who enrolled in fall 2012 submitted ACT scores?</p>
<p><a href=“http://www.wesleyan.edu/ir/images/USNews1213.pdf”>http://www.wesleyan.edu/ir/images/USNews1213.pdf</a></p>
<p>If that is all it is, you are getting bent out of shape over nothing. So what if their system throws out the score they don’t use and so it doesn’t get reported exactly correctly for that question? There is zero impact if they say 70% took the SAT and 30% took the ACT, or if they report that 75% took the SAT and 35% took the ACT. Who cares? I doubt they are lying so much as they probably have their data retrieval set up a certain way. It isn’t something anyone measures a school by.</p>
<p>Now if they were actually manipulating their average test scores (or the 25-75 report) that would be different. But this is absolutely nothing. I could easily see their internal form having, for each student, a checkbox within the list of test scores (assuming the student submits more than one) where the admin checks off the test to be used for evaluation. Then when they go to do all their calculations, that is the only one the computer uses. A little sloppy for the purposes of answering this question? Sure, maybe. The end of the world as we know it? Hardly.</p>
<p>In fact CMC could possibly show even higher overall SAT/ACT numbers by reporting something higher than 70% on the SAT and 30% on the ACT. They could throw out a 2100 SAT that comes with a 35 ACT, but keep both scores when a 2300 SAT comes with a 35 ACT. I agree with fallenchemist that it is probably the way they do their data retrieval that keeps the SAT plus ACT percentage at exactly 100%.</p>
<p>@rhg3rd That’s really not a huge point. I’m pretty sure we all know CMC is a great school. The fact that one man lied doesn’t change the overall quality of the school. If you’re basing overall quality off of USNWR, then ignore CMC anyway and keep trying to get into Harvard or Princeton, unless you think they lied to get to the top, too. </p>
<p>CMC did admit to its cheating. Great. That’s all done and over with. Don’t use it to degrade any other aspect of CMC, because you really can’t. CMC doesn’t have a culture of cheating; one man made a mistake. That’s it.</p>
<p>EDIT: just noticed your previous response to me. (Please link me next time.) One man lying is not an old habit, if he gets convicted. He is not about to continue lying, because there is obviously more scrutiny on CMC after the incident. And once again, his lying, CMC’s rank at USN, and the average entrance SAT/ACT scores DO NOT AFFECT THE SCHOOL’S QUALITY IN ANY WAY. None of the teachers have left because of one pressured man. None of the buildings have disappeared overnight. None of the students have really deteriorated. CMC is still CMC, and if you do your research past USNWR, CMC is still a great school, despite the incident.</p>
<p>Absolutely. Other schools have had this problem. Does anyone think Emory is not a great school even though they misreported numerous data points for a decade or more? <a href=“Emory misreported admissions data for more than a decade”>Emory misreported admissions data for more than a decade</a>. Of course not. USNWR is a farce anyway. Everyone should know that.</p>
<p>
In fact CMC could possibly show even higher overall SAT/ACT numbers by reporting something higher than 70% on the SAT and 30% on the ACT. They could throw out a 2100 SAT that comes with a 35 ACT, but keep both scores when a 2300 SAT comes with a 35 ACT. I agree with fallenchemist that it is probably the way they do their data retrieval that keeps the SAT plus ACT percentage at exactly 100%.
</p>
<p>I disagree, GuideMonk…</p>
<p>The main reason students switch from the SAT to the ACT, or vice versa, is that they feel they didn’t attain the score they should/could have, even after, say, retaking or triple-taking the SAT. So they might switch to the ACT to try to attain a score they think is more representative of students admissible to their target colleges, judging by those colleges’ 75th and 25th percentile scores, etc.</p>
<p>Sure, for a prospective Harvard student, the 2300 might not be high enough (in addition to the other things the student offers in his/her whole package). But I think there are certain colleges that don’t like a lot of retakes, H probably being one. If someone scored a 1900 SAT and later a 36 ACT (extreme example), that student more than likely isn’t getting into H. If someone scores an 1800 SAT and a 2350 the next time, same situation, not getting in, unless he/she blocks that first score. There’s probably no real advantage in taking the SAT and scoring a 2300, and later taking the ACT and scoring a 35. Again, a typical H student might do this as a challenge. But H is also on a different plane than the rest of us. </p>
<p>Buying scores undoubtedly occurs: paying for high-priced tutors who work with sons and daughters for months of intensive teaching; kids taking top-tier prep courses, etc.; some, I’m sure, in combination. So I imagine it’s even harder to spot the naturally high scorers, because the best plan is for the student to prep well before he/she takes his/her first test. I think all this means the SAT and ACT aren’t really standardized anymore, and poorer kids are left behind. But if one of these kids does do well, say a kid from Compton scoring 2200-2300 with essentially no prep, along with all A’s, Harvard will be all over him/her. Again, H draws the naturally high scorers. </p>
<p>And I agree with fallenchemist: the 75th and 25th percentile presentation comes with percentages only. There are no hard numbers to cross-check against total matriculants to see whether a college ‘fillets’ the numbers for the best presentation.</p>
<p>Btw, UCLA is the offending party in presenting GPA in weighted rather than unweighted form. But I don’t think it’s sleight of hand, because the average unweighted GPA for 10th-11th grades, in HS courses deemed a-g, is ~3.84. Also, the ~4.25 it presents is more than likely capped to 8 honors courses. The U doesn’t recalculate grades through 12th grade but keeps them in soph-junior form, and keeps tabs on students’ senior years to make sure these admitted students don’t contract a moderate-to-horrible case of senioritis. </p>
<p>Note: Unweighted grades might drop a bit if students’ senior years were included, but these same students’ weighted grades would more than likely go up; UCLA reports ~20-25% with 4.0 unweighted GPAs (again, 10th-11th grades, a-g courses), partly because of the tighter restrictions on what’s included. UCLA also doesn’t update its CDS to include senior grades; I don’t know how many colleges would, based on when these forms are released; probably not many do. </p>
<p>@drax12 - I know UCLA does it too, I just didn’t want to list a bunch of schools that don’t pay attention to the 4.0 instruction. I didn’t say it was sleight of hand; I was saying they are simply ignoring the clear instruction that they are supposed to report UW GPA with a ceiling of 4.0. There are lots of excuses for why a school doesn’t recalculate the GPA of incoming students to a 4.0 scale. But it still remains that they don’t. Somehow Berkeley managed to; I don’t know why UCLA (and UVa and some others) can’t.</p>
<p>Why are we even using how an institution reports test scores (just ONE of the deciding factors of rankings) to paint the quality of an institution? Using rankings alone is already a problem.</p>