Note that ELC does not grant automatic admission to the applicant’s choice of UC campus. It means that an applicant shut out of all of their selected campuses is offered admission to a campus with space available; in practice, that campus these days is UC Merced.
It is also based on meeting a UC-recalculated benchmark GPA for the high school, not on the current class rank determined by the high school.
So far here, I don’t think anyone has looked at this through the lens of an applicant. When test scores aren’t used, it becomes more difficult to figure out one’s chances of admission. I’m not advocating for using test scores, but the truth is that, when making a list, students have historically looked at SAT or ACT ranges to get an idea of where they might sit in an applicant pool.
Our D applied to all schools test optional last year. It was a nail-biter until the first few acceptances came in. She felt good about her GPA, rigor, ECs, essays, recs, etc., but we still felt it was risky not to send a score, even during the worst of Covid. She definitely applied to more schools than she would have if she had had a score to use and we had known where she fell within each college’s range.
For many CA families last year, admission to the UCs seemed more random. From what I read here on CC, it was very stressful for students. It’s going to be an adjustment.
UCs in recent history had already somewhat de-emphasized SAT/ACT scores relative to HS GPA. Indeed, they have been posting admission rates by HS GPA but not SAT/ACT scores.
These are for the whole campus. Different divisions or majors may have different levels of selectivity (usually, engineering and computer science majors are more selective).
Some of the “UC disappointment” threads in the recent past have been from “SAT/ACT discrepant” applicants who applied to UCs based on the SAT/ACT scores, while their (more heavily weighted by UC) HS GPA was relatively low for their selected campuses (and some had additional reasons for disappointing results due to applying to more selective majors).
For some, it may be the opposite – “SAT/ACT discrepant” applicants will no longer mislead themselves into believing that their high SAT/ACT score will compensate for a low HS GPA when applying to UCs.
Yeah, I have a friend whose son had a high CA GPA and a 1550 that couldn’t be used, and he didn’t get into UCB or UCLA. She thinks that 1550 could have helped. He was applying as an engineering major, so definitely more competitive. He got into UCSB and UCSD.
Honestly, I believe that a high SAT with a mediocre GPA predicts mediocre college achievement. There are always reasons for poor high school performance from bright kids, but those reasons don’t go away once they’re in college. Trouble at home follows kids to college, and so do mental health issues, ADHD, etc. At the same time, a high GPA with a less-than-stellar SAT usually predicts excellent college achievement.

The place where the SAT/ACT can be predictive is when you have a kid with a high GPA from an unknown school, or a school known to have low academic standards, and a terrible SAT score. That SAT score confirms that they are not ready for a highly selective college. And this is the very population that the UC system is trying to increase when they eliminate the SAT/ACT from the application.

That’s the problem. They’re going to wind up admitting students who are unprepared to do the work, who belong in remedial classes, who will wind up flunking out, and who would have been better served by admission to a less competitive school. And they’re going to deny admission to students who were amply prepared to do the work and would have benefited greatly by admission to the top flagship state U’s.
Forgive me, but Bowdoin and Bates are not schools that are on the radar of most inner-city CA applicants, nor are they publicly-funded (other than for the fact that they are tax-exempt). The UC system is probably the largest public higher ed system in the country! There are thousands and thousands of extremely highly qualified first generation Americans who are going to be shut out of the top CA institutions by this policy, which is a deliberate end-run around the voters having prohibited racially preferential admissions.
The below doesn’t represent a debunking, as the article states — it would appear to be premature for that — but it does include a challenging opinion from an academic, for those interested:
I understand this is an opinion piece, but I’m not aware of any data that support much of what you are saying…do you have any data that support your statements?
There are many qualified first-gen Americans, but I don’t understand why a test blind policy will shut them out. Limiting the discussion to the UCs: the UC application asks if the applicant is first gen. The factors that the UCs consider when reading apps include first-gen status (and other factors such as low income, disabilities, etc.). Note that UCLA says 30% of their undergrads are first gen, so they are finding them.
Colleges rarely publish a data file with internal ratings of essays and LORs, which makes this difficult to study. What is your basis for claiming that essays are more correlated with SES than standardized tests are? Is this based on a study with specific numbers, or on an article that said that some rich kids paid someone to help with their essays?
If one assumes that LDCs are a proxy for parental SES, then one can make some estimates from the Harvard lawsuit, which suggests that scores are an especially strong point among higher-income applicants, while LORs are a relatively weaker point. The latter may relate to being more likely to attend a highly competitive high school, where it may be difficult to get LORs that describe the applicant as the best in years/career, etc.
What is more readily available is a comparison of income/SES between submitters and non-submitters at test optional colleges. At every test optional college I am aware of that has done this comparison, test submitter kids average higher income than non-submitters. For example, the report at www.nacacnet.org/globalassets/documents/publications/research/defining-access-report-2018.pdf looks at the income distribution of submitter and non-submitter kids at 21 test optional colleges. At all 21 of them, the average EFC for attending students who applied test optional was lower than for test submitters. The kids who are admitted based on GPAs + rigor + essays + LORs + … non-score criteria consistently have lower average income than the kids who are admitted based on all of the above + scores.
Regardless of whether AP scores are in the “formula” or standardized, as stated in the post, the Ithaca study found that the predictive ability for cumulative graduating GPA was not statistically different with and without scores. Several other studies have come to similar conclusions. The information gained by the combination of additional holistic criteria (holistic colleges do not admit by a simple “formula”) largely duplicates the additional information gained via scores.
The UC administration has convinced the social justice crowd that they are allies, when in reality the UC system is building an admissions process that is soaked in privilege. Test blind policies will benefit the wealthy in admissions to an extent that we haven’t seen in decades, and the SJ crew is cheering it on.
Any benefits that the test-blind proponents claim they are fighting for, such as more access for URM and first generation, can be done with standardized tests as part of the process.
I’m curious to see data over time on ELC admits by campus, especially now that UCs are test blind. There’s a notation on the app for ELC status. With so little else to go on (no LORs or SATs/ACTs), I wouldn’t be surprised if ELC status impacted admissions at UCs beyond Merced going forward.
With essays and ECs correlating with SES especially (and you know the UCs are aware of that), it seems like a useful tool to ground the assessments of the students.
We have data on lots of colleges that have gone test optional and admitted a large portion of the class without considering test scores. Many studies have compared performance between kids who submit scores and those who do not. Such studies do not find that everyone is flunking out. Instead, they generally find no significant difference in either GPA or graduation rate between kids who submitted scores and those who did not. An example is the Bates 25-year test optional study at Optional Testing | Admission | Bates College. Some specific numbers are below:
Test Submitters – 3.16 mean GPA, 89% graduation rate
Non-Submitters – 3.13 mean GPA, 88% graduation rate
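To put a one-point graduation-rate gap like the one above in perspective, here is a minimal sketch of a two-proportion z-test in Python. The cohort sizes of 1,000 per group are hypothetical (the figures above do not include sample sizes); with groups of that size, an 89% vs. 88% difference is nowhere near statistically significant.

```python
from math import sqrt
from statistics import NormalDist


def two_proportion_z(p1: float, n1: int, p2: float, n2: int):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value


# Bates-style rates with hypothetical cohorts of 1,000 students each
z, p_value = two_proportion_z(0.89, 1000, 0.88, 1000)
print(f"z = {z:.2f}, p = {p_value:.3f}")  # p well above 0.05
```

At these assumed sample sizes the gap would need to be several points wide before the test flagged it, which is consistent with the studies calling such differences insignificant.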
I mentioned that UCs consider more limited additional criteria than Bates or most private colleges do, so there may be more statistically significant differences than at Bates; but it is quite a leap to assume a large portion of kids will flunk out, particularly given how poor SAT scores are at predicting graduation rate.
@Data10 I freely admit I have no idea where I read about strength of essays being even more strongly correlated with SES, but I’m sure it was on CC, and it was some kind of study, not just moaning about the existence of paid essay help.
I suspect you are referring to the thread at Study Conducted by Stanford Suggests Essay Content Correlates more Strongly with Household Income than SAT - #8 by Data10 . This study did not consider the admission readers’ ratings of applicants or whether the applicant wrote a quality essay. It only looked at whether the essays contained certain key-word groupings that were correlated with SES, such as China, travel, and business economics. For example, kids who write about traveling to China in their essays are more likely than average to be high income, so a computer seeing an essay with key words related to China and travel might be able to successfully guess that the applicant was high income. However, this does not mean that kids who write about traveling to China in their essays are more likely to receive a high essay rating or more likely to be admitted. A quote from the thread is below:
Note that the study did not evaluate the admission readers’ ratings of the essays or which students had the best essays from an admissions standpoint. Instead, they estimated the essay topic by using a computer program that counted the frequency of key words within the essay, and found that choosing certain essay topics (as estimated by the key-word count) was correlated with income and SAT score.
For example, writing essays about “seeking answers” had a high 0.57 correlation with SAT score. The “seeking answers” topic was estimated by a high frequency of key words like “question, book, like, research, read, answer, ask” within the essay. Similarly, writing essays about “tutoring groups” showed a strong negative correlation with SAT score: kids with higher SAT scores were less likely to write essays about “tutoring groups.”
The correlations between certain estimated essay topics and SAT score seem quite substantial, while the correlations between estimated essay topics and income seem much lower. The essay topic that was most correlated with income seems to be writing about China, as estimated by a high frequency of the key words “chines, studi, student, also, time, china, school.” I’m not sure how much some of these key words have to do with China, but I don’t doubt that writing essays about China would be correlated with income – both due to higher income among Chinese international students (UC sample group) and among students who travel to China. However, this does not mean that the kids who wrote essays about China or traveling had better essays from an admissions standpoint, only that they were more likely to have higher income.
You were responding to an explanation of why experts believe test scores were unnecessary:
Maybe I misunderstood your point. How is admitting “students who will succeed” a “mighty low bar”? IMO, it is far from easy to succeed in rigorous programs at UCLA and UC Berkeley. It takes preparation, hard work, commitment, organization, and intelligence. Even students with impeccable preparation and qualifications are challenged. So, IMO, in no sense is admitting students who will succeed a “mighty low bar.”
It seemed to me that you were suggesting the “bar” would be higher if tests were used in admissions decisions, but if not then what was your point?
With respect to studying test optional policies, they may interact recursively with the curricula of the institutions. That is, a college’s curricula may adapt to its students in ways that cannot be controlled for, or even recognized, in an analysis.
SAT/ACT tests are useful for a number of reasons. The math portions of SAT/ACT tests are really basic. If a student makes more than a few “careless” mistakes, s/he probably shouldn’t major in a highly quantitative field, regardless of how s/he appears to have done in her/his math classes. If s/he has a significant gap in math skills and knowledge, colleges aren’t the place to remediate it. Similarly, if s/he scores low in the verbal portions of the tests, both s/he and the colleges should take that into consideration to determine if s/he is likely to succeed. In the absence of test scores, a student could be accepted into an unsuitable program that s/he may be unable to finish (or has to switch to a different program at considerable cost to the student and the college).
There are many possible contributing factors, but the end result is that selective colleges that have gone test optional do not see the claimed high rates of kids failing out. Instead, they generally see similar graduation rates between kids who submit scores and kids who do not. Some additional stats from the previously linked NACAC report are below.