So what is the point? Why does anyone care about trends in college admissions if they can’t use that information for anything useful? It’s like asking what the weather trend is for fall: yep, it’s getting colder, but does that help me decide when to go on a hike? Nope. This is the kind of useless information that proliferates in the information age.
“The NACAC State of College Admission report is not intended to tell an individual student their chances at random college X or what admissions factors random college X uses.”
Sure, but kids aiming for colleges (or their parents) will cling to anything they think is a clue. The problem is when they stop there. We’ve got to know “grades and strength of curriculum” are important, but not what makes an admit (except at a rack-and-stack). Unfortunately, not enough take the further steps.
And this thread is introduced as, “where admission officers place the most weight when reviewing applications.” Not as some generalized sampling. The link is titled, “Admission Officers Name the Most Important Elements in College Applications.”
Like @CU123, I don’t care about “trends.” Applying and getting an admit, imo, are not just intellectual/curiosity look-sees. What matters today is what applies today. And, applies to your specific goals.
And, not the CDS.
The NACAC has a different mission and group of target readers than this website. It focuses more on college admissions professionals, including HS GCs who serve students considering a wide variety of different types of colleges. More than 10,000 professionals are members of NACAC. However, there are many other groups that are interested in an overview of college admissions in the United States and how US college admissions change over time. This can include students, parents, educators, and government agencies. Such reports can also influence funding and government policies at a variety of levels. There are also many things in the report besides that one table that some people find useful, myself included. However, if you are not interested in college admission trends in the United States and instead only care about your personal chances at random college X, then don’t read it.
“different mission and group of target readers than this website.” Sorry to be blunt, but then why is this presented on this website? And with no editorial comment?
No answer needed. But to me, it’s “distracting,” at best.
@Mwfan1921 Sorry, I maybe framed that badly. Two separate questions.
I was thinking of schools like Bates and Skidmore, among others, with the second part of that post, rather than Bowdoin. I think it comes down to the offer and how you heard about Bowdoin this year.
I prefer to read the actual report rather than layperson articles summarizing the report, so I didn’t initially read the article summary. Looking at the article on this website, I see that they do quote the report with statements like “73 percent rated grades in college prep courses as considerably important”, implying that it is a survey. However, I agree that the article has some misleading statements, such as the ranked list of top admission factors, with insufficient detail about how that ranking was obtained.
A report on trends in college admission is not unrelated to this College Admissions forum. It’s not a 100% match in mission and target readers, but there is enough overlap that I’d expect a lot of readers on this College Admissions sub-forum to be interested in the report, even if you aren’t.
Regarding “demonstrated interest”: of the colleges in the analysis, 19% accepted < 50% of applicants. Since the colleges which consider interest are generally colleges with acceptance rates below 30%, the 16% that think demonstrated interest is really important is actually pretty high.
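For anyone who wants to see the arithmetic behind that “pretty high,” here is a minimal back-of-envelope sketch. It assumes only the figures quoted above (19% of surveyed colleges admit fewer than half of applicants, 16% rate demonstrated interest as considerably important) plus the rough guess that interest-tracking colleges sit almost entirely inside that selective slice; none of this is presented by NACAC in this form.

```python
# Back-of-envelope version of the claim above. The inputs are the figures
# quoted in this thread, not anything NACAC publishes as a ratio.

share_selective = 0.19            # colleges admitting < 50% of applicants
share_interest_important = 0.16   # colleges rating interest as considerably important

# If interest-tracking colleges fall almost entirely inside the selective slice,
# the share of that slice weighing interest heavily is roughly:
upper_bound = share_interest_important / share_selective
print(f"~{upper_bound:.0%} of the selective slice")  # prints "~84% of the selective slice"
```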
However, it seems to me that the colleges’ AOs were not being all that honest, especially for the data presented in Table 9 in the article…
@CU123 Oh, those who do not understand the joys of playing with data…
It’s not useful at all for parents and students looking to apply in the next year or so. It is interesting and useful for those who are looking at the different trends and processes in higher education, like parents who are engaged in advocacy groups.
What students before 12th grade and their parents and counselors perceive (correctly or incorrectly) as being important for college admission can affect their decisions regarding activities that may be considered in college admission.
For example, students who believe that class rank is very important may be more likely to engage in rank grubbing, and students who believe that test scores are very important may spend more time on test prep. An extreme case found on these forums is parents concerned that not getting into the top math track in 6th grade will adversely affect the kid’s college admission chances.
Since most students before 12th grade do not have lists of specific colleges, they may choose actions based on general perceptions, rather than specific colleges. Such actions may or may not turn out to be optimal for the set of colleges chosen to apply to in 12th grade.
Sorry, these types of surveys clearly lead to false conclusions about college admissions. Period.
Only by people who do not understand elementary statistics. Without this data, they would instead be coming to false conclusions based on a couple of anecdotes (“my mother’s neighbor’s kid washed cars in the afternoons and now he’s been accepted to Yale, so washing cars is a great way to be accepted to Yale”). So I don’t think that their inability to understand what the data actually means is a good reason not to publish it.
Another thread that’ll turn into debate?
I’m ok with the CC interest in data, as long as it doesn’t lead to more false conclusions. As it is, kids focus so much on stats and titles. That isn’t corrected for them by pointing to a study that purports to tell them what’s most important.
All the colleges using the Common App agree to be holistic. While it’s absolutely true that many are less selective and want live bodies more than some “whole,” false assurances are trouble.
This anecdote involved several young women, two years ago, during the RD round. 2 were accepted, 1 waitlisted, 1 denied. One enrolled.
All visited.
The two who were accepted were, perhaps, the most fervent in the school.
The others certainly were eager, as anyone would be with any great school. They weren’t all classmates.
Thank you.
Our advice to our kids about their applications is that after they had established a strong case for admission based on numbers – test scores and grades – at the more selective colleges, including specialized colleges such as music or art schools and programs, they had to establish “points of distinction” based both on organized EC’s at school (e.g., athletics) and on NON-school activities. The latter were EC’s that might be truly beyond the curricula. They might even be solo activities, i.e., not organized.
For our older one, who had very good grades and outstanding test scores, I think it was the points of distinction that set him apart. STATE-level awards in journalism, debate, and math competitions. For our younger one, it was awards for art and design. This was especially relevant because of her desire to attend art school. It wouldn’t much matter if she was in the top percentiles in tests and grades, or a leader within the school. If she didn’t have a strong portfolio she’d have had a tough route to admission. To achieve this, she attended special programs outside the school (outside the state, in her case).
In short, in highly competitive admissions, those who have extra points of distinction and recognition beyond the core curriculum are going to be the more successful.
The overall trend data is indeed relevant and important to parents and students seeking to maximize their admissions chances at selective colleges.
Not only because trends are harbingers of future decision rules but also because of second-order effects. For example, if GPA counts for more than class rank, then college prep high schools will face severe pressure to inflate grades, which only raises the GPA bar still higher, which leads to more grade inflation.
Which more or less explains what has happened over the last two decades. The average unweighted high school senior GPA nationwide is now 3.69, which is a joke. How can any college have a reasonable decision rule when every other kid in the nation has straight A’s?
Inevitably, the colleges will engage in pooling the applicants. As the Harvard trial shows, and multiple academic studies prior to the trial suggested, applicants are compared within their respective “pools,” or cohorts, not across them.
If only the adcoms would tell us how they pool and what the standard is for each of those applicant pools – GPA, test scores, “personal” ranking a la Harvard’s 1-6 scale, wealth threshold/EFC – then we could plan and prioritize appropriately.
Some are making small steps in this direction. UIUC for instance allows the prospective applicant to see 25-75th percentile SAT and ACT bands for admitted applicants by intended major, for all Arts & Sciences majors.
Let’s hope more transparency follows.
“I wonder if demonstrated interest can also include essays”
It can, but essays are also a separate category, and in fact the highest-ranking non-academic category, so I wonder if “interest” here means the traditional ways you show it.
“The earlier survey shows a strong correlation between demonstrated interest and yield. 64% of colleges with a yield of >60% said that demonstrated interest was “considerable importance”, the clear majority.”
And the best way to show interest is applying early, especially ED, and as the Harvard data shows, SCEA as well.
“We’ve got to know “grades and strength of curriculum” are important”
It’s the most important; there’s probably not even a close second. Even tippy tops say that. Here’s Yale’s first statement about admissions:
“Yale is above all an academic institution. This means academic strength is our first consideration in evaluating any candidate. The single most important document in your application is your high school transcript, which tells us a great deal about your academic drive and performance over time. We look for students who have consistently taken a broad range of challenging courses in high school and have done well.”
Now if Yale or a similar type of college is filling out the survey, what do you think they’re going to rank 1-4? All the academic factors. It starts with the transcript, and for many applicants, it will end there.
I think it would help greatly if there could be more visibility into the stages of the admission decision process.
In other words, it’s probably the case that in the first screen, academic factors dominate, whereas in the second and third/final screens, other factors rise in prominence.
Also, there’s almost certainly pooling going on, i.e., dividing the applications into cohorts that conform to the major categories by which the university leadership evaluates the success of its admissions process. Obviously these cohorts or pools correspond to the major USNWR metrics-- SAT, grade point, rank, etc.-- but equally obviously, the leadership wants to see certain ethnicity/URM thresholds maintained.
Less obvious is the elephant in the room: the admissions process has to optimize for REVENUE.
Without question the adcom’s leadership is aware of, and working toward, some threshold number for tuition revenue that the process, in the end, must attain. This skews the process in favor of those applicants who, the adcoms and their consultants can predict with high confidence, will accept an offer of admission that carries no aid at all. So another pool certainly has to be those minimally-qualified, academically-suitable applicants whose parents’ occupations, ZIP codes, and other non-FAFSA indicators suggest that they will easily pay full tuition-- and not receive aid anywhere.
In short, wealth is another hook.
The only question is at which stage in the admissions review process-- first screen, second or final-- the pooling takes place.
If the universities would simply set some minimum thresholds for each pool and commit themselves to TRANSPARENCY around those thresholds, the process would become fairer, more orderly, and vastly more efficient for applicants, adcoms, and high schools alike.
This would not require doing away with “holistic” admissions-- that would still describe the second stage of the decision process-- but it would require transparency around a) definitions of the pools and b) threshold stats-- SAT/ACT and GPA especially-- required of each pool.
For example, Elite College could say, “We require a minimum score of 700 on the SAT Math section of all applicants who are not recruited athletes or members of the following groups…”
For the athletes and the named groups who are exempt from the 700 Math threshold, Elite College would then publish the formula for their Academic Index that is used to evaluate the non-academic pools, and state clearly that applicants from those non-academic pools who fail to meet the AI threshold will not be considered for admission.
Elite College could go on to reveal the % of applicants who will be required to pay full freight, and state the historical 25-75th percentile scores of that cohort as well as of the cohort that receives aid.
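To make the proposal concrete, here is a minimal sketch of what a published-threshold screen like that could look like. Every pool name and number in it is hypothetical (no college publishes these figures today, which is exactly the complaint); it only illustrates the mechanics of a per-pool floor.

```python
# Hypothetical illustration of the proposed per-pool published thresholds.
# None of these pools, SAT floors, or Academic Index floors come from any
# real college -- they are made-up placeholders.

POOL_THRESHOLDS = {
    # pool name: (minimum SAT Math, minimum Academic Index)
    "unhooked": (700, None),            # hard SAT Math floor, as in the example above
    "recruited_athlete": (None, 210),   # exempt from the SAT floor, AI floor instead
    "named_group": (None, 200),
}

def passes_screen(pool: str, sat_math: int, academic_index: float) -> bool:
    """Return True if an applicant clears the published floor for their pool."""
    sat_floor, ai_floor = POOL_THRESHOLDS[pool]
    if sat_floor is not None and sat_math < sat_floor:
        return False
    if ai_floor is not None and academic_index < ai_floor:
        return False
    return True

print(passes_screen("unhooked", sat_math=730, academic_index=225))          # True
print(passes_screen("recruited_athlete", sat_math=650, academic_index=205)) # False
```

The point of publishing something even this simple is that an applicant could run the check on themselves before spending an application fee.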
Doing the above would have multiple benefits. Transparency would help everyone to understand and accept the logic of the admissions process. It would also make the process much more efficient: actual thresholds would deter tens of thousands of unhooked, non-wealthy middle- and upper-middle-class kids, whose actual chances are zero or next to zero, from applying. Those kids could focus their energies on other schools where they meet or exceed the published thresholds for the unhooked and non-wealthy.
Finally, it would greatly help recruitment of high-achieving under-represented groups. No marketing expenditure, no costly and inefficient “outreach” in the form of adcom reps jumping on planes or driving to schools would be required. Simply publicize the threshold score and then identify SAT and ACT takers who actually meet the thresholds and encourage them to apply.
The result of the above would be to cut the # of applications by more than half, increase the admission rate, and spare everyone enormous amounts of time, money, sweat and tears.
There’s a target stats bar, and then the reality that some kids won’t hit it but still are totally compelling candidates. An MIT rep, e.g., said (before the New SAT, which raises expectations) that anything with a 7 in front and they know you can do the work at MIT. But people still think they need 800s. Then they miss the rest that’s vitally important. And that rest is what makes your chances decent or “next to zero.”
First pass in an admission read IS what cuts the number of apps roughly in half.
I get that it’s easier to understand if they would only lay it all out for you. But that’s not the sort of kids they look for.
The “Academic Index” IS a fixed formula, among all Ivies.
Yale says, “…academic strength is our first consideration in evaluating any candidate.” It doesn’t say it’s the be-all and end-all. But people interpret it as such- as a hierarchical measure that can push you forward, despite what’s lacking. It does not mean your stats and rigor top all other considerations and that, with them, you should be a lock.
How hard is it to match yourself to a top college on more than stats? Not that hard, if you’ve done your due diligence on the college. Not a gimme, but they’re looking for kids who can find the pieces and put them together, show them in their record and the app/supp. And kids who fully understand that a single-digit admit rate means, no matter what, your chances are slim.
But the bar is different for different cohorts. It’s significantly lower for hooked candidates. For unhooked non-wealthy kids applying to the Ivies, the SAT bar is very likely set far north of 1500.
There’s no transparency on the pools and their different thresholds. That’s the problem.