More Colleges Backing off SAT and ACT Admissions Rule

@purpletitan @ucbalumnus

Well, the counter argument @purpletitan would suggest only students with the top course available to them would do well, and that has not been shown to be true, as far as I know.

I think there is also an overweighting of value of HS AP courses, many of which are more about aggressive time management (and “firehosing” information) than a genuine learning experience. (And this is from someone who has watched a number of kids go through various programs.)

I’ve also seen kids who took APs then jumped to the next-level class in college, and I’ve seen some re-do their APs, while I’ve watched others who did not do APs then take the college-level equivalent. I don’t know that there is any pattern as to how well the HS AP course prepared them for various college-level courses. One of my kid’s current HS Civ classes even uses the AP book, but is not AP or Honors, and is one of the most demanding, best college-prep courses in the school, simply due to a crazy good teacher. But this is merely anecdote.

I don’t have any studies, but I bet that for most smart, top of class students, whether they got exposed to an AP and pulled the A or simply took the top track “regular” or “honors” at their school, the college outcome is not that different.

For most students I’ve observed, both teaching college level courses and watching kids go through college, knowing how to learn and have a work ethic tend to be way more important than specific knowledge brought to the course.

@lastone03

Again though, the issue is not the amount of chem a kid is learning, it is how they are being prepped for college and how prepared they are to succeed and thrive.

There are plenty of kids who take APs, get As and never take the AP test. They just want the GPA bump. I think the idea that the top kid who got As from a school that offers few or no APs won’t do as well in college as a kid from a socio-economically equivalent circumstance who was offered more APs and got As in them is probably not borne out in the real world. But I’d love to hear of any studies.

@lastone03 “I find it unfortunate that Amherst, Williams etc won’t allow AP credits, especially given their sticker prices. I don’t get the logic.”

My somewhat cynical answer is that the sticker price is the logic. I think at a lot of top schools, more kids were coming in with high AP scores and taking care of most of the “distribution” requirements (or whatever a particular university calls them) without ever taking a course there. This has been seen as problematic, both for turf reasons - some depts count on those student credit hours to look useful - and for mission reasons - universities really would like students to take courses with their professors and fellow students.
So many top schools have tightened the rules on AP credits.

Was there any pattern associated with the AP score? It is rather likely that someone who earned a 5 on AP calculus and thought it was easy is more ready to start in a more advanced math course in college than someone who earned a 3 and thought it was hard.

There may be something here, since private schools may want to avoid students graduating early. On the other hand, most students at public schools are subsidized, so those schools want them to graduate as quickly as possible, hence the greater tendency of public schools to be generous with credit units for AP scores.

However, subject credit and advanced placement policies may not have as much association with private/public status.

My impression has been the tightening on AP credits has mostly been on the high end private side. And my additional impression is that the pragmatic desire to have students around paying tuition meshed nicely with the educational interest of having them take more of their courses with their fellow students and university professors.

The college board’s papers are largely consistent with other research. It’s more a matter of how you interpret the results. For example, the Geiser UC study at http://senate.universityofcalifornia.edu/_files/reports/CAIR%202007%20-%20Prediction%20of%20UC%20GPA%20from%20New%20SAT%20with%20Tables.pdf found that HS GPA alone explained 19% of variance in first year college GPA among UC students, while SAT alone (including writing) also explained 19%. So looking at this result alone, one might think SAT correlates almost as well as HS GPA, as you stated. However, when they also consider total semesters of honors courses and demographics, then the predictions changed as follows:

HS GPA + Honors/A-G + Demographics – Explains 26% of variance in 1st year college GPA
HS GPA + Honors/A-G + Demographics + SAT I – Explains 29% of variance in 1st year college GPA

Now SAT only explains an additional 3% of the variance in 1st year GPA. When you look at 4th year GPA instead of 1st year GPA, the benefit drops further. When you drop the writing section (the most predictive section of the SAT), the benefit drops further still. And when you also include a good measure of HS course rigor in addition to HS GPA, as was done in the previously linked Ithaca study, the benefit of SAT scores drops even further, in some cases to the point of being negligible.
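To make the incremental-variance point concrete, here is a toy simulation (invented parameters, not the UC data): two correlated predictors of a college-GPA-like outcome. Each predictor explains a decent chunk of variance on its own, but once one is already in the model, the other adds only a small increment to R², because they largely carry the same signal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
ability = rng.normal(size=n)                          # shared latent factor
hs_gpa = 0.8 * ability + rng.normal(scale=0.6, size=n)
sat    = 0.8 * ability + rng.normal(scale=0.6, size=n)
college_gpa = 0.7 * ability + rng.normal(scale=0.7, size=n)

def r_squared(predictors, y):
    """R^2 of an OLS fit of y on the given predictors (intercept included)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r_sat_alone = r_squared([sat], college_gpa)
r_gpa_alone = r_squared([hs_gpa], college_gpa)
r_both      = r_squared([hs_gpa, sat], college_gpa)

print(f"SAT alone:       R^2 = {r_sat_alone:.2f}")
print(f"HS GPA alone:    R^2 = {r_gpa_alone:.2f}")
print(f"both together:   R^2 = {r_both:.2f}")
print(f"SAT's increment over HS GPA: {r_both - r_gpa_alone:.2f}")
```

Alone, each predictor explains roughly the same share of variance, yet the second one added to the model contributes far less than it does by itself, which is the pattern in the numbers above.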

For example, the study at http://www.heri.ucla.edu/DARCU/CompletingCollege2011.pdf looked at the relative contribution of test scores, HS GPA, SES, and many other factors for over 200,000 college students at hundreds of institutions. They found that with a full model that includes academic characteristics, SES/financial characteristics, major/college characteristics, and time spent on various activities during HS, they could explain 26.9% of variance in 6-year grad rate with 71.4% successful predictions. When they excluded test scores from the model, the prediction dropped by only 0.1%, from explaining 26.9% of variance to explaining 26.8%, with both versions successfully predicting graduation for 71.4% of students… essentially no difference.

It’s also important to note that all of the links above are explaining less than 30% of variance in metrics of college success. 30% is not very high. The bulk of the variance in college academic success relates to other things besides these few captured stats.

@CaliDad2020: “Well, the counter argument @purpletitan would suggest only students with the top course available to them would do well, and that has not been shown to be true, as far as I know.”

Straw man argument, since that isn’t what I said. I said that I think it is likely that rigor matters, so those kids who do well in the hardest classes tend to do better at the next level, on average, than those kids who do well in a less-hard class, not that only those who take the hardest classes do well at the next level.

@PurpleTitan

But the issue is schools that don’t offer APs, or as many APs. This seems to be less of an issue these days, as I think most schools now offer a lot of AP courses. But for a long time it simply wasn’t an option (and still isn’t in some schools), so I’m not sure you can conclude that a kid who takes and excels in the top courses at a given HS (even if those include few or no APs) will perform any worse over 4 years of college than a kid who got As in APs in the same subject.

“I’m not sure you can conclude that a kid who takes and excels in the top courses at a given HS (even if those include few or no APs) will perform any worse over 4 years of college than a kid who got As in APs in the same subject.”

Again, you’re making the argument that academic rigor doesn’t matter, only innate talent does. I think we both agree that the American K-12 education system is extremely unfair and downright tragic in how low-SES kids are treated for the most part. That doesn’t mean that differences in K-12 academic rigor don’t matter at all, even though it seems like you really, really want to believe that they don’t.

^ To put it another way, @CaliDad2020: if rigor did not matter, as you state, then there should be no need to equalize K-12 education in this country, since everything would come down to innate ability, and regardless of how challenging (or not) the highest level at a HS may be, all HS students everywhere would have an equal opportunity to succeed in the toughest majors at the toughest colleges regardless of what K-12 education they received.

@PurpleTitan

You are making a leap that no one ever claimed, and you are giving way too much weight to the difference between a regular or honors course and an AP. The leap from “a kid who can excel in a regular course would likely have excelled in an AP if given the chance and is also likely to excel in college” to “everything would come down to innate ability…” is kinda far. Like, off-the-charts far.

One would have to wonder how a guy like my Dad, who was educated in an actual one-room schoolhouse of about 25 kids in all grades, got to Harvard. He had no APs, and took some classes basically by himself, with a book, in the corner, asking for help if he got confused.

Or me, who went to a public HS that offered 0 APs (back before APs were cool) but still ranked in the top 10 in the state, produced an outsized # of NMSs, etc., and produced lots of kids who succeeded at top schools (Princeton, Penn, Duke, MIT…).

According to the AP folks, 20.1 percent of public high school graduates in the class of 2013 earned a 3 or higher on an AP Exam, compared to 12.2 percent of graduates in the class of 2003. So would you argue that an additional 8% of the entire class of 2013 was significantly better prepared for college than the class of 2003? So we should see, in overall college grades, graduation rates, etc., a similar increase in 2013 over 2003?

If you want to go for logical absurdity, your argument should find that the US HS population is achieving at an increasingly higher rate every year as more schools add APs and more students take them. But I don’t believe the stats show that. I think part of the problem is giving APs some mystical power. Really, they just involve more information, not more “ability to learn.” And sometimes not even more information. I’ve seen good “regular” HS courses with good teachers that are far more intellectually challenging and rewarding than APs with poor teachers.

What I think is true is that, increasingly, schools with college-bound populations demand APs, so it has become almost the norm for HSs with large college-bound populations to offer a good number of APs. Taking APs would therefore be a decent indicator of college readiness, but more because it is an indicator of the student’s socio-economic background, not due to the AP itself.

And we can debate this in the abstract all year, but Bates, George Washington, Ithaca, etc. (truly all colleges, not just the test-optional ones) have real skin in the game, and they should know well enough how much better taking an AP, rather than the top “regular” class available, predicts a student’s success in college.

This discussion started with a trend in higher education to back off standardized test scores as a means of evaluating applicants for acceptance. The number of test-optional schools is growing and being tracked by FairTest, “a non-profit that advocates against high-stakes testing in university admissions and public schools because of its potential negative consequences.” I’ll be honest: I read FairTest’s mission statement, and they appear to be against anything that tries to measure what is being taught or anything that measures a teacher’s performance. A lot of it seems to be overreaching, as far as I can tell.

The bottom line is that for kids to succeed academically, they need access to good teachers, good parents/mentors, and a community that is focused on academic growth and achievement. In order to see that all students are given equal opportunities and are progressing at similar levels, there has to be some sort of metric available. For college-bound students, AP exams and SAT/ACT exams provide this. Are they perfect? No. Are they socio-economically biased? Maybe, though I know many kids that bought $40 prep books and did very well on their exams (mine included). How did this happen? They studied, had support around them, and practiced with free resources they found online. And while some find testing overrated, I find sole reliance on GPAs to be overrated. There is nothing uniform about grading scales and policies. Unless a university devotes time to studying each student’s academic trajectory and high school academic policies, it doesn’t really know how a student has progressed over 3-4 years or truly know the student’s mastery of the material.

As for the comment that some kids take APs just to get the GPA bump and never sit for the exams: that’s just a game. It should only reinforce that kids who do sit for the exams and earn 3s, 4s, and 5s should be allowed to use those credits toward electives in college. If a student is in a rigorous STEM major, then let them redirect the credits toward another elective rather than negating the achievement (and the cost).

@Data10 I like seeing the data. Two things always strike me about these presentations. One is that since demographics and SAT are correlated (the very thing people complain about) any stat that uses demographics but not SAT still captures some of the SAT. Thus I think the first year grades going from 26% of the variance to 29% when adding SAT is actually a pretty big jump given demographics were already included.

The second is that fourth-year grades are affected enormously by major. The average grade in upper-level courses varies by major by as much as a full point on a 4-point scale. Since majors are themselves sorted by SAT score to a not-small extent, this can have a large confounding effect. I’ve always wanted a study of grades within classes by SAT score, etc. First-year grades are better than fourth-year grades in this sense, but the correlation of various parameters with grades within a given class would be telling to me.
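A toy illustration of that confounding effect (invented parameters, not real grade data): if higher scorers disproportionately sort into a harshly graded major, the pooled SAT-grade correlation understates the correlation you would see within any one major.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
sat = rng.normal(size=n)                              # standardized score
# Higher scorers are more likely to pick the harshly graded major.
in_harsh = rng.random(n) < 1 / (1 + np.exp(-2 * sat))  # logistic sorting
# True relationship plus a flat grading penalty in the harsh major.
gpa = 0.4 * sat + rng.normal(scale=0.9, size=n) - 0.8 * in_harsh

r_pooled = np.corrcoef(sat, gpa)[0, 1]
r_within = np.mean([np.corrcoef(sat[m], gpa[m])[0, 1]
                    for m in (in_harsh, ~in_harsh)])

print(f"pooled SAT-GPA correlation:           {r_pooled:.2f}")
print(f"avg within-major SAT-GPA correlation: {r_within:.2f}")
```

The pooled correlation comes out well below the within-major correlation, because the grading penalty in the major that attracts high scorers partially cancels the true relationship, which is exactly why grades-within-a-given-class would be the more telling comparison.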

And it is sobering how nothing anyone has been able to come up with - grades, tests, essays, EC, interviews…, nothing accounts for much of the variance. Humans are complicated and life has big random components.

@CaliDad2020

“I think the idea that the top kid who got As from a school that offers few or no APs won’t do as well in college as a kid from a socio-economically equivalent circumstance who was offered more APs and got As in them is probably not borne out in the real world. But I’d love to hear of any studies.”

In all of these academic discussions, there is little to no mention of brain development and the levels of maturity in an 18 year old. There is also very little mention of social issues and their impacts on students. I would assume that is why a study like you mentioned probably doesn’t exist as each student’s circumstances are different. With all of the variables, I would think it would be tough to put them in buckets and say this group will fail and this group will succeed.

With respect to a student not afforded the opportunity to take AP classes in high school, I don’t think that would have any bearing on whether a student could be successful in college. There are plenty of kids who took nothing but college prep classes, and they are thriving.

@CaliDad2020, you’re using anecdotes, I’m using averages.

And sure, it is possible for non-AP classes to be as rigorous as AP classes. I went to a HS that offered no AP classes either but was one of the top ones in the country.

But using your anecdotal datapoints to draw the conclusion that, on average, top kids from schools that don’t offer APs do just as well as top kids in AP classes is a bit of a leap.

Doesn’t this whole discussion about the near-complete inability to predict future performance from GPA/curriculum just prove that SATs matter? I would think a study within 6-8 carefully selected schools, say Stuyvesant, Trinity, and other top publics and privates (preferably without APs), spread out across the country, looking at grades, SATs, ACTs, and performance over time through college, would be a better study. You would need at least a few test-in schools where there are no special preferences as a separate subset.

Studies that show low correlation between SAT scores and college performance often don’t take into account the statistical concept of “restriction of range.” That is, these in-school studies examine the outcomes for students the school has already admitted, who have a curtailed SAT range. Ithaca College, for example, has a mid-50% range of roughly 1200 to 1350. This corresponds to national percentiles of 82% to 94%, meaning that Ithaca draws half of its students from just 12% of test takers. It would not be surprising if minor differences in scores among these students didn’t correlate strongly with success in their classes.

p.s. A classic example of restriction of range comes from the NFL, where the correlation between weight and tackles among linebackers is zero or negative. This is because the weights of linebackers playing in the NFL are already restricted to a narrow range, and the positive overall relationship between weight and tackles is consequently hidden.
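Restriction of range is easy to demonstrate with a quick simulation (synthetic data; the 82nd-94th percentile band mirrors the Ithaca example above): a score-outcome correlation of about 0.5 in the full population nearly vanishes once you keep only a narrow slice of scores.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
score = rng.normal(size=n)                                        # standardized score
outcome = 0.5 * score + rng.normal(scale=np.sqrt(0.75), size=n)   # r ~ 0.5 overall

r_all = np.corrcoef(score, outcome)[0, 1]

# Keep only the slice an Ithaca-like school draws from:
# roughly the 82nd-94th national percentile of scores.
lo, hi = np.quantile(score, [0.82, 0.94])
mask = (score >= lo) & (score <= hi)
r_restricted = np.corrcoef(score[mask], outcome[mask])[0, 1]

print(f"correlation, all test takers:     {r_all:.2f}")
print(f"correlation, admitted-range only: {r_restricted:.2f}")
```

The within-band correlation comes out far weaker than the population correlation, even though the score is genuinely predictive overall, which is the same mechanism as the linebacker example.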

My kids didn’t get a GPA bump (no weighting) but chose to take some AP classes because they were interested in the topic and hoped it would be covered in more depth (it often was, but not always), because they preferred to be in classes with the “smarter” kids at the HS, and also for college-admissions-rigor reasons. Their HS also offered a couple of classes known to be the most challenging that were not AP. A junior-year Physics/Analysis 2 course combo was taken by every serious STEM kid, but was not AP. In fact, the course description explained why it wasn’t: they wanted the freedom to make it “better than AP.”

S did use some AP credits in college, D did not.

Lots of colleges do allow some AP scores to give subject credit and/or advanced placement, which effectively frees schedule space for additional free electives even if no credit units are given.