Going to visit the University of La Verne today!
They have some interesting majors (legal studies/paralegal, Forensics minor, Criminology).
I do not understand how it works in your English department because I was a STEM student.
The way things worked for me in college and grad school was that I had to take classes, understand the concepts, and solve many problems to prepare for tests, then perform adequately on a test that required solving problems in 60 or 90 minutes. In a way, one always has to prepare for a test.
For my highest level English class in college, I had to study and understand why G.M. Hopkins (and others) wrote what he did, and answer specific questions on the test related to his poems: their structure, meaning, etc.
And for what it is worth, while HS GPA is a better predictor of freshman GPA, there are many unanswered questions related to that research, and even with those flaws there is not much of a difference between the predictive ability of the SAT and GPA… IIRC it is something like 82% versus 84%.
The U Chicago study found a much bigger difference between the predictive value of HS GPA and the ACT (January 2020, so this was pre-COVID): "Test scores don't stack up to GPAs in predicting college success."
Link to the actual study: https://journals.sagepub.com/doi/10.3102/0013189X20902110
That is, however, not true. See, for example: Allensworth, Elaine M. & Kallie Clark. 2020. High school GPAs and ACT scores as predictors of college completion: Examining assumptions about consistency across high schools. Educational Researcher 49. 198–211. https://doi.org/10.3102/0013189X20902110.
That study (which, I will note as a sidebar, also dealt head-on with the claim that GPAs are too variable across high schools to be reliable, and found that conclusion unsupported) found that the predictive power of HS GPAs is markedly higher than that of standardized test scores: the coefficient of HS GPA, after normalizing for school characteristics, was six times stronger than that of standardized test scores, so no, it isn't a .84 vs. .82 sort of thing. The money lines from the discussion:
It is commonly believed that HSGPAs indicate different levels of readiness for college, based on the high school a student attends, and that ACT scores are consistent indicators. However, HSGPAs perform in a strong and consistent way across high schools as measures of college readiness, whereas ACT scores do not… As measures of individual students' academic readiness, ACT scores show weak relationships and even negative relationships at the higher achievement levels… The fact that HSGPAs are based on so many different criteria (including effort over an entire semester in many different types of classes, demonstration of skills through multiple formats, and different teacher expectations) does not seem to be a weakness. Instead, it might help to make HSGPAs strong indicators of readiness because they measure a very wide variety of the skills and behaviors that are needed for success in college, where students will also encounter widely varying content and expectations.
Jinx!
There are many studies out there. U of CA did one. Here is a fairly large one done by the College Board. I was wrong about the 0.82 vs. 0.84, but it is pretty close and my point stands. Over 250,000 students, with data reported by colleges of all sizes and reputations:
College Board would not be the first place I would look for an objective study on the predictive value of standardized tests.
The largest problem that I have with using GPA as a predictor of college success is that it doesn't take into account brain growth or emotional development. I can say with a great deal of certainty that my S23 is far more prepared to focus, complete assignments, and study than he was as a freshman at 14. Predicting his ability to do well in college by the performance of his immature 14-year-old self, or really his disconnected COVID lockdown year, isn't a great predictor of future success. His D25 sister started high school "college ready," but for some (maybe mostly boys??) what they're ready to do at 18 is just not reflected in a cumulative HS GPA.
And yet this is a data source that's likely unmatched elsewhere. I'd suggest it's worth looking at the methodology carefully (as with any study) to identify uncontrolled sources of bias.
I'm finding, on a quick read, no such obvious biases, and suggest it's worth a deeper look. I can definitely see a number of additional questions they might have addressed, e.g., around the gender-subgroup effects suggested by another recent poster. But at first glance, it's worth reading.
Gee, I wonder if a study on SAT scores commissioned and published by the College Board could maybe, just perhaps, be self-serving?
But even aside from cynicism, there is recent scholarly literature out there that questions the basic claim that standardized tests are quite so predictive of college achievement as Westrick et al. claim. I've been looking at a bunch of dissertations and theses recently for something only marginally related to this discussion, so I'll simply note these that are relevant:
- Bleemer, Zachary I. 2021. On the meritocratic allocation of higher education. Berkeley, California: University of California, Berkeley doctoral dissertation. (Conclusion: Entrance policies that ignore standardized test scores result in admitting students with lower scores, and those students are as successful at timely graduation as students admitted under entrance policies that include test scores.)
- Hallmark, Tyler Scott Lee. 2021. A longitudinal analysis of student retention using neighborhoods as socioeconomic proxies. Columbus, Ohio: Ohio State University doctoral dissertation. (Conclusion: Any effect of standardized testing on college success effectively just duplicates the effect of socioeconomic status.)
- Jennings, Anthony Edward. 2022. A psychometric examination of self-efficacy, college readiness, socioeconomic status, and standardized admissions testing: A structural equation model. Purchase, New York: Manhattanville College doctoral dissertation. (Conclusion: In a study that looked specifically at low socioeconomic status students, standardized test scores were not strongly correlated with academic success.)
- Sturm, Hanna M. 2022. The validity of the SAT and ACT. Arden Hills, Minnesota: Bethel University master's thesis. (Conclusion: High school factors predict college success better than standardized tests.)
I will, though, readily admit that there is one situation where standardized test scores are decently predictive: Scores on high-stakes exams with similar content. Basically, SAT and ACT test scores are predictive of scores on exams that cover similar sorts of material in similar ways. But that's kind of trivial, you know? Not to mention that it simply says that standardized tests let you know how well students will be able to perform on standardized tests, and nothing about success in college beyond that narrow slice.
My S23, who admittedly has been against standardized tests since elementary school, says that the only correlation between scores is family wealth. I have no idea if this is true, but a couple of the dissertations you list seem to point to that.
Except that it is not.
If you actually look at the data, they show that GPA is a (very) marginally better predictor of performance than the SAT. But the two put together are a much stronger predictor. That has been my argument from the get-go, and one that you seem to have missed. I also posted about how a typical student prepares for and performs on tests. Most logical people will conclude that performing adequately on standardized tests should not be that much of an ask. There will be outliers for sure, but that is true of everything else too.
It is easy to fit a narrative to your ideology.
Works every time.
Why don't you state why the data are bad or unreliable?
I don't actually have an ideology about GPA vs standardized testing. I am not a big fan of test optional policies that force each student to agonize about whether to submit scores… I would personally prefer to see schools either require scores for all applicants, or go test blind for all applicants, based on what they feel works best for their own admissions.
It is simply that a for-profit company is not the first place I would look for the most objective research on the value of that company's product. If the research turned out to be very negative, they simply wouldn't publish it. It's great for them to do research to improve the effectiveness of their products, though.
This conversation can go round and round until we're all dead, and those who believe there is no correlation (or better correlation with GPAs) will continue to think this, and those who believe standardized test scores are the best indicator will continue to think that.
I donât think the studies are going to change anyoneâs minds.
Can we move on?
Kid had his last homecoming over the weekend.
His musical is in two weeks.
He is going on a Model U.N. gathering at Brown the weekend after that.
Midterms after that.
Itâs all going very very fast, at least for me.
Iâd like it all to slow down, please, thanks very much.
Also! If you have a graduating college kid, you should have already booked your hotel. Just a PSA.
The bulk of the research disagrees with Westrick et al.'s finding that standardized test scores add meaningfully to predictions of college success, and that "bulk of" is important, since disagreement among studies is to be expected, and so you have to go with the bulk of the evidence.
But, as @Gatormama notes, this is a perennial debate and not one that's going to be conclusively solved on a CC forum. So yeah, I'm good with dropping the subject here for now.
Ha, we booked our May 2023 hotel rooms as soon as Marriott opened reservations at the beginning of the summer! Not only to ensure we had a room, but also to lock in the cheaper rate. We are paying $170/night while the current rate at same hotel for that weekend is $368, more than twice what we are paying.
[quote="dfbdfb, post:5263, topic:2041251"]
and that "bulk of" is important, since disagreement among studies is to be expected, and so you have to go with the bulk of the evidence.
[/quote]Indeed, that's why systematic literature reviews and meta-analyses are more compelling than assertions based on a few non-peer-reviewed studies. They are out there, and they suggest something very much aligned with Westrick et al.
In any case, back to 2023, or forward rather! The reminder about reservations for college graduations makes me wonder about our HS graduations. What's the plan there? We're thinking low-key, but we'll still want to snag a dinner reservation ASAP. Thankfully no hotel rez required!
@Gatormama - Thank you for that (just booked my college grad hotel room), and thank you for getting us back to our previously scheduled programming.
Wrapping up the last 2 supplemental essays by tomorrow, and that should wrap up the EA applications. Then off to the RD apps!!! Not sure how many he'll submit, as I can feel the momentum fading!
Agree about moving on, PLEASE. The kid has submitted the ED and one EA app, three more to go with mid-November deadlines. Then of course the long list of RD schools! He is spending several hours per day on the robotics team, where he is captain, and the homework load hasn't let up… yikes. But he is doing some social stuff too, and of course the Dungeons and Dragons! Can't wait for more apps to be behind us. We added USC to the RD list at the last minute; don't think we will be able to physically visit before applying. Wish we had thought of it before and applied EA, as it is also a non-binding EA, but too late now… It will be hard to motivate the kid to work on RD apps while waiting for the ED decision, but the few weeks between the ED decision and the RD deadlines just aren't enough to put them off until then…