In my circle - a small data set - most legacy ED deferrals turn into rejections or WL. It’s sort of a courtesy rejection and probably leaves families less angry than an out-and-out rejection. A soft landing on a hard spot!!
I do know of one major donor legacy who was deferred ED and told that the school really needed to see how senior year was going before a decision could be made. The school emphasized to the CC that the deferral was not a courtesy rejection. Not the strongest student, but they did get admitted RD.
If a deferred legacy pivots to another school for ED2 and is accepted, they need to withdraw their application from the legacy ED1 school. As a result, they aren’t part of yield, which by definition is the % of accepted students who attend. I agree, though, that overall this group probably yields a bit better than the rest of the pool.
Just to clarify, I am assuming that if some applicants who would have gone deferral-to-admit instead go deferral-to-withdrawal due to ED II, then the college in question will need to make up that enrollment out of RD. And that will typically require more RD admits than deferral admits, assuming I am right that deferred applicants yield at a higher rate. And so the college’s yield would go down for the same target enrollment.
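Just to make that arithmetic concrete, here’s a quick back-of-envelope sketch of the logic. The 90% and 40% yield rates below are purely hypothetical numbers, chosen only to illustrate the direction of the effect:

```python
# Hypothetical yield rates -- illustrative only, not real admissions data.
DEFERRAL_ADMIT_YIELD = 0.90  # deferred ED applicants who, once admitted, enroll
RD_ADMIT_YIELD = 0.40        # regular-decision admits who enroll

def admits_needed(enrollment_target: float, yield_rate: float) -> float:
    """Admits required to hit an enrollment target at a given yield rate."""
    return enrollment_target / yield_rate

# Suppose 10 seats that used to be filled by deferral-to-admit students
# must now be filled from the RD pool instead.
seats = 10
deferral_admits = admits_needed(seats, DEFERRAL_ADMIT_YIELD)  # ~11.1 admits
rd_admits = admits_needed(seats, RD_ADMIT_YIELD)              # 25 admits

print(f"Deferral route: {deferral_admits:.1f} admits for {seats} seats")
print(f"RD route:       {rd_admits:.1f} admits for {seats} seats")
# More admits for the same enrollment means overall yield
# (enrolled / admitted) goes down, as described above.
```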
But that’s a lot of assumptions. Indeed, theoretically a college could respond by simply admitting more people ED/REA/SCEA in the first place, and then if anything yield could actually go up for the same target enrollment.
Admissions’ #1 goal is to get the best class they can, not to optimize the % yield. So the question is whether that legacy kid is the best for the class. If they have to admit 4 “better” kids to yield one, that’s better than admitting the one “lesser” kid who’s a sure thing.
Absolutely. I was just speculating on the descriptive question of whether the rise of ED II has affected deferral statistics. My personal guess is not much, but it would be interesting to know.