applicants as "pinballs in a machine" & admissions deans "on hamster wheels"

<p>Interesting imagery in an article in today's Chronicle of Higher Education: "The Power and Peril of Admissions Data: Colleges collect more information about potential students than ever before, but some deans worry about how the process is changing."</p>

<p>
[quote]
"We are the office of revenue and reputation," says Daniel M. Lundquist, vice president for admissions, financial aid, and communications at Union College, in Schenectady, N.Y. "When I started doing this, you could do it with a baseball bat. Now you need forceps. ... The stakes are astronomically higher."</p>

<p>Statistical models like Baylor's work by using the behavior of past applicants to predict how future ones will act. Those data are supposed to help admissions officials meet their enrollment goals, yet many officials say the scramble to collect more and more numbers has placed them on a hamster wheel that never stops spinning.

[/quote]
</p>
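
<p>Before the next excerpt, a quick aside on that line about models "using the behavior of past applicants to predict how future ones will act." Concretely, that usually means fitting something like a logistic regression on prior applicants, labeled by whether they enrolled, and then scoring the current pool. The sketch below is entirely my own illustration, not anything described in the article; the features, numbers, and choice of model are all invented.</p>

[code]
# Toy yield-prediction sketch. Hypothetical features and data, not Baylor's or Union's model.
from sklearn.linear_model import LogisticRegression

# Columns: [SAT score / 100, campus visit (0/1), distance from campus in hundreds of miles]
past_applicants = [
    [13.0, 1, 0.5],
    [14.5, 0, 6.0],
    [12.0, 1, 1.0],
    [15.0, 0, 9.0],
    [13.5, 1, 2.0],
    [14.0, 0, 7.5],
]
enrolled = [1, 0, 1, 0, 1, 0]  # 1 = accepted the offer, 0 = went elsewhere

model = LogisticRegression(max_iter=1000).fit(past_applicants, enrolled)

# Score this year's admitted students: probability that each one enrolls.
this_years_admits = [
    [13.8, 1, 1.5],
    [14.9, 0, 8.0],
]
yield_probabilities = model.predict_proba(this_years_admits)[:, 1]

print("Predicted yield per admit:", yield_probabilities)
print("Expected head count:", yield_probabilities.sum())
[/code]

<p>A real enrollment-management model would fold in far more (financial-aid need, likely net tuition revenue, intended major, and so on), but the expected head count is still just the sum of those predicted probabilities, which is essentially the number the article's deans are watching.</p>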

<p>and later on:</p>

<p>
[quote]
Only after officials have decided whether to admit or deny each student should they turn to the models to see what adjustments they need to make to the number and type of students they have accepted.</p>

<p>At Union College, Mr. Lundquist, the director of admissions, says he does just that. After a human being has read each application and debated its relative merit with the admissions committee, the predictive model's final calculations of such factors as who will accept an offer, or how much an applicant will contribute to tuition revenue, may determine whether the college accepts or rejects a particular student.</p>

<p>*"It's like balls running through a pinball machine," says Mr. Lundquist. "We put in our picks and run the model to see what the head count looks like, and if it spits out a number *that's higher than our financial-aid target, or has too many engineers, or whatever, we have to go back and take some people out and put others in."

[/quote]
</p>

<p>I italicized the phrases I found particularly striking. </p>

<p>Deans on hamster wheels forced to run applicants through pinball machines with forceps! That's quite a vivid visual image.</p>
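
<p>That "put in our picks, run the model, check the totals, swap people in and out" loop is easy to picture as code, too. Here is a rough sketch of just the bookkeeping step, with every name, probability, and threshold invented for illustration (the committee's human reading of each file is assumed to have already happened, as Lundquist describes):</p>

[code]
# Hypothetical check of a tentative admit list against enrollment targets.
# Each admit carries a predicted enrollment probability and an expected aid cost if they enroll.

tentative_admits = [
    {"name": "A", "p_enroll": 0.42, "aid_if_enrolled": 18000, "major": "engineering"},
    {"name": "B", "p_enroll": 0.77, "aid_if_enrolled": 5000,  "major": "history"},
    {"name": "C", "p_enroll": 0.31, "aid_if_enrolled": 22000, "major": "engineering"},
    {"name": "D", "p_enroll": 0.60, "aid_if_enrolled": 0,     "major": "biology"},
]

# Made-up institutional targets.
HEAD_COUNT_TARGET = 2.0
AID_BUDGET = 30000
MAX_EXPECTED_ENGINEERS = 1.0

expected_head_count = sum(a["p_enroll"] for a in tentative_admits)
expected_aid_spend = sum(a["p_enroll"] * a["aid_if_enrolled"] for a in tentative_admits)
expected_engineers = sum(a["p_enroll"] for a in tentative_admits if a["major"] == "engineering")

print(f"Expected head count: {expected_head_count:.2f} (target {HEAD_COUNT_TARGET})")
print(f"Expected aid spend:  ${expected_aid_spend:,.0f} (budget ${AID_BUDGET:,})")
print(f"Expected engineers:  {expected_engineers:.2f} (cap {MAX_EXPECTED_ENGINEERS})")

# If any total overshoots, the office "takes some people out and puts others in" and reruns the check.
if (expected_head_count > HEAD_COUNT_TARGET
        or expected_aid_spend > AID_BUDGET
        or expected_engineers > MAX_EXPECTED_ENGINEERS):
    print("Over target: go back, swap some admits, and run it again.")
[/code]

<p>Hence the pinball image: the final bounce in or out can hinge on aggregate targets rather than on the individual file the committee read.</p>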

<p>The entire article is available in the free section of the Chronicle of Higher Ed here:</p>

<p><a href="http://chronicle.com/free/v53/i08/08a04601.htm">http://chronicle.com/free/v53/i08/08a04601.htm</a></p>

<p>One more excerpt I found striking from the article above:</p>

<p>
[quote]
Although no admissions official interviewed for this article would admit to using statistical models to make admissions decisions, one who asked not to be identified said he was fired twice for refusing to employ such criteria. The official, who has worked for two highly selective colleges, is now employed at a less competitive one.</p>

<p>"The pressure from the rankings guides and the need to meet tuition-revenue targets drives a lot of manipulation of data in ways that are very ethically questionable," he says.

[/quote]
</p>

<p>Thanks for posting that link. The excerpt that I found most disturbing was this:</p>

<p>
[quote]
Although statistical models can increase a college's applications, the data those models provide can be biased. Mr. Munce, of the admissions-research center, says colleges that rely strictly on statistics to devise their recruiting strategies can inadvertently exclude underrepresented groups of applicants. Mr. Munce discovered that problem last month when he completed a study of the criteria 300 private colleges use to select names from his organization's database.</p>

<p>Although those colleges sought to increase their enrollment of minority students, the factors they used to find prospective applicants — including ZIP codes and students' stated career interests — had the opposite effect. Many of the institutions he studied were categorically excluding students from inner cities, even if they had 4.0 grade-point averages. The criteria the institutions were using generated recruitment lists on which minority students were included with 30 percent less frequency than white students.</p>

<hr>

<p>"We give them the data they ask for because they are ultimately the customer," says Mr. Munce. "But they are often using criteria that are too simplistic. They try to find kids who meet certain criteria without considering it from a race standpoint, and they don't use common sense."

[/quote]
</p>
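
<p>To make Mr. Munce's point concrete: the exclusion he describes is not a subtle modeling artifact, it is a plain filter. Here is an invented illustration (the ZIP codes, students, and criteria are all made up) of how a name-buy query keyed on ZIP code and stated career interest can drop a 4.0 student before anyone ever looks at a transcript:</p>

[code]
# Invented example of how ZIP-code-based search criteria can silently exclude strong students.

prospect_database = [
    {"name": "Student 1", "gpa": 4.0, "zip": "60621", "career_interest": "medicine"},     # inner-city ZIP
    {"name": "Student 2", "gpa": 3.4, "zip": "60093", "career_interest": "business"},     # suburban ZIP
    {"name": "Student 3", "gpa": 3.9, "zip": "60093", "career_interest": "engineering"},  # suburban ZIP
]

# A college's hypothetical name-buy criteria: only these ZIP codes, only these interests.
TARGET_ZIPS = {"60093", "60045"}
TARGET_INTERESTS = {"business", "engineering"}

recruitment_list = [
    p for p in prospect_database
    if p["zip"] in TARGET_ZIPS and p["career_interest"] in TARGET_INTERESTS
]

# Student 1, with the 4.0, never makes the list; GPA was never even consulted.
for p in recruitment_list:
    print(p["name"], p["gpa"])
[/code]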