Distinguishing Quality and Outcomes of Education from Selectivity/Incoming Stats

<p>At the most abstract level, people attend college to improve their post-college employment outcomes. How "outcomes" are measured depends on the goals of the individual - some look for the highest salary, others the prestige and/or impact of a job, still others the quality of life and satisfaction of their career. Finally, some measure outcomes by the quality of the professional or graduate school they are able to attend afterwards, which will ultimately be judged by the same criteria as above.</p>

<p>Certainly, college provides other long-term benefits as well - becoming an educated citizen, creating a group of lifetime friends, joining a post-graduate network, finding a life partner, and learning leadership and responsibility. I am, of course, leaving out ancillary benefits such as social life, athletics, etc.</p>

<p>Post after post on CC focuses on how selective schools are, or on the incoming stats of their students. These statistics are simply not adequate measures of the quality of education or, more importantly, of student outcomes. Although "rankings" of schools by these criteria correlate to some degree with outcomes (because of supply and demand), the correlation is certainly imperfect. For example, schools with great outcomes may be forced to be less selective than they could otherwise be because of some dominant factor. An example might be Rensselaer, which has an outstanding undergraduate engineering education (perhaps comparable with MIT, etc.) but is located in a less desirable city. Another example is the Seven Sisters, whose selectivity is somewhat lower than that of their LAC peers because they accept students from a more limited pool (women who would consider all-women's LACs). In addition, some schools may select students on other criteria (extracurricular leadership, for example) that may not correlate with the highest GPA/SAT scores.</p>

<p>I would be interested in other examples of schools with outstanding education and outcomes, but lowered selectivity - along with the reason(s) for that lowered selectivity.</p>

<p>Iowa has long been considered a top college for English majors.
Missouri, a top college for journalism.
Indiana, a strong university for business.
Kansas, strong in art.
Kalamazoo, a strong producer of kids who go on to PhDs.
Michigan Tech, engineering, overshadowed by UofM.
Michigan State, pre-vet.
Rose-Hulman, engineering.</p>

<p>If you accept one of the 1999 findings of Krueger and Dale, future outcomes are dependent on the individual student and not the college attended. The study used the Mellon Foundation’s “College and Beyond” database, which tracked 14,239 adults entering college in 1976. Of course, future outcomes were evaluated based on salary statistics, leaving out numerous subjective measures, including some enumerated in the OP. K&D did find that students from disadvantaged backgrounds see measurable benefits from attending colleges perceived as more prestigious.</p>

<p>I suspect that even those benefits noted in the OP (becoming an educated citizen, finding a life partner, creating a circle of lifetime friends, etc.) are mostly dependent on individual characteristics and traits.</p>

<p>Some people state, correctly, that the typical HYP grad is more “successful” in life than the typical state university grad. What these folks omit is that HYP are overwhelmingly populated by high-achieving individuals. What K&D found is that an individual accepted to HYP but choosing to attend a college like Penn State or Denison can expect little or no diminution in post-grad career success.</p>
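<p>K&D's point about inputs versus outputs is easy to see with a toy simulation (purely illustrative - the numbers, model, and variable names below are invented, not K&D's data or method): if more able students are both likelier to attend an elite college and likelier to earn more, a naive elite-vs-non-elite salary comparison shows a big gap even when the college itself adds nothing. Comparing only students of similar ability, a crude stand-in for controlling for inputs, shrinks the gap toward zero.</p>

```python
import math
import random

random.seed(0)

# Toy model: salary depends only on student "ability"; attending an
# elite college adds nothing. Ability also drives elite admission.
students = []
for _ in range(10_000):
    ability = random.gauss(0, 1)
    elite = random.random() < 1 / (1 + math.exp(-2 * ability))
    salary = 50 + 10 * ability + random.gauss(0, 5)  # no elite term
    students.append((ability, elite, salary))

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

# Naive comparison: elite grads look far more "successful".
naive_gap = (mean(s for a, e, s in students if e)
             - mean(s for a, e, s in students if not e))

# Compare only students of similar ability: the gap mostly vanishes.
band = [(a, e, s) for a, e, s in students if -0.25 < a < 0.25]
matched_gap = (mean(s for a, e, s in band if e)
               - mean(s for a, e, s in band if not e))

print(f"naive gap: {naive_gap:.1f}, ability-matched gap: {matched_gap:.1f}")
```

<p>Real studies like K&D control far more carefully (for example, by matching students on where they applied and were accepted, not just on test scores), but the direction of the selection bias is the same.</p>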

<p>originjaloog - very interesting study. It correlates with my personal observation that many business leaders, successful entrepreneurs, and CEOs of large organizations went to colleges that aren’t considered “elite.” They succeeded because of natural ability, drive, and good fortune. </p>

<p>In the professions and academic fields, I would guess this would be the same regarding SALARY. However, prestige and power seem to follow the elite schools more closely (witness the current US Supreme Court). The reason is a heavily trodden path (and a preference from those in power) of elite college -> elite law school -> elite clerkship -> elite position. The same holds in medicine, substituting medical school, residency, and academic position. The leaders in such fields have power and prestige, but are often less generously compensated than their peers who succeed in private practice, for example. Also, the drive required for that kind of success may interfere with some of those other factors, such as friends, family, and happiness.</p>

<p>Caroline Hoxby (formerly Harvard, now Stanford) has studied this issue with results that disagree with Krueger and Dale. </p>

<p>Other criticism of K&D stems from the measure they used: income. OTOH, K&D used the best information they had, since data for things like this (i.e. information on outcomes of kids who were accepted at elites and chose to go elsewhere - consider that elites typically don’t give out such information!) is very hard to come by.</p>

<p>Next, consider that K&D was published about 10 years ago, that its data is even older - studying kids accepted to elites, I’m guessing, about 30 years ago. Higher ed has changed dramatically since then, in many ways. Consider that for many families, the elites are now more affordable than public universities. Consider that higher ed is more stratified than it was in years past. And so forth. </p>

<p>Perhaps a better way to phrase the OP’s question is to ask whether anyone has done comparative outcomes research of the kind the OP asks about. I’m not aware of any, and I suspect there would be great difficulty getting universities to participate, since universities are very reluctant to allow any study that might make them look bad. An example: about 8 years ago I read a WSJ article discussing research being done on alcohol consumption and abuse on college campuses. Thinking this would be relevant to helping pick a college for my D, I contacted the author (a researcher at the Harvard School of Public Health) to find out if he had published any data for the specific campuses he studied. He told me he had not and could not. In order to get access to any campus, he had to promise confidentiality. Still have the emails. Here are quotes:</p>

<p>

</p>

<p>So there you have it. Universities are heavily into message control. This is an arena where universities have more to lose than gain, I fear.</p>

<p>Interestingly, Forbes has tackled this, along with the Center for College Affordability and Productivity. They have created a ranking that (in 2009) is based upon:</p>

<ol>
<li>Listings of Alumni in the 2008 edition of Who’s Who in America (12.5%)</li>
<li>Salaries of Alumni from PayScale.com (12.5%)</li>
<li>Student Evaluations from Ratemyprofessors.com (25%)</li>
<li>Four-Year Graduation Rates (16.66%)</li>
<li>Students Receiving Nationally Competitive Awards (8.33%)</li>
<li>Faculty Receiving Awards for Scholarship and Creative Pursuits (5%)</li>
<li>Four-year Debt Load for Typical Student Borrowers (20%)</li>
</ol>

<p>There is a lot of detail in the full Forbes methodology.</p>

<p>Most interestingly, you can “create your own” ranking by adjusting the weighting of each of these with Forbes’ <a href="http://www.forbes.com/2009/08/05/best-colleges-ranking-screener-opinions-colleges-09-tool.html">Do It Yourself</a> tool.</p>
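<p>A ranking like this is just a reweighted sum: each measure is normalized to a common scale, multiplied by its weight, and summed. A minimal sketch using the 2009 weights listed above (the per-college measure scores here are invented for illustration, and the measure names are my own labels):</p>

```python
# Forbes/CCAP 2009 weights from the list above (sum is ~100%).
WEIGHTS = {
    "whos_who_listings": 0.125,
    "payscale_salary": 0.125,
    "ratemyprofessors": 0.25,
    "four_year_grad_rate": 0.1666,
    "student_awards": 0.0833,
    "faculty_awards": 0.05,
    "debt_load": 0.20,  # scored so that lower debt -> higher score
}

def composite(scores, weights=WEIGHTS):
    """Weighted sum of measure scores, each normalized to 0-100."""
    return sum(weights[m] * scores[m] for m in weights)

# Invented scores for one hypothetical college.
college = {
    "whos_who_listings": 40, "payscale_salary": 75,
    "ratemyprofessors": 60, "four_year_grad_rate": 85,
    "student_awards": 30, "faculty_awards": 50, "debt_load": 70,
}

print(composite(college))  # default Forbes weighting

# "Create your own": shift weight from student reviews to salary.
my_weights = dict(WEIGHTS, ratemyprofessors=0.10, payscale_salary=0.275)
print(composite(college, my_weights))
```

<p>Because the final number is just a weighted sum, shifting weight between measures can reorder colleges substantially - which is exactly the point of the DIY tool.</p>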

<p>In <i>Crossing the Finish Line</i> (2009), Bowen et al. examined students at various large publics. They found that for equally qualified low-income students, attending a more selective public (like Berkeley, Virginia, or Michigan) made a large difference in graduation rates compared to attending a less selective public.</p>

<p>Bowen, W.G., Chingos, M.M., and McPherson, M.S.: <i>Crossing the Finish Line: Completing College at America’s Public Universities</i>. Princeton University Press, 2009.</p>

<blockquote>
<p>I would be interested in other examples of schools with outstanding education and outcomes, but lowered selectivity - along with the reason(s) for that lowered selectivity.</p>
</blockquote>

<p><a href="http://voices.washingtonpost.com/class-struggle/2009/07/harvard_schmarvard_a_small_col.html">Harvard Schmarvard: A Small College Shines</a></p>

<p>There was a rather intense study showing that the colleges attended by successful folks had little to do with their later income/status. It makes sense to me. A top student at Harvard or SUNY who does his work conscientiously through college will do well regardless. The range of alma maters among CEOs and other successful folks shows that.</p>

<p>What the study did not measure, and where I believe a top school could make a difference, is for those who are not top drawer. Those kids, in particular, can benefit from a name school: the quality of education, the intensity of the courses, and the support and peer pressure at those schools.</p>

<p>I have a friend who made it to an Ivy League school through a special program despite below-standard scores and preparation. She struggled through college, but her degree from that school has opened doors that otherwise would not have opened. She is not top-rate at her job, but good enough. Without the degree, she might not have been hired. What happens at the very top or bottom of a curve is not demonstrative of all situations.</p>

<p>St. John’s College: a small liberal arts school with two campuses, in Santa Fe and Annapolis. The selectivity is somewhere around 75%. It’s a very rigid, specialized curriculum that focuses on the classics, so I wouldn’t recommend it to everyone, but it is very well known among grad schools.</p>

<p>Kids who go to flagship state universities, which come with “reach for the stars” ceilings on the complexity of the academics and the opportunities, can do every bit as well as kids at very selective schools. Sometimes even better, as some smaller schools may have ceilings on how far they go in a subject. It’s what you make of it.</p>

<p>There is also a very high success rate at some specialty schools like SUNY Maritime, where the selectivity and test scores of the students are not high, but the success rate for those kids is really good in degrees they might not get elsewhere. Starting salaries at such schools are outstanding, higher than at most any college.</p>

<p>I started out at a (small city) Community College due to my circumstances. They were reputed to have a good 2 year engineering science curriculum. </p>

<p>When I transferred to the State U to finish, at our transfer orientation for kids in the engineering programs, they said, “Wherever you have been, it is going to be harder here.” Those of us from the same CC found this to be false, as the courses were no harder than the ones we had been in. But State U has a very high opinion of itself, and a low opinion of its CC cousins (i.e., the poor relations).</p>

<p>That said, the big city Community College (in the same university system and same city as State U) does tend to be more watered down, and transferring students get a shock when they are faced with the rigor of the State U.</p>

<p>pbleic (and others)</p>

<p>It is relatively easy to look at outputs. That’s what Forbes did. K&D and Hoxby attempted to control for inputs, though. That is much tougher.</p>

<p>One can also debate the relevance of Who’s Who listings, self-reported salaries from payscale.com, national honors, and the like for the average student. None of these are controlled, random, or properly sampled.</p>

<p>Anecdotal reports, output-only measures, national scholarships and honors, and related general “data” are interesting, but many would agree that these measures tell us more about the students enrolling in a college than about the college itself.</p>

<p>I’ll bow out of the discussion because I doubt anyone reading will be satisfied with the answer “we don’t know.” So debate on!</p>


<p>newmassdad, you are absolutely right. The Forbes rankings look at output measures only, and I would argue that some of their output measures are essentially meaningless.</p>

<p>pbl,
I think that this is really very simple. The people determine the outcomes, not the schools that they attend. </p>

<p>Quality In = Quality Out</p>

<p>If you want to be around lots of other, high achieving students, then selectivity is extremely important. But lots of folks coming from less heralded colleges have succeeded big-time. </p>

<p>As for “quality of education,” what do you mean by this? What do you think connotes a high-quality education?</p>


<p>But if you look at Pascarella & Terenzini’s meta-analysis of 30 years of higher ed outcomes research, they show that the best predictor of a graduate’s goals and aspirations is the college peer group. Grads tend to aspire up or down to the perceived norm of their peer group. That’s where selectivity has a huge impact.</p>

<p>Newmassdad - Get a hold of P&T’s latest edition of “How College Affects Students.” If it’s been studied in regard to college outcomes over the past 30 or so years, it’s summarized there.</p>

<p>BTW, NMD, regarding alcohol on campuses - I think you’d find less campus-to-campus variance than you think. The national average of underage students who drink is close to 60%; what some people perceive as a wet or dry campus culture is often just a matter of how public and obvious the drinking is.</p>

<p>Pbleic - My apologies for getting off the topic. I’d point to two colleges that have always impressed me - Carleton and Whitman. They seem to me to be two of the most outstanding undergrad experiences in America, but they aren’t as selective as I’d have guessed. I presume that it’s a matter of geography - being rural, way north, and in parts of the country that aren’t usually college student magnets. Williams is no less rural than either of them, but probably benefits from being in Massachusetts, which is very prominent on college applicant radar.</p>

<p>I don’t have access to that particular piece of work, but it sounds like you are only describing a correlative relationship and not necessarily a causative one.</p>

<p>^^^ No, that’s after accounting for differing input demographics of the populations that attend different institutions. And I’d guess that most graduates of selective institutions (or institutions with student bodies that reflect a particular peer orientation, such as engineering or military or single-gender) would have a good deal of anecdotal information to support the huge impact of the peer group.</p>

<p>noimagination: First, it is almost impossible to definitively prove causality in this circumstance, as one can’t do a controlled, randomized experiment. However, social science statistical analysis has come a long way in establishing statistical validity for inferring causality from multivariate data. I am not familiar with Pascarella and Terenzini’s work, but I would not dismiss it out of hand with the old argument that “correlation doesn’t prove causality.” Sometimes, it can come darn close, if done right.</p>

<p>That sounds reasonable. The term “predictor” set me off because that is often used to simply describe correlative relationships even if adequate controls are not in place.</p>