CSM: College presidents plan 'U.S. News' rankings boycott

<p>
[quote]
You will find out that 80-90% of the kids like their school and would go again. big whoop. People who have only been to McD's think the food is good too. Most community colleges will even come out with good reviews.

[/quote]
</p>

<p>Yep. I think that is exactly what would happen if you tried to do a "ranking" based entirely on student satisfaction surveys. It's useful information, but it also lacks a consistent frame of reference.</p>

<p>barrons,
The disdain you express for the stakeholders is pretty shocking, pretty sad and probably pretty common among those who defend the academic set. For the record, I'd like to remind you that the stakeholders are the ones who pay the bills and they are the ones who actually have to produce something upon graduation. If a family is going to shell out $30-50k a year for a college education, I think they deserve a little more respect...and a lot more accountability.</p>

<p>interesteddad,
I agree that student surveys lack a consistent frame, but the NSSE data that I have seen was pretty interesting (if not widely enough surveyed for use in a USNWR format) and could perhaps serve as a beginning point. And I certainly don't think that a ranking should be done "based entirely on student satisfaction surveys." But I do think that students could play a role as could alumni and recruiters. I (and I believe that the vast majority of business people would agree) have a lot more confidence in their opinions than in the ill-defined Peer Assessment that some point to as an accurate measure of faculty quality.</p>

<p>Colleges are insane to try to kill the US News rankings, which are helping to keep most of them in business. </p>

<p>Objective measures are anathema to most schools, because they tend to deflate the carefully cultivated "brand image". Not that many schools have much to boast about in the way of patent revenues, citation indices, research grants, graduates' salaries after 1 and 10 years (by major), outgoing GRE percentiles minus incoming SAT percentiles, debt burden of graduates, number of paid professor-hours per course per week, or square footage per student in dormitories.</p>

<p>The trend toward more rankings, more objective rankings, and more nuanced rankings is unstoppable. Academia produces hundreds of job-starved statisticians, and more data are available every year. Costs cannot rise forever without numerical justification of the absolute and relative benefits. Private colleges should be celebrating that the mushy US News ranking exists, with its surveys of "prestige" among college presidents; the alternative would push most of the clientele into el cheapo state schools.</p>

<p>barrons,
Another thought about your comment above,</p>

<p>"80-90% of the kids like their school and would go again. big whoop. People who have only been to McD's think the food is good too."</p>

<p>You mean like status quo academics not visiting beyond their own personally selected universe? How many Presidents/Provosts/Deans of Admissions at schools in the Northeast have, in the last twelve months, been to visit Rice or USC or U Washington or U Florida or Elon or Colorado College or Wake Forest or BYU or UCLA or UCSD or Emory or Baylor or Caltech or Tulane or....? Maybe we should keep a record of this so we can judge whether these opinions are based on current reality rather than on outdated historical brands fostered many moons ago. My strong guess is that the Presidents/Provosts/Deans of Admissions have been to hardly any of these schools, yet they are being asked to opine on the quality of what is going on at these universities today (not 25 years ago) and across a wide variety of academic disciplines. Frankly, I think several dozen posters on CC could do a more informed assessment. Any family that has recently been on its own school-hunting trip can see that there is a GREAT disconnect between the PA scores and what is really going on at many of the lower profile schools.</p>

<p>STOP WRITING SUCH LONG POSTS!! EXPLAIN WHAT YOU WANT IN A FEW SENTENCES....whew, sorry about that..</p>

<p>"My strong guess is that the Presidents/Provosts/Deans of Admissions haven't been to hardly any of these schools, yet they are being asked to opine on the quality of what is going on at these universities today (not 25 years ago) and across a wide variety of academic disciplines."</p>

<p>The above demonstrates a misunderstanding about the world of higher education. Do you really believe that professors and administrators don't interact with those from other universities? Talk to any university president, and you'll discover that s/he knows many other presidents, deans, and faculty members at a wide variety of schools. Professors meet at national - and even international - conferences and sustain those ties. Scientists from different universities collaborate. Why else do you think that the Nobel Prize in, say, Physics is often shared among researchers at different universities? Universities also accept graduate students from a variety of schools; again, they know which colleges best prepare their students for advanced work. The idea that faculty members and academic administrators work in isolation is preposterous. </p>

<p>College presidents are perfectly aware of the quality of students graduating from many universities, locally, nationally, and even internationally.</p>

<p>I find it ironic that the groups in society who most frequently employ rather arbitrary tests of time-bound and measurable results are the very ones who most object to such tests being applied to them. What do schools do at all levels except test and apply rather arbitrary standards in the measurement of learning? Flip the table and they can tell you everything that is wrong with the categories and tests you are applying to them and why they don't measure anything of "real" significance. Bottom line: school administrators at all levels, teachers and teachers' unions, colleges and their professors are the biggest bunch of hypocrites in the world when they object to these rankings. Who is it that invented class rank, grades, standardized curricula that may or may not have meaningful content, SAT/ACT tests, etc.?</p>

<p>Nobody likes to be measured and compared to their peers - especially when they don't stack up all that well. Why are we surprised that these mostly second tier schools are complaining? They don't do well in peer assessments because nobody has ever heard of them. The fact that most of the schools complaining are LACs should not be surprising. By and large LACs produce little research so their profs don't get published and nobody in the academic world associates the name of the school with any research. That makes them academic non-entities, though they may be excellent teaching and learning environments.</p>


<p>So a "stakeholder" in higher education is someone who writes a check for someone's tuition payment? I guess I'm about to be a stakeholder, in that case. I'm already doing research about which colleges offer good value. If the taxpayers who support essentially ALL institutions of higher education are included as stakeholders, we are all stakeholders. (There are various colleges that are privately operated, but all but very few of those receive subsidies from taxpayer-funded programs.) </p>

<p>I haven't seen the president of Sarah Lawrence or any other LAC president mentioned in this thread propose a way of looking at colleges that meets stakeholder interests better than the U.S. News ratings. If they think such a view of colleges can be produced, they should produce it. </p>


<p>I am smiling in agreement at this description of why college ratings are not going to go away. Yes, literally millions of families consider the cost of college each year (with some concluding that they cannot afford any college), and thousands of colleges consider the issue of how to attract paying students. Too many "stakeholders" need information TODAY about what college choices offer good value to students, and what college marketing strategies promise continued survival for colleges, for the ratings issue to go away. </p>

<p>If someone has a better suggestion in this thread for comparing colleges than the federally gathered IPEDS data, which we can all compare on College Results Online </p>

<p><a href="http://www.collegeresults.org/%5B/url%5D"&gt;http://www.collegeresults.org/&lt;/a> </p>

<p>or the Common Data Set data, which most colleges display on their Web sites and which is the basis for the College Board college search feature </p>

<p><a href="http://apps.collegeboard.com/search/adv_typeofschool.jsp%5B/url%5D"&gt;http://apps.collegeboard.com/search/adv_typeofschool.jsp&lt;/a> </p>

<p>then I would be glad to consider the suggestion. But that a coalition of liberal arts colleges feels its ox is gored when high school students apply to privately operated or state operated research universities is hardly news. If those colleges sincerely believe that they offer better teaching and thus better quality, they should start doing the teaching now by showing all the stakeholders how they can compare colleges on the basis of teaching quality or whatever other desirable characteristics they claim to have.</p>

<p>hawkette:</p>

<p>There is nothing groundbreaking about the NSSE survey. From what I have seen, it is largely similar to the existing national survey instruments purchased by the members of the COFHE consortium and other college associations around the country.</p>

<p>
[quote]
But I do think that students could play a role as could alumni and recruiters. I (and I believe that the vast majority of business people would agree) have a lot more confidence in their opinions than in the ill-defined Peer Assessment that some point to as an accurate measure of faculty quality.

[/quote]
</p>

<p>You want business recruiters to rank colleges? Why just business? That makes no sense to me whatsoever. To be honest, I think the over-emphasis on business recruiting undermines the real purpose of a college education -- learning how to think and communicate. I'd rather stick with a less-than-ideal Peer Assessment survey. At least the people answering the questions have the first clue about colleges and universities.</p>

<p>If you want an outcomes-based component, it is not difficult. Per capita PhD, MD, and Law production. The PhD data is already collected and made public...and has been since the 1920s. The MD and Law School admissions organizations have the ability to collect that data. I probably would not include MBAs in this data simply because such a high percentage of MBAs are earned later in life and are unrelated to the undergrad college. I think advanced degree attainment is an essential piece of data for understanding the differences in colleges. It's one of the missing pieces in the USNEWS methodology.</p>
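<p>To show how simple such a per capita measure would be to compute, here is a minimal sketch. The school names and degree counts are hypothetical placeholders, not real data; the point is only that the arithmetic is trivial once the counts exist.</p>

<p>
[code]
# Minimal sketch of a per-capita advanced-degree metric.
# School names and counts are hypothetical placeholders, not real data.

schools = {
    # name: (bachelor's degrees awarded, graduates later earning a PhD)
    "College A": (350, 42),
    "College B": (1600, 95),
    "College C": (5200, 180),
}

def phds_per_100_grads(grads, phds):
    """PhDs earned per 100 bachelor's graduates."""
    return 100.0 * phds / grads

ranked = sorted(schools.items(),
                key=lambda item: phds_per_100_grads(*item[1]),
                reverse=True)

for name, (grads, phds) in ranked:
    print(f"{name}: {phds_per_100_grads(grads, phds):.1f} PhDs per 100 graduates")
[/code]
</p>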

<p>BTW, I would also advocate ranking Public Universities separately from private universities just as LACs are broken out separately.</p>

<p>Interesteddad, do you have the MD numbers? Both per capita and total numbers?</p>

<p>I want to be a doctor. I'm choosing between Smith and Swat. Are my chances of being a doctor different depending on the school I choose? How about Earlham and Swat? And if my chances are different what are the causes and effects?</p>

<p>I know you love percentages. Because I can be sloppy, I have Tuesday's Marin IJ newspaper on the floor next to me. The headline is "Mortgage defaults in Marin jump 55%". That looks like a very big number, a 55% increase. But when you really look at the numbers, we are talking about 118 defaults out of 100,000 households.
If the paper had used raw numbers instead of percentages, the whole meaning would have been different. Percentages can be misleading. </p>
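<p>For what it's worth, here is the rough arithmetic behind that headline written out as a small sketch. The 118 defaults and 100,000 households come from the post above; the prior-period count is back-calculated from the 55% figure and is only approximate.</p>

<p>
[code]
# Rough arithmetic behind the "defaults jump 55%" headline.
# 118 defaults and 100,000 households come from the post above;
# the prior-period count is back-calculated and approximate.

households = 100_000
defaults_now = 118
pct_increase = 0.55

defaults_before = defaults_now / (1 + pct_increase)  # roughly 76

print(f"Before: ~{defaults_before:.0f} defaults "
      f"({defaults_before / households:.3%} of households)")
print(f"Now:     {defaults_now} defaults "
      f"({defaults_now / households:.3%} of households)")
# A 55% relative jump, but in absolute terms roughly 0.08% -> 0.12% of households.
[/code]
</p>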

<p>Works the same when examining schools. ;)</p>

<p>I guess if you compare similar size schools with similar student bodies and people with similar interests, the percentages may be helpful. </p>

<p>Are students that come from families in the top 10% of wealth or income more likely to become doctors, lawyers, or get their phds? How does a person's economic status affect their future outcomes?</p>

<p>When we look at numbers depicting student outcomes, are we looking at causation or correlation?</p>


<p>Momwaitingfornew, is there a difference between the "misunderstanding" of posters who have a different opinion about the validity of the peer assessment (or about the knowledge of the person who happens to, in fact, fill out the form) and your own idle speculation?</p>

<p>Aren't you continuing to ignore the precise words of the president behind the "revolt?" From the article quoted by the OP:</p>

<p>
[quote]
Several college presidents suggested that they personally could evaluate only five to 10 schools – a far cry from the hundreds on the list. "We know each other through reputation, but that's different than having the kind of intimate knowledge you should have when you are making a ranking," says Robert Weisbuch, president of Drew University in Madison, N.J., who plans to sign the letter.

[/quote]
</p>

<p>Isn't that precisely why the famous Thacker letter asks the "others" to stop answering the REPUTATIONAL survey?</p>

<p>momwaitingfornew,
Attending conferences is not an effective way to measure what is going on at a college. I find this analogous to attending a college fair and making judgments about colleges all over the country. Or a bunch of cardiologists attending the ACC meetings and making judgments about various hospitals all over the country with regard to their overall health and hospital services. Do you really think that a cardiologist attending from U Washington is going to be able to accurately understand and opine on the various medical departments at Duke Medical Center just because the cardiologist from U Washington and the cardiologist from Duke were in the same place? Of course not. And this is no more true for the U Washington undergrad professor being able to judge what is going on at Duke undergrad across all academic fields if they both attend the same conference in New Orleans. Attending a conference, whether it be for colleges or medicine or practically any other field, is not going to give you the insight and familiarity required to make a proper judgment. If you believe that it is, then you and I have very different standards for what qualifies as a useful opinion. </p>

<p>higherlead,
For your comment,</p>

<p>“By and large LACs produce little research so their profs don't get published and nobody in the academic world associates the name of the school with any research. That makes them academic non-entities, though they may be excellent teaching and learning environments.”</p>

<p>I completely agree. This is what causes the sometimes large and unjust disparities in Peer Assessment scores. It is a rigged game set up to perpetuate the status quo. As a student, I would care greatly about “excellent teaching and learning environments.” The part I can’t understand is why the Peer Assessment does not appear to be concerned with, and to reward, the same thing. </p>

<p>tokenadult,
Take the stakeholder comment and apply it to any product you buy. Is it even conceivable that the consumer is not consulted in giving an opinion on virtually any product they buy: electronics, automobiles, apparel, any type of service, etc.? In every case, consumers have a large say in the evaluation of the product. Why should it be any different for a college education? Students and families spend large amounts of money. Alumni want to see the value of their degree boosted over time. Recruiters want to get students who are well prepared. These folks have every bit as much interest, if not more, in the quality of the education and the output as do faculty members. </p>

<p>Interesteddad,
I agree that the NSSE data is not groundbreaking. It is also not included in the USNWR survey. My point is that it, or something like it that reflects the input of students, would be very valuable to me if I were searching for a school and really wanted to understand what consumers thought of the product. Right now, this perspective is missing from the reputational score of a college. The only opinions given are those of academics (but we don’t know who they are, what they said, or why they said it). </p>

<p>As for who should rank colleges, why not businesses? I don’t know the numbers offhand, but the vast majority of college graduates go into the for-profit world. Understanding how students think their school prepared them, and understanding how businesses perceive how well prepared the students are, would be of immense value. If you are a student and go to a college whose faculty couldn’t give a hoot about you, was focused on research, and did little to prepare you for the real world, then this should be reflected somewhere. </p>

<p>For those who are interested in graduate study (which is an important, but still relatively low, percentage of the graduating universe), I agree that this is a worthwhile consideration and believe that it should be included in output measurements. I also agree that the MBA numbers have low value because of the time lag between college graduation and MBA matriculation. This same trend is happening to a lesser degree at Law schools and even some Medical schools.</p>

<p><a href="http://www.collegeresults.org/search1a.aspx?InstitutionID=195304%5B/url%5D"&gt;http://www.collegeresults.org/search1a.aspx?InstitutionID=195304&lt;/a&gt;&lt;/p>

<p>I'd like to see this study rerun with a bigger data set. </p>

<p><a href="http://www.economics.harvard.edu/faculty/hoxby/papers/revealedprefranking.pdf%5B/url%5D"&gt;http://www.economics.harvard.edu/faculty/hoxby/papers/revealedprefranking.pdf&lt;/a> </p>

<p>I am curious about the current opinions of stakeholders.</p>

<p>
[quote]
Are my chances of being a doctor different depending on the school I choose?

[/quote]
</p>

<p>IMO, no college statistics (in any category) predict individual results. Students who think they do are making a huge error in logic. The fact that 100% of Swarthmore seniors applying to med school got accepted last year doesn't mean that you will be next year. </p>

<p>I would not say that a random freshman entering Swarthmore has higher odds of going to medical school than a random freshman entering Smith. Heck, we don't even know if those individual students can pass orgo. The value in looking at, for example, med school placement rates of various schools is that those rates contribute to an understanding of the nature of that school in the same way that a college producing a lot of auto mechanics would tell us something about the nature of the school.</p>

<p>
[quote]
When we look at numbers depicting student outcomes, are we looking at causation or correlation?

[/quote]
</p>

<p>The two are inseparable. When you choose a college, you are buying the quality the school provides and the quality of the peer group. In other words, having "better" students is inextricably a part of being a "better" school.</p>

<p>BTW, your small number argument on percentages is valid...when the numbers are small. However, the number of college grads going on to get PhDs is not small. At more than 100 colleges and universities, one out of every twenty graduates goes on to get a PhD; at more than 1,000 colleges and universities, it's one out of every one hundred.</p>

<p>
[quote]
I also agree that the MBA numbers have low value because of the time lag between college graduation and MBA matriculation. This same trend is happening to a lesser degree at Law schools and even some Medical schools.

[/quote]
</p>

<p>It's not just the time lag. It's the high percentage of MBAs nationally that are essentially mid-career continuing education degrees. Look at the huge numbers of MBAs from schools like Georgia State in Atlanta. I just think that MBAs would be an extremely difficult statistic to track.</p>

<p>One out of twenty graduates got PhDs at more than 100 schools? That does sound like a high number to me.</p>

<p>"The two are inseparable. When you choose a college, you are buying the quality the school provides and the quality of the peer group. In other words, having "better" students is inextricably a part of being a "better" school."</p>

<p>I'm going to disagree on this. Causation and correlation aren't inseparable. </p>

<p>Two schools. One has 200 top students. The other school has 200 top students and 200 above average students. I have a choice of schools and I am going to be one of the top 200 students. Is the former school better than the latter?</p>

<p>There is this idea that students who are not as smart as "me" bring the educational level down. </p>

<p>I have a friend who was a straight-A student at UCLA and then was the number 1 student at her law school. I asked her this question and she laughed.</p>

<p>"I would not say that a random freshman entering Swarthmore has higher odds of going to medical school than a random freshman entering Smith. Heck, we don't even know if those individual students can pass orgo. The value in looking at, for example, med school placement rates of various schools is that those rates contribute to an understanding of the nature of that school in the same way that a college producing a lot of auto mechanics would tell us something about the nature of the school."</p>

<p>I want to be careful here. :) I agree with you. My problems come in when it's assumed that because one place is different from another, one has to be better than the other. And that we can rank these differences and come up with a very precise ranking of which schools are better than others. A one-shoe-fits-everyone kind of ranking. Then we can take this ranking and say if you go to a higher ranked school, you go to a better school. And if you go to a better school, you get a better education than somebody who goes to a lower ranked school. You are better educated now and somehow more desirable to graduate programs, professional programs, the workforce, and the opposite sex. </p>

<p>For some people SWAT might be one of the best schools, for others Princeton, for others UVA, or U of Miami, or...</p>

<p>A ranking of all the schools for all students (1, 2, 3, ... 123) is ridiculous.</p>

<p>By the way, we both know Reed's ranking is ludicrous. ;)</p>

<p>
[quote]
Two schools. One has 200 top students. The other school has 200 top students and 200 above average students. I have a choice of schools and I am going to be one of the top 200 students. Is the former school better than the latter?

[/quote]
</p>

<p>That depends on how much the dilution of resources to accommodate the doubled enrollment undermines the quality of the undergrad education for all. For example, if the student/faculty ratio is doubled to accommodate the additional enrollment, then your "top" 200 students would probably (on average) be better off elsewhere.</p>

<p>By the way, note that I was careful to use quotation marks for "better" school. I do not believe in absolute rankings of schools whatsoever beyond obvious broad tiers.</p>

<p>"By the way, note that I was careful to use quotation marks for "better" school. I do not believe in absolute rankings of schools whatsoever beyond obvious broad tiers."</p>

<p>I know.</p>

<p>If there were a way to utilize the "Revealed Preferences" data that tokenadult links above, that might also be a useful piece of data for a ranking system. Students voting with their feet give a pretty clear indication of how they feel about these schools PRIOR to matriculation. It'd be interesting to see how those opinions compare to the opinions of current students or graduating students, and also to the opinions of employers and graduate schools.</p>
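<p>Just to make the "voting with their feet" idea concrete, here is a simplified sketch of how cross-admit choices could be tallied into head-to-head win rates. This is not the statistical model used in the Hoxby et al. paper linked above (which is considerably more sophisticated); the school names and choice records are hypothetical.</p>

<p>
[code]
# Simplified "revealed preference" tally from cross-admit decisions.
# NOT the model in the Hoxby et al. paper; names and records are hypothetical.

from collections import defaultdict

# Each record: (schools the student was admitted to, school the student chose)
cross_admits = [
    ({"School X", "School Y"}, "School X"),
    ({"School X", "School Y", "School Z"}, "School Y"),
    ({"School Y", "School Z"}, "School Y"),
    ({"School X", "School Z"}, "School X"),
]

wins = defaultdict(int)      # head-to-head matchups the school "won" (was chosen)
contests = defaultdict(int)  # head-to-head matchups the school appeared in

for admitted, chosen in cross_admits:
    for other in admitted - {chosen}:
        wins[chosen] += 1
        contests[chosen] += 1
        contests[other] += 1

win_rate = {s: wins[s] / contests[s] for s in contests}
for school, rate in sorted(win_rate.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{school}: chosen in {rate:.0%} of cross-admit matchups")
[/code]
</p>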