<p>
[quote]
influence the college decision making of thousands of high school students...
[/quote]
</p>
<p>sorry, but what little data there is indicates otherwise -- when surveyed, kids rank rankings as about the #10 criterion in choosing a college.</p>
<p>Then post on the other thread where the comments are. Also, if you don't like what I'm posting, maybe you could give me a little more guidance or insight into what you are criticizing. I think I have pretty consistently promoted the idea that there are a ton of good colleges around the country and that there are great students to be found in many, many places.</p>
<p>There are rankings that are all fact based--but people don't like those either.</p>
<p><a href="http://mup.asu.edu/research.html">http://mup.asu.edu/research.html</a></p>
<p><a href="http://ed.sjtu.edu.cn/ranking2006.htm">http://ed.sjtu.edu.cn/ranking2006.htm</a></p>
<p>Or the purely subjective/informed rankings</p>
<p>It also doesn't matter what I think. </p>
<p>Let me just say that I find the argument that you can get a great education at tons of schools including schools that aren't on the radar inconsistent with the arguments using "objective data" and "lists".</p>
<p>But it doesn't matter what I think. :)</p>
<p>The best ones (for the so-called prestige schools) reflect the actual perceived experience of the majority of students at each of 31 schools, namely, the COFHE Survey. </p>
<p>But the schools ain't talking (though there have been some leaks...<a href="http://www.boston.com/news/education/higher/articles/2005/03/29/student_life_at_harvard_lags_peer_schools_poll_finds?pg=full">http://www.boston.com/news/education/higher/articles/2005/03/29/student_life_at_harvard_lags_peer_schools_poll_finds?pg=full</a>)</p>
<p>
[quote]
The best ones (for the so-called prestige schools) reflect the actual perceived experience of the majority of students at each of 31 schools, namely, the COFHE Survey
[/quote]
</p>
<p>Yep . . . and it's interesting how poorly COFHE, HEDS, or NSSE data correlate with the "objective" rankings of overall educational quality in U. S. News.</p>
<p>
[quote]
There are rankings that are all fact based--but people don't like those either.
<a href="http://mup.asu.edu/research.html">http://mup.asu.edu/research.html</a>
<a href="http://ed.sjtu.edu.cn/ranking2006.htm">http://ed.sjtu.edu.cn/ranking2006.htm</a>
[/quote]
</p>
<p>And how relevant are those for UNDERGRADUATE schools? The focus of the first is on research universities, and the second ranks graduate programs. </p>
<p>And what is there to say about a ranking that would be based on the COFHE reports? The COFHE has only ONE objective: provide comparative data to the schools that belong to the group. It serves no purpose whatsoever for a usage that falls outside that small circle.</p>
<p>While it may be interesting -- mostly for entertainment value -- to read more about the results of the COFHE, one cannot ignore the methodology and EXTREME limitations of the COFHE. Even with the very small number of participants, there is ZERO uniformity in how the data is collected. Some schools bribe their students to answer the survey; others consider it a nuisance. Students think the survey is a joke and respond accordingly. </p>
<p>Mini knows that very well, but he has way too much fun quoting that single source of information (Boston.com) to spend some time researching the COFHE organization! :(</p>
<p>Actually, I think the various COFHE survey instruments are very useful to the schools that use them. However, they are not intended to form the basis for a college ranking system. The main purpose is for a school to learn more about its own campus. For example Swarthmore used its 2006 COFHE Senior Survey to look at how its students spend their time and specifically on the hours spent working in academics, relative to faculty expectations of workload. Interesting report that had nothing to do with any other COFHE school.</p>
<p>BTW, as long as we insist on a ranking system, I think we have to rely heavily on some measure of "conventional wisdom" such as the Peer Assessment fudge factor. Otherwise, your datapoints would be all over the map and you'd have a ranking system that makes no sense whatsoever. Think of PA as "fitting the curve" or plotting trendlines in an Excel chart.</p>
<p>If you wanted to do away with PA, you could replace it with Per Student Endowment and produce nearly the same ranking list. The problem? Universities would have to be honest in apportioning the percentage of their endowment (or spending) attributable to undergrads. The schools are not about to do that...they want to bury the undergrad stuff under a mountain of grad school and research spending, just like they use those tricks to jimmy their student/faculty numbers.</p>
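<p>As a rough illustration of the claim that two ranking criteria can "produce nearly the same list," a Spearman rank correlation is the standard way to compare two orderings. A minimal sketch, using entirely invented figures (no real school data):</p>

```python
# Hypothetical check: if Per Student Endowment were swapped in for PA,
# how similar would the resulting list be? Spearman rank correlation
# on invented figures -- none of these numbers are real school data.

def ranks(values):
    """1-based ranks, highest value = rank 1 (no ties assumed)."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

# Published rank position (1 = best) and endowment per student ($ thousands)
published_rank = [1, 2, 3, 4, 5, 6]
endow_per_student = [2100, 1800, 1900, 900, 750, 400]

endow_rank = ranks(endow_per_student)   # -> [1, 3, 2, 4, 5, 6]
n = len(published_rank)
d2 = sum((a - b) ** 2 for a, b in zip(published_rank, endow_rank))
rho = 1 - 6 * d2 / (n * (n * n - 1))    # Spearman's rho
print(round(rho, 3))                    # rho near 1 = nearly the same list
```

<p>A rho close to 1 would mean the two orderings are nearly identical, which is the scenario described above.</p>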
<p>Anyone reading mini's Boston Globe link is probably struck by the Harvard dean talking about the 11-1 student/faculty ratio. Hmmm....there's a little discrepancy with the 7-1 figure they report to USNEWS. I guess I wouldn't publish my Common Data Set either.</p>
<p>USNEWS lets you count professors who have a full-time release from teaching for their research. The Harvard dean was probably talking about the real ratio -- of professors who actually teach courses.</p>
<p>barrons,
For the “objective” surveys that you reference above, it would be nice if these dealt with facts that were actually relevant. In looking at the underlying methodology of each survey, one quickly and easily sees that they measure many factors that are irrelevant to the vast majority of college students. </p>
<p>The first survey, produced by The Center for Measuring University Performance, has nine criteria that it looks at. It is interesting to note that of these nine criteria, two are also considered by USNWR and are often criticized by partisans of large state public universities. </p>
<ol>
<li>Total Research</li>
<li>Federal Research</li>
<li>National Academy Members</li>
<li>Faculty Awards</li>
<li>Doctorates Granted</li>
<li>Postdoctoral Appointees</li>
<li>Endowment Assets</li>
<li>Annual Giving (also part of the USNWR ranking system)</li>
<li>Median SAT Scores (also part of the USNWR ranking system)</li>
</ol>
<p>I believe that each factor is given equal weight. For a student attending a college that is not research-driven, many, even most, of these factors have very little relevance. And for the student who does attend a research-driven university, but decides to study English or history or a language or something in a non-technical area, what relevance does this data have to their experience? </p>
<p>The second survey, The Academic Ranking of World Universities published by the Institute of Higher Education, Shanghai Jiao Tong University, has six factors. They are </p>
<p>10% The total number of alumni of an institution winning Nobel Prizes and Fields Medals</p>
<p>20% The total number of the staff of an institution winning Nobel prizes in physics, chemistry, medicine and economics and Fields Medal in Mathematics.</p>
<p>20% The number of highly cited researchers in broad subject categories in life sciences, medicine, physical sciences, engineering and social sciences</p>
<p>20% The number of articles published in Nature and Science between 2001 and 2005</p>
<p>20% Total number of articles indexed in Science Citation Index-expanded and Social Science Citation Index in 2005</p>
<p>10% The weighted scores of the above five indicators divided by the number of full-time equivalent academic staff</p>
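<p>The weighting scheme above is just a weighted sum of normalized indicator scores. A minimal sketch of that arithmetic, with a single made-up school (the indicator values are invented purely for illustration, not taken from the actual survey):</p>

```python
# Sketch of the weighted composite described above. In the real
# methodology each indicator is scaled 0-100 before weighting; the
# school and its scores here are entirely made up for illustration.

WEIGHTS = {
    "alumni_awards":  0.10,  # alumni winning Nobel Prizes / Fields Medals
    "staff_awards":   0.20,  # staff winning Nobel Prizes / Fields Medals
    "highly_cited":   0.20,  # highly cited researchers
    "nature_science": 0.20,  # articles in Nature and Science
    "citation_index": 0.20,  # articles in SCI-expanded and SSCI
    "per_capita":     0.10,  # weighted score of the above per FTE staff
}

def composite(scores):
    """Weighted sum of the six indicator scores (each 0-100)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

school = {"alumni_awards": 40, "staff_awards": 60, "highly_cited": 70,
          "nature_science": 55, "citation_index": 80, "per_capita": 50}
print(round(composite(school), 1))
```

<p>Note how 60% of the composite rides on awards and citation counts -- which is exactly why it says so little about the undergraduate classroom.</p>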
<p>Also, as Xiggi points out, this survey is not an undergraduate assessment. </p>
<p>For a student who wants to pursue a career in academia, these criteria may make sense. They might also have some application to the science or technically oriented student. But to the vast majority of the rest of the students, the above is pretty close to meaningless. </p>
<p>interesteddad,
I have really enjoyed your posts and I have learned a lot from your writings, but I don’t understand your seeming acceptance above of PA scores as they are so clearly flawed. Do you really think the USNWR ex-PA rankings reach incorrect conclusions? Can you give me and others a little more insight into your thinking on this and what are some examples of misrankings that would result from ex-PA scoring?</p>
<p>"Actually, I think the various COFHE survey instruments are very useful to the schools that use them. However, they are not intended to form the basis for a college ranking system."</p>
<p>It is the most accurate peer assessment tool out there, as it has each student (more than 50% at each school) assessing the quality of their own school. And it WAS intended as a college ranking system, as each of the schools does it to see how they are doing relative to their peers. Otherwise the scores on measurement of perception of academic quality and quality of campus life are absolutely meaningless, except in looking at 5-year changes.</p>
<p>There are sub-purposes of course that each school makes use of (such as student self-assessment - I have heard, but cannot confirm, for example, that my d's school scored highest of all 31 on self-assessment of writing skill, and lowest, or next to lowest, on mathematical self-assessment, which has caused much discussion about faculty.)</p>
<p>"Even with the very small number of participants, there is ZERO uniformity in how the data is collected. Some schools bribe their students to answer the survey; others consider it a nuisance. Students think the survey is a joke and respond accordingly."</p>
<p>More than 50% participation at every school surveyed, and like all surveys, there is absolutely no reason to assume that the level of seriousness is different at one school as opposed to another. And that's why the schools believe in the reliability of the comparative data (as you yourself wrote below).</p>
<p>But, unlike ID, you did get it right:</p>
<p>"The COFHE has only ONE objective: provide comparative data to the schools that belong to the group. It serves no purpose whatsoever for a usage that falls outside that small circle."</p>
<p>
[quote]
I have really enjoyed your posts and I have learned a lot from your writings, but I don’t understand your seeming acceptance above of PA scores as they are so clearly flawed. Do you really think the USNWR ex-PA rankings reach incorrect conclusions? Can you give me and others a little more insight into your thinking on this and what are some examples of misrankings that would result from ex-PA scoring?
[/quote]
</p>
<p>I'm not saying that I agree with every last detail in the Peer Assessment or that it isn't flawed. I'm just saying that without some measure of "conventional wisdom", any numerically driven ranking would likely make little or no sense vis-a-vis "conventional wisdom". You get too many flukes like a conservatory with 50 students topping the per student endowment lists or some TV preacher seminary topping the doctorate production lists because of all the mail-order divinity degrees. I see these anomalies every time I do a statistical "ranking". Any meaningful consumer ranking has to have some method of filtering anomalies to produce the "desired" result. That's what the PA does, for better or worse.</p>
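<p>One way to picture that "filtering" role: blend the raw per-student statistic with a conventional-wisdom score, so a tiny outlier can't top the list on one fluke number. A minimal sketch -- every school name, figure, and the 70/30 blend weight below is invented for illustration:</p>

```python
# Sketch of PA as an anomaly filter, per the post above: blend a raw
# per-student statistic with a conventional-wisdom (PA-style) score.
# All schools, numbers, and weights here are hypothetical.

schools = [
    # (name, endowment per student in $ thousands, PA-style score 1-5)
    ("Big Research U",        950, 4.8),
    ("Well-Known LAC",        800, 4.5),
    ("Tiny Conservatory",    3000, 2.9),  # the fluke: 50 students, huge endowment
    ("Solid State Flagship",  300, 4.0),
]

max_endow = max(e for _, e, _ in schools)

def blended(endow, pa, pa_weight=0.7):
    """Weighted average of normalized endowment (0-1) and normalized PA (0-1)."""
    return (1 - pa_weight) * (endow / max_endow) + pa_weight * (pa / 5)

# On raw endowment alone, the conservatory tops the list...
raw_first = max(schools, key=lambda s: s[1])[0]

# ...but the blended score pulls the order back toward conventional wisdom.
ranked = [name for name, e, pa in
          sorted(schools, key=lambda s: blended(s[1], s[2]), reverse=True)]
print(raw_first, "->", ranked)
```

<p>The choice of blend weight is exactly the "for better or worse" part: it decides how hard the conventional wisdom is allowed to override the raw numbers.</p>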
<p>I use the USNEWS rankings as a very rough guide. The sortable fields, however, are VERY useful. I figure it's a case of caveat emptor. If the consumer wants to use the rankings in a simplistic, mindless way....so what?</p>
<p>My beef is the misleading nature of some of the information published due to flawed methods of collecting things like faculty-student ratios, class sizes, etc. Some of these items on the Common Data Set are clearly designed to obfuscate the differences between schools, as is not having a measure of TA teaching, not having a reliable consistent measure of financial resources, etc.</p>
<p>
[quote]
And it WAS intended as a college ranking system, as each of the schools does it to see how they are doing relative to their peers.
[/quote]
</p>
<p>To a group of peers. I don't think anyone in the COFHE group looks at the data as a precise ranking in the sense of ordering schools from 1 to 31. As you point out, some schools measure up well in one area, weaker in another. The evaluation of this data is far more complicated than a simple ranking....as is choosing colleges, I might add!</p>
<p>I have no problem with the way the COFHE data is collected. The sample sizes are huge, especially for things like the annual Senior Survey.</p>
<p>
[quote]
There are sub-purposes of course that each school makes use of (such as student self-assessment - I have heard, but cannot confirm, for example, that my d's school scored highest of all 31 on self-assessment of writing skill, and lowest, or next to lowest, on mathematical self-assessment...
[/quote]
</p>
<p>Neither of which tells you anything useful about how the school really stacks up against the COFHE universe in either category.</p>
<p>For comparing major research universities they shed as much light as anything else out there. But see, we don't like these facts, so out they go. I would say that having faculty doing significant research certainly benefits undergrads, as it correlates with having the top people in the field. In my experience they are as likely to be good teachers as any LAC prof. In fact, a good number started their careers at an LAC. Being a member of the NAS or winning major awards such as Guggenheims, which allow for an extended period of focused study, is not bad for undergrads either. Again, it means your peers see you as among the very best, as only a relative handful of these go to any school in a year. The idea that a top researcher is a poor teacher is just LAC-biased BS. Some are better than others, but many more were excellent than bad in my experience.<br>
Research dollars are a fine proxy for good faculty doing good work in good facilities. Interested undergrads can often get involved if they show some interest. Also, about 35% of the money goes to the university for overhead and funds many of the costs of running the libraries, computer networks, and lab equipment and paying the administrative staff. That frees up money to be spent on students directly. A couple of the others, such as postdoctoral appointees and doctorates granted, are marginally useful, but a steady flow of grad students and postdocs keeps the intellectual activity humming around campus. Any undergrad can sit and have a chat with a grad student or even a postdoc. That can be more stimulating than talking to some other 19-year-old. </p>
<p>Here's a link to this year's Teacher of the Year award winners at a big state research U. Some amazing people. </p>
<p><a href="http://www.news.wisc.edu/13685.html">http://www.news.wisc.edu/13685.html</a></p>
<p>As to the small schools without much research and no grad students--you feel free to figure out your own system.</p>
<p>Are all of the boycotting colleges (and just how many signers have there been, so far?) going to mutually agree that they are equally good (and better than nonboycotting colleges), or are they going to come up with some more suitable rating scheme?</p>
<p>"Neither of which tells you anything useful about how the school really stacks up against the COFHE universe in either category."</p>
<p>No, what it does is provide student assessments as to how they stack up. The students use an arbitrary rating scale -- 1-5 -- to provide a rating of, for example, experienced academic quality and quality of campus living. The stats by themselves (like the USNWR peer assessments) mean nothing. They only have meaning in the same way the USNWR peer assessments do: by comparison with others doing a similar assessment, and through year-over-year changes (or, in the case of COFHE, 5-year changes). Now, of course, the 5-year comparisons could be "equally meaningless" -- as it is a different group of students doing the assessment each time.</p>
<p>But it is true -- as you have well recognized for Swarthmore, as the cited article does for Harvard, and as I can for my d's school -- that the schools themselves take these comparative ratings extremely seriously, and have actually instituted changes (sometimes expensive ones) based on them. So I can't see why we would choose to diss a comparative ratings system that we know with certainty 1) the schools choose, repeatedly, to pay for, and 2) at least some of the schools pay very close attention to.</p>
<p>But you are correct when you note that the precise ordinal ranking is not very meaningful to the schools, even though the relative ranking across the 31 seems to be taken very seriously.</p>
<p>Mini, have you compared the results of Smith (only accessible to Smith's insiders) with other COFHE surveys that have been made public?</p>
<p>Xiggi:</p>
<p>I've never seen enough COFHE survey data to make even the most cursory comparison. What I have seen in reports shows the difficulty in using this data in any precise sense. The following paragraph from Swarthmore's 1999 Re-accreditation self study highlights the challenge:</p>
<p>
[quote]
The regularly-collected COFHE data have been helpful in confirming our understanding of how successfully the advising process operates at Swarthmore. On the positive side, Swarthmore's students are more satisfied with the advising system here than students at any of the COFHE schools; comparatively, Swarthmore is doing well. On the negative side, the same COFHE data show that Swarthmore students report feeling more dissatisfied with the advising situation than they are with almost anything else at the College. Although 66% of Swarthmore seniors indicated they were "satisfied" or "very satisfied," this was nineteenth out of twenty-six categories.
[/quote]
</p>
<p>Swarthmore interpreted the data as motivation for rather sweeping changes to improve first-year advising: implementing first-year seminars capped at 12 students (to get freshmen connected with faculty right out of the gate), assigning initial faculty advisors based on potential majors listed on applications, and creating a system of Student Academic Advisors assigned to each incoming freshman. But, as they point out, it's rather inconclusive when your students say that advising is one of your weaknesses while at the same time scoring advising as the best of the 31 COFHE schools!</p>
<p>This is exactly why I have my doubts that any of these student surveys would provide a meaningful improvement over USNEWS for the type of casual consumer that is most likely to misuse USNEWS by failing to drill deep enough into the underlying data. If consumers don't even drill down to sort data by USNEWS' crude categories, how are they going to glean anything useful from 26 discrete areas of student satisfaction? Especially when the results vary by tenths of a point from school to school. You are right back to the problem of communicating more precision than the data warrants (i.e. the difference between 12th and 13th place). I could already point to a dozen readily available statistical measures that, taken as a whole, provide a very good snapshot of the differences between seemingly similar schools. But consumers who rely on USNEWS for the sole purpose of seeing whether Claremont McKenna or Harvey Mudd is "ranked higher" (an utterly meaningless exercise!) aren't going to bother aggregating and weighting different datasets.</p>
<p>These surveys work for internal college purposes fundamentally because colleges are always looking to identify something to improve. It's kind of the nature of the academic beast to have perpetual study groups making recommendations. The surveys at least help ensure that these perpetual study groups waste their time on something useful!</p>
<p>
[quote]
These surveys work for internal college purposes fundamentally because colleges are always looking to identify something to improve.
[/quote]
</p>
<p>That's very much what I was told, as well.</p>
<p>barrons,
My argument against the surveys that you present is that they are almost entirely composed of data that is irrelevant to large numbers of undergraduate students. It is not that the data are not useful as a measurement of the research acumen of a particular institution. If folks in this area of a college and their students agree that this information is valid and relevant, then go for it. But I strongly reject that the research reputation is what should drive the entire reputation of a school. Most colleges in American DON'T have this as their mission. Even so, I believe strongly that there should be much more to faculty evaluation than how many articles are published, how many awards are won, etc. </p>
<p>I have raised several times the idea that other stakeholders should have a role in judging the academic strength of a college. Many would say that there is value in measuring the outputs of a college. Who better to judge this than the students themselves (how well did the college prepare them for the "real world?") as well as alumni (how well did the school prepare them for their careers and what is the value of the degree in the "real world?") and recruiters (how prepared are these students from X school?). If a professor knew that he/she would be judged on how his/her students view the class or how the students do after college, I suspect that he/she would significantly change his/her approach and be a lot more responsive to the needs of the stakeholders and not just think about the next research project or the next paper for Nature magazine that they are going to write.</p>
<p>What is not relevant about having someone who is good enough to be contributing new knowledge to a field rather than reciting somebody else's work? At Wisconsin you could have had Harry Harlow himself teaching you intro to psychology and exp psych rather than read about his work in a book. You could have had Crow teach you intro to Genetics from the book notes he was working on that became the standard text. UW publishes all their NSSE data and student job placement -- unlike most private schools.</p>
<p>All that process-speak about stakeholders etc. makes me ill. You will find out that 80-90% of the kids like their school and would go again. Big whoop. People who have only been to McD's think the food is good too. Most community colleges will even come out with good reviews.</p>