Princeton Review 'Survey Says...' Accuracy

<p>"So sad to hear smoking still prevalent for many youngsters." Not here in Calif! We have the lowest smoking rate per capita in the US, so if being around smokers is a no-no, then check out Calif Schools. But it would help us if you give us more details regarding the kind of "atmosphere" or educational environment your S is/isn't interested in.</p>

<p>"voluntary response data are worthless"</p>

<p>I don't believe that's true where, as here, a ranking in the top 20 "lots of hard liquor" schools or whatever reflects the near-unanimous opinion of the student body. It really doesn't matter what sample you take of the students at Oberlin; they're all going to agree that it's a liberal, tree-hugger school, just as everyone at Alabama agrees that the Greek system is prominent on campus and everyone at W & L agrees that a lot of drinking goes on. These cultural patterns are no secret, even among students who are themselves exceptions to the rule.</p>
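<p>A quick way to see the point: when the underlying opinion is nearly unanimous, even a heavily self-selected sample almost always returns the same verdict. Here is a minimal simulation sketch (the 95% agreement figure and the response propensities are made-up illustrations, not data from any actual survey):</p>

```python
import random

random.seed(0)

def self_selected_poll(pop_agree=0.95, respond_if_agree=0.6,
                       respond_if_disagree=0.2, population=2000):
    """One self-selected poll of a near-unanimous student body.

    Response propensities differ 3:1, i.e. a strongly biased,
    voluntary-response sample. Returns True if the poll still finds
    a majority agreeing with the prevailing view.
    """
    agree = disagree = 0
    for _ in range(population):
        holds_view = random.random() < pop_agree
        responds = random.random() < (respond_if_agree if holds_view
                                      else respond_if_disagree)
        if responds:
            if holds_view:
                agree += 1
            else:
                disagree += 1
    return agree > disagree

# Across many repetitions, essentially every poll reaches the same verdict:
# near-unanimity swamps even a badly skewed response pattern.
trials = 500
print(sum(self_selected_poll() for _ in range(trials)) / trials)
```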

<p>Not true. The US News Peer Assessment is based on a survey of (currently) 4,272 presidents, provosts, and deans of admissions of U.S. colleges and universities, selected by a professional survey research firm, who are mailed identical questionnaires. The response rate of 46% may seem low, but is actually quite high for a survey conducted by mail. (Standard public opinion surveys conducted by telephone typically get only about a 25% to 35% response rate due to inability to make contact with all households in the sample despite repeated efforts, coupled with refusal to participate by a significant fraction of those who are contacted; for mail surveys, response rates of 10% to 30% are not uncommon). </p>
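<p>For a rough sense of what that return implies, here's a back-of-the-envelope sketch (assuming, importantly, that the non-respondents don't differ systematically from the respondents; the 4,272 and 46% figures are simply the ones cited above):</p>

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion estimated from a simple random
    sample of n responses. Ignores finite-population correction and, more
    importantly, any nonresponse bias."""
    return z * math.sqrt(p * (1 - p) / n)

mailed = 4272
response_rate = 0.46
completed = round(mailed * response_rate)   # roughly 1,965 returned surveys

print(completed, f"+/- {100 * margin_of_error(completed):.1f} pct points")
# A ~2.2-point margin of error -- the low response rate hurts only to the
# extent that the non-respondents differ from those who replied.
```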

<p>This is not the same as "voluntary response data" which by the most common definition is data collected from self-selected respondents to a general appeal. An example of the latter is the solicitations some of the TV news networks make following presidential debates, inviting viewers to phone in or e-mail their choice as to who "won" the debate. Those sorts of open-ended appeals typically draw more responses from groups with the most intense feelings and are widely viewed as unreliable. But that's a very different thing than the kind of targeted and controlled survey US News does in its PA rating.</p>
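<p>The distinction is easy to demonstrate with a toy simulation of the phone-in scenario (all numbers below are hypothetical): when intensity of feeling drives who responds, the tally can flip even though the quieter side is the actual majority.</p>

```python
import random

random.seed(1)

# Hypothetical audience: 55% thought candidate A won the debate, but the
# 45% who favored B feel much more strongly and phone in far more often.
N = 100_000
p_thinks_a_won = 0.55
respond_rate = {"A": 0.02, "B": 0.10}   # intensity-driven self-selection

calls = {"A": 0, "B": 0}
for _ in range(N):
    choice = "A" if random.random() < p_thinks_a_won else "B"
    if random.random() < respond_rate[choice]:
        calls[choice] += 1

print(calls)   # B "wins" the phone-in poll despite A's actual majority
```

<p>A questionnaire mailed to a defined list of named respondents doesn't escape nonresponse bias entirely, but at least the population being asked is fixed in advance, which is the distinction the previous paragraph is drawing.</p>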

<p>As for a self-interest bias on the part of PA survey respondents, no doubt it's there, but in principle the self-interest biases of all respondents should roughly cancel each other out. The reason they don't is that most college and university presidents, provosts, and deans of admissions really DO think HYPSM+ are the nation's premier educational institutions; and there is widespread though not universal agreement on a group of schools right behind them, and so on. Undoubtedly the further down the pecking order you go, the less reliable the information becomes as college and university administrators can't keep fully informed on what's going on at all of the nation's 4,000 or so colleges and universities. But after all, the PA purports to be nothing more than what it says it is---a survey of how various colleges and universities are REGARDED by their peers. Those that are not on the radar screen are . . . well, not on the radar screen, and that's more or less accurately reflected in the survey. Everyone keeps an eagle eye on those they think are on top, and that's more or less accurately reflected in the survey, too.</p>

<p>I'm not saying the US News PA score is a perfect measure of how colleges and universities are perceived by their peers. College presidents and provosts are supposed to keep an eye on their peers, because if they don't, they can't really know how their own school is doing and therefore can't do their jobs; but why deans of admissions are included, I'll never know. Personally I put much more stock in the more fine-grained department-by-department rankings in the NRC surveys of academics in the same field. But these have serious limitations, too, not least that the most recent one currently available was issued in 1995, and it includes only schools with graduate programs. But notice it's exactly the same kind of survey, and like any survey, undoubtedly has substantially less than a 100% response rate.</p>

<p>"Not true. The US News Peer Assessment is based on a survey of (currently) 4,272 presidents, provosts, and deans of admissions of U.S. colleges and universities, selected by a professional survey research firm, who are mailed identical questionnaires. The response rate of 46% may seem low, but is actually quite high for a survey conducted by mail. (Standard public opinion surveys conducted by telephone typically get only about a 25% to 35% response rate due to inability to make contact with all households in the sample despite repeated efforts, coupled with refusal to participate by a significant fraction of those who are contacted; for mail surveys, response rates of 10% to 30% are not uncommon). </p>

<p>This is not the same as "voluntary response data" which by the most common definition is data collected from self-selected respondents to a general appeal."</p>

<p>Actually it is EXACTLY the same. When Wechsler and others do campus alcohol and drug surveys (actually, the colleges themselves do them), the responses are all voluntary, but the response rates have to be above 55%. All of these students are answering questions about their own campus and their own behavior. </p>

<p>In the case of the USNWR survey, the data response is entirely voluntary, and based on the "impressions" of college presidents, deans, and provosts about colleges they may have never seen - in fact, they may never even have been in the state, or even the part of the country where the college is located. Their sole contact with the college may have been a colleague they met 30 years ago. They may know absolutely nothing about how much or how little a college has changed in that time period. Then USNWR goes and reifies this all into numbers, and into a ranking system of prejudices that may or may not have any real basis in reality. We know, for example, that in its own internal survey (the COFHE survey), Harvard ranks no better than 26 among 31 private prestige colleges in its students' own assessments of academic quality and quality of campus life. So should H be ranked 1 or 26? (or lower, since no public universities are included in the survey, and 26th is, after all, pretty darn good). There doesn't seem to be any conversation about same, and it is because of the reification of prejudice that USNWR's peer assessment mechanism represents.</p>

<p>Thanks again all for the ongoing valuable feedback. According to the numbers, I would have some reasonable confidence in the data, given the relatively high response rate as noted above.</p>

<p>Menloparkmum: California is an option I am trying to sell to my boy. He is now in a boarding school in MA that was shut down early by an ice storm before Christmas! If it were me, I would head for warmer areas. He is definitely interested in a liberal arts type environment, not a strict conservatory (see below), but does not want to be dodging smoke and drinking binges. I have even given thought to Christian-run schools, although the boy, like me, has no affiliation or desire to change. So anywhere with strong, pushy religious requirements is out too. :)</p>

<p>I have posted some queries in the music-major forum as my boy is looking for a BM in Composition to go with his UK-based ABRSM diploma in piano performance. Sadly, awareness in the US of UK-based ABRSM accomplishments is low, despite the degree-equivalent LRSM performance exam he is taking this year. :( The US doesn't seem to have anything similar for comparison. I will post another query on this topic in the music forum. But if anyone here knows a 'cleanish' college with strong music composition (Oberlin, for example?) please let me know.</p>

<p>OK, so is he ONLY interested in getting a music degree/BM in composition? Or is he interested in other areas? You may want to have him check out USC, University of Southern Calif, which has a fine music program [the Thornton School of Music].
USC has recently been attracting a lot of top-quality students [partly because of their merit scholarships], but one of the reasons it was on my son's list, and is the college he ended up deciding to go to, was their Renaissance Scholars program, which encourages students to major in 2 diverse areas [music and geology, for him]. The school is big enough that students can find lots of others who share their interests [like not getting drunk on weekends], though there are those that like to party hard at USC. My son and many of his friends from the Music and Engineering schools like to do other things than drink with their spare time. And today it was 80 degrees in LA! Flip flop and shorts weather already!</p>

<p>Shepherd School of Music at Rice?</p>

<p>^^ another wonderful music program at a great smaller U!</p>

<p>The historical example of the Literary Digest poll </p>

<p>Landon in a Landslide: The Poll That Changed Polling</p>

<p>Literary Digest poll</p>

<p>reminds us that response rate and sample size are NOT enough to consider a survey result reliable. That poll had a famously incorrect result, even though the sample size was huge and the response rate higher than any ever seen for a college guidebook survey.</p>

<p>Well, all I can say about the poll results on Princeton Review is that they can be a good warning. My school ranks on the Reefer Madness list, and I figured it really wouldn't be a big deal and I wouldn't really notice it just like I didn't in my high school. I didn't account for the fact that I'd be living with these people, and aside from that, the pot use is pretty blatant otherwise. How distracting it is to sit in sociology when the wake n' baker from your TA section is sitting in front of you with a joint behind his ear...</p>

<p>Menloparkmum and anxiousmom: Yes, both USC Thornton and Shepherd at Rice are on MY list for him to study at! Forgot to say he wants to be a film composer, which also suggests outliers like Uni of Miami and Florida State Uni. Gonna be difficult to visit them all easily, as they're too far apart to drive!</p>

<p>On the double major, I assume your son is handling the workload OK. Some folks tell us it is a bad idea due to conflicting needs (afternoon music workshops at the same time as science labs, for example). I asked my boy for his second-choice subject, which is physics.</p>

<p>hyperJulie: Mmm... good feedback! Certainly distracting with silly behaviour like that. A college with strong leadership and a few simple rules to live by is needed.</p>

<p>" Some folks tell us it is a bad idea due to conflicting needs (afternoon music workshops at same time as science labs, for example)."
Well, to be honest, he has not been able to do a music minor at USC because of that very issue. His major is geophysics [which means he has had 3 science classes, all with labs, each semester]. Classical music is something that helped get him into the colleges he was accepted at [he is an accomplished pianist], and he had hoped to be able to do a music minor, but the class schedule conflicts have precluded taking the composition classes he wanted to take.
Now if your son is a music composition major/ film studies minor, I can't think of a better choice than USC. I think that anywhere he goes a Music major/ science minor would be difficult to do in 4 years. It's more feasible in 5 yrs.</p>

<p>You may want to check into Chapman in Los Angeles, as they have a film studies program. Don't know anything about music there. Is NYU on the list too?</p>

<p>True in general. But I'd say a 45% response rate on a survey of the opinions of the top administrators of roughly 500 colleges and universities as to the "academic excellence" of their peers should count as a pretty reliable measure of the opinions of the top administrators of that universe of colleges on that question---which is all the PA survey purports to measure. They know what their opinions are. That's what PA measures. It's nothing objectively provable or disprovable. If they deem Podunk State the top university in the country, that would be their opinion, and the fact that they held that opinion would make it true. It's not like predicting an election outcome. </p>

<p>Now you may or may not think this is information worth knowing. Some, like hawkette and mini, insist it's worthless, or worse. I think it's valuable, because in academia reputation among one's peers is very much the currency of the realm, and the US News PA score, imperfect though it may be, is IMO an acceptable rough proxy for it. It doesn't mean schools with a high PA rating are objectively "better"; it only means that among a certain slice of the top levels of academia they're PERCEIVED as being better, and that will influence things like a graduate's ability to get into highly competitive graduate programs, or the school's ability to hire and retain the most highly sought-after faculty. </p>

<p>Nor is it like predicting, based on this small sample, how the world beyond academia views these schools. Perhaps it would be useful to US News to do that kind of survey, too. Note that they do something like this in their law school rankings: in addition to a PA rating, they also do an assessment based on a survey of lawyers and judges---inter alia, the people who make the hiring decisions. Sometimes the opinions of academics diverge from the opinions of lawyers and judges, but neither is more objective, better informed, or in any sense "truer"---though it's almost certainly easier to construct a representative sample of academics than it is to construct a representative sample of the much larger universe of lawyers and judges.</p>

<p>They could do a survey of how the consumers rate the academic quality and the quality of campus life of the education they are receiving. H. did. (26th isn't bad.)</p>

<p>^ I'd be all for that kind of consumer satisfaction survey as another useful data point, though I suspect it would be somewhat unreliable as a means of making inter-college comparisons, with students at some schools organizing to deliberately inflate their own school's rating (much the way you see a lot of hyperbolic blather in Princeton Review descriptions of some schools). It would probably be more useful on the negative side: schools with serious problems would hear an earful from their own students, and those problems would be exposed to the whole world. US News doesn't want to bear the cost of doing these kinds of broad surveys, however, and you'd never get the schools themselves to pick up the cost or adopt a common methodology.</p>

<p>By the way, if Harvard students rate Harvard #26 it doesn't mean Harvard IS #26; it only means Harvard students think so. Their information about what goes on at other schools is no more reliable, and probably less so, than the information held by college and university administrators. But it is some kind of indication of what Harvard students think about Harvard (or did in the year the survey was taken).</p>

<p>Actually, the colleges and universities themselves paid for the survey, and they actually USE the comparative information and rankings in making decisions. (It's one of the reasons why they try to keep it confidential, though H. had theirs linked.)</p>

<p>It means what it says - H. students ranked their academic quality and quality of campus life 26th (out of 31 institutions). Unlike Goucher administrators, who probably couldn't find Albertson College on the map (and wouldn't know that it no longer exists), H. students have direct, firsthand, current experience. H. administrators know that, which is why they both pay for and use the survey information.</p>

<p>Yes, but most Harvard students have no real basis to compare H with any other school. They are just projecting what they think might be a greener field somewhere else. Maybe H students' expectations are just not realistic. Or maybe it sucks.</p>

<p>I agree with Barrons (can't believe it!). I could not possibly publish a peer reviewed article in my field based on this kind of methodology. </p>

<p>These surveys are full of doo-doo: all kinds of biases, expectations and perceptions that can't be rooted in reality (given that respondents have little direct experience with comparative environments and are not using comparable reference points). If anything these surveys are self-fulfilling: aside from the direct reality of one's own school, the next most influential information one has to rate schools is the ratings/reputations/rankings of the schools! </p>

<p>A little case in point. Our students complained about our class sizes in a national survey of students at different colleges. Our students were very dissatisfied with the class size, giving our school a lowly C-. Other schools, especially those that are 'famous' for small classes, were given As and A+s by their students for satisfaction with class size. Looking at the ACTUAL CLASS SIZE DATA across the schools, there was no difference! If anything, two of the 'small class' schools that were rated so high actually had larger mean class sizes than the more lowly rated schools!</p>

<p>Menloparkmom: Thanks again for your useful feedback. Composition/film studies would be a dream option for him I am sure! And yes, we do have Chapman on our list, hence the reference above to affiliated colleges. And NYU, along with Oberlin, Mannes, Peabody, and Juilliard. Although my lad has high grades already for piano performance, he is not so interested in a conservatory, but we will visit some of the more progressive ones and see how they work. He does need orchestral work.</p>

<p>Good discussion above on survey methodologies. Have seen a number of TV reports on my visits to the US talking about college binge-drinking and similar, so I know there is a serious problem on some campuses.</p>