US News College Rankings 2012


<p>Interesting hint, BB! If the decile discussed above is the top 10 percent yardstick used by USN in its selectivity index, the best hint is that such a figure is nothing other than what the school WANTS it to be, built on plenty of manipulation and exclusions. </p>

<p>The figure is a best guesstimate and, with the knowledge that organizations such as Morse’s could NOT care less about the accuracy of the reporting, it is a magnet for ranking-boosting plays. Once schools discover that the methodology can be exploited with total impunity, why would they not push the envelope to the limit? </p>

<p>The only question for the UCs or Columbias of this world is why NOT report 100 percent? It’s not like Morse will call them up!</p>

<p>A few people have posted about the subjective survey of university administrators, with a reference to it as being the “collective opinion of ~ 2,000 academics”.</p>

<p>The index is not the collective opinion of academics. It is the opinion of university presidents (who often are not academics), frequently of their assistants who are tasked with filling out the survey (mostly not academics), or of admissions heads (almost never academics). In summary, it is a reputational ranking that, by its nature, indirectly gives extra credit to schools that are a bit more research oriented, hence the higher ranking of UC-Berkeley and Michigan versus, say, Dartmouth. </p>

<p>This is not to say that one is better than the other or vice versa, but no one should think that the so-called academic index is the opinion of a set of academically oriented persons.</p>


<p>That is correct, xiggi. It HAS to be a guesstimate since UC does not – nor can it – collect top-decile information. (Most public high schools, from which UC obtains the bulk of its students, do not rank, according to an article in the LA Times.) The only info UC has is a ‘ranking’ of the students who apply to UC (bcos it has their apps).</p>


<p>Recruited athletes (in both cases)? :D</p>


<p>Why don’t you stop trying to wiggle out of the C10 v C11 reference? USN reports using the C10 only. There’s no mention of gpa, w or uw, anywhere in USN’s ranking. The seemingly top gpa’s at various API-rank high schools can place students all over the map wrt the top decile, t-10%. For example, at Palo Alto and Gunn High Schools a 3.93 uwgpa would be < t-10% … even with much uw grade inflation. </p>

<p>When I said UC websites, I meant the UC system’s websites, ucop.edu?, and statfinder. I thought I clearly delineated this when I said something to the effect of ‘UC sites reporting UC gpa [capped wgpa, 10-11, a-g] and individual campuses reporting uncapped UC gpa.’ By individual campuses within UC, I meant each campus’s admissions site, not the CDS. Well, maybe only UCLA and Cal really show fully weighted, uncapped gpa, and only UCLA shows uw and w gpa of the admit and matriculant classes at their various levels.</p>

<p>If I wanted to reference CDS in any of the above, I would have clearly stated UCLA or Cal’s CDS, not some general ‘u site.’</p>


<p>Let me rephrase this: </p>


<p>And perhaps these kids who attend underfunded high schools deserve this boost to their motivation to apply with lesser stats, because the difference between the haves and the have-not high schools grows greater every year, and the stat differential should grow along with it. </p>


<p>Absolutely… </p>

<p>But lest you pick on UC and Columbia, there are far more egregious manipulations of the t-10% by many other schools (well, I can’t speak for Columbia), with far greater differentials between reality and what is reported … legacies, special admits, special talents, etc. I’d guess UCLA’s differential to be ~17%. I’m sure there are some u’s with 30% or greater.</p>


<p>The UCs’ reporting 95% or greater is still ridiculous.</p>


<p>Given your response to the question (or lack thereof), I’m guessing that the answer is ‘no, no source’?</p>

<p>Absolutely … no source. Nice selective quoting again btw.</p>

<p>Let’s get UC to put forth this statement:</p>

<p>The University of California and its individual campuses will continue to coddle poor kids’ psyches to encourage these students to apply.</p>

<p>Baywatch, you either know it or you don’t. You seem to lack an intuitive feel on policies the UC invokes to try to encourage and subsequently enroll more economically disadvantaged students. UC is all about diversity, first and foremost, more than trying to improve its USN lot in life.</p>

<p>Is this all you got?</p>

<p>*few subjectives into the mix…and even by doing those admissions gymnastics they still can’t come up with student bodies that reflect Calif make-up.</p>

<p>============================</p>

<p>I don’t like the formulaic approach either. It produces robot kids, etc. This is why I like how UCLA integrates the performing arts and film majors on campus. *</p>

<p>Oh yes…those in performing arts/etc majors shouldn’t be judged with stats since such right brain people may not test well or have great GPAs.</p>


<p>You fatally undercut your credibility on this by omitting—whether intentionally or unintentionally—the third category of persons who are asked to fill out the Peer Assessment survey, and that’s college and university provosts. Provosts are almost invariably academics (so are an extremely high percentage of college and university presidents, though you’re right in saying that some aren’t). The provost is the institution’s “chief academic officer,” the “dean of academic deans” responsible for maintaining academic quality, with oversight responsibilities for all curricular, instructional, research, and academic personnel matters. This is entirely an “academically-oriented” position; the provost is in charge, inter alia, of keeping tabs on how the institution’s various academic units (schools, departments) are doing, which is inevitably a comparative exercise. Every university I’ve ever been associated with is constantly benchmarking itself against a set of peer institutions—some a little above it in the pecking order, some at the same level, some just a shade below—to make sure it’s not losing ground or falling behind, to keep tabs on who’s gaining on it (and in which fields), who’s pulling away (and how they’re doing it), who’s made the best hires or lost top people, and so on. Provosts have more information about their own school and its peers than anyone in America; and since the provost reports to the president, any president worth her salt will have that information, too. I’ve never understood why US News asks admissions directors to fill out the survey. At most the admissions director would know who the school is losing cross-admits to, and who it’s trouncing, but other kinds of admissions data are already well-represented (probably overrepresented) in other parts of the US News ranking methodology.</p>

<p>So the US News PA survey is partly a survey of academics (provosts certainly, presidents usually) and partly not (admissions directors). The academics, especially the provosts, are in unusually information-rich environments. On the other hand, they’re operating at a high level of generality, so you don’t get the kind of fine-grained information you get from, say, NRC reputational surveys of faculty in a particular field.</p>

<p>I’ve always maintained that what provosts and presidents are going to know the most about is which schools have faculty that they envy, and which don’t. To the extent that information is fairly reflected in the PA survey (i.e., it’s not filled out by some less-well-informed assistant, and given the caveat that approximately one-third of the responses are from non-academics), IMO that’s a useful addition to the US News ranking, because it’s the only way faculty quality is even indirectly reflected in the US News rankings. Perhaps it’s because I’m an academic myself, and perhaps this is a minority opinion on CC, but I’ve always been of the belief that faculty quality is a big part of what makes a great college or university great. PA is a very crude proxy for it, but if you took away PA without replacing it with a better measure, the US News ranking would be completely blind to the very real differences in faculty quality from one institution to another.</p>


<p>Although you do not “fatally undercut your credibility,” you are showing your bias by clinging to a myth that has been debunked many times. Unless you believe that the president of Reed and a group of college presidents simply like to admit their limitations in this regard for the fun of it. </p>

<p>Your point that provosts know more about their school than anyone else in America is acceptable. On the other hand, whether they “know” much about their peers depends entirely on one’s definition of peers. In the world of USNews, the proposal that the group of peers comprises over 200 schools is entirely, completely asinine.</p>

<p>Even granting that the people who place their name on the survey actually read and complete it (a tall order if there ever was one), advancing that this group of people knows more than a HANDFUL of schools beyond the current level of vague reputation is utterly preposterous. </p>

<p>And that does not even touch on the fact that the instructions given to the respondents are so simplistic that they allow anyone to reply on a free-for-all basis.</p>

<p>Define the term “academic” as broadly as you wish; nothing will make this exercise anything more than an unscientific and pretentious display of gamesmanship and cronyism. </p>

<p>Every “survey” proposed by USNews suffers from the same defects.</p>

<p>“Every “survey” proposed by USNews suffers from the same defects.”</p>

<p>…and many of the so-called objective numbers that USNWR uses as “fact” have also been shown to be easily manipulated. They should be considered worthless to you as well.</p>

<p>^^</p>

<p>Told you, rjk, that I started to understand your point about the above. /insert smile. You asked what I found so misleading about Morse’s recent article. Well, it has LOTS to do with the issue of allowing a number of schools to “manipulate” the “objective data.”</p>

<p>Here’s the article: </p>

<p><a href="http://www.usnews.com/education/blogs/college-rankings-blog/2011/09/08/ensuring-the-accuracy-of-the-best-colleges-rankings-and-data">Ensuring the Accuracy of the Best Colleges Rankings and Data - Morse Code: Inside the College Rankings (usnews.com)</a></p>

<p>Year after year, the same manipulation takes place from the shores of the Bay Area all the way to the wilderness in Vermont. Acceptance rates? Oh yes, at the UC and Midd, we have those, but we do not count the Spring admits! Percentage of top ten percent in the enrolled class? Oh, here on Morningside Heights, we can’t possibly know such exact numbers, but let us count the ones we’d like to add up! And it goes on and on. </p>

<p>So, now in addition to having a totally worthless “peer assessment” to please the “academic” pirates who make a mockery of the exercise at Miami, Clemson, or the University of Wisconsin, USNews not only looks the other way but wants us to believe it has an ounce of integrity? Pass the pink bottle! </p>

<p>By clinging to the peer surveys and allowing cheats to manipulate the data, the USNews shows its true colors. And it ain’t too pretty anymore!</p>

<p>^^^^Thank you for your response xiggi. I see your point.</p>

<p>Cool. So a new year of rankings came out. Who cares? There are some real rankings-obsessed people, but I’m not sure which schools should replace the top 30.</p>


<p>LOL, the president of Reed College was a law professor and law school dean for most of his career before becoming president at Reed. It’s not surprising, then, that in 2005, three years after taking over as Reed’s president, he would profess a lack of knowledge concerning Reed’s peer institutions. He probably was, at that point, one of the least informed LAC presidents in the country about the overall institutional landscape among LACs. But that’s OK. He had only one vote, and anything he did with the PA survey would be diluted by other, and in the main better informed, voices.</p>

<p>I also suspect that some LACs may not have access to the kinds of comparative data that administrators at top universities rely on. </p>


<p>I happen to know my institution’s provost quite well. He’s a very knowledgeable and well-informed person, with access to reams of comparative data on other institutions, data that are actually used to help inform critical resource allocation decisions and are therefore carefully assembled and interpreted. He has the most data on schools most similar to ours, which for some purposes is a group of the 10 “most similar” public research institutions, and for other purposes is a group of the 20 “best” public R1 research universities. But he also knows which schools didn’t make that cut because the data showed that, for resource reasons, they really aren’t our peers and competitors; and which are in a smaller group of schools that are so richly resourced that they aren’t our peers and competitors either. </p>

<p>Then, of course, there are a number of additional research universities where he has done comprehensive site evaluations as chair or member of an accreditation team; 3 other research universities where he has been on the faculty at various points in his career, including one as dean and another as associate dean; 3 additional research universities where he has held visiting appointments; and his own undergraduate and graduate alma mater institutions with which he maintains close ties, as well as the dozens of institutions he has visited for academic conferences, symposia, and other events, and the dozens or hundreds of faculty member he knows at schools across the country who are ever-ready to spill their guts about their school’s latest coup or pratfall. </p>

<p>He has access to useful information on all the Big Ten schools, plus the University of Chicago, plus for some purposes the University of Illinois at Chicago and the University of Wisconsin-Milwaukee, through the Committee on Institutional Cooperation, the Big Ten’s academic cooperation arm. (For example, a quick glance at a report on the CIC’s Traveling Scholars Program, which allows graduate students at any CIC school to take advantage of unique courses, specialized library collections, or unusual laboratories at any other CIC school without change in registration or additional tuition, reveals that students from other CIC schools flock to Michigan and the University of Chicago in large numbers, while all other CIC schools are either net exporters or neutral, which says something interesting about the strength and breadth of Michigan’s and Chicago’s academic resources.) He keeps close tabs on other universities, public and private, in the state, as well as in neighboring Wisconsin and in other states where we are competing for students. His desk is stacked high with accountability reports and budget requests from his deans and department heads, much of it based on competitive concerns: this school is gaining on us in this area, this other school has stolen a march on us and we need to catch up, this one is having our lunch on entry-level and/or lateral faculty appointments, this one is falling apart and creating strategic hiring opportunities if we get our hooks into their top faculty before so-and-so does. This isn’t “vague reputational” stuff; these are the day-to-day competitive pressures faced by a mid-market, multi-billion dollar enterprise in the highly competitive industry known as higher education.</p>

<p>In that competitive business, the provost’s office is the most information-rich environment there is, not only with information about our own institution, but about competitors near and far who will seize any opportunity to catch up to you or get further ahead if you’re asleep at the switch. The provost of a major university would know about as much about his competitors as, say, the managing partner of a major law firm would know about other law firms competing in the same markets. He’ll know some well, some less well, but he’ll know which are his real competitors, which are not really competitive with his firm, and which regularly take the high-end business from him, leaving his firm to scramble for what’s left. And that’s about the level at which the US News PA survey’s 1-5 rating system asks for information: wherever you rate yourself, which schools are your equal or better, and which are not as good.</p>

<p>This really isn’t rocket science, and successful executives in the higher education industry are on the whole not nearly as ignorant as xiggi makes them out to be.</p>

<p>^^^The person who used to have his job is now the president of UVA. :-)</p>

<p>All very good points. But the salient fact is that it’s a marketing tool for USNWR, no matter how they manipulate the numbers (or the schools, for that matter). It’s insidious and odious, and most college presidents and provosts will tell you they abhor these rankings but have to live with them and appear to be happy about the results. </p>

<p>We live in a hyper-competitive world, and that includes college admissions, for the schools and the students alike. Smart people will peruse the rankings, smile, and make a general observation, but they won’t make decisions about attending schools based on these numbers, nor fill their heads with hubris if their school is at the top of the heap. The rankings are not a measure of one’s intelligence, one’s likelihood of success (however that is measured in life), or one’s superiority over any other human being. </p>

<p>If you ask employers who their most valued employees are, they will pick people who are well balanced, well adjusted, and hard working, who get along with their peers and accept challenges with aplomb. It has nothing to do with where they went to school, SAT scores, or USNWR rankings. That goes from Wall Street to Main Street. I’ve heard it over and over for decades. </p>

<p>If your school is at the top of the list or top 25 (or top 50), congratulations. But what impresses people more is character and hard work/results on the job. </p>

<p>I remember when my first child went off to college and the insidious admissions wars going on between their school “friends” and the unbelievable attitudes adopted by kids and their parents. I suppose it brought the truth to the surface about their character. Nothing wrong with being proud or doing your best and reaching for stars (within reason). But the level of competition was very distasteful and bitter. By the time the second child was applying, we had a much more distant and removed (from the elitists and cutthroat types) approach. </p>

<p>The ultimate goal should be your child’s success at whatever school they attend. And their happiness. If you achieve that at any school, from Division I to Division II or Division III, in tier 1, 2, or 3, then congratulations. </p>

<p>Do SOME corporations and executives play games with the “status” of recruits? Of course. But I can also tell you that many kids from lower-ranking schools have done well on Wall Street, at other Fortune 500 companies, and in professional/grad schools. </p>

<p>Obsessing about these rankings is really silly. But it happens every year.</p>


<p>Great, Clinton, your good friend the Provost has reams of reports and statistical analyses of your institution’s peers. How does that compare to the survey he fills out? How many peers are there, compared to the couple of hundred listed in the typical survey? </p>

<p>Fwiw, all the discussions would be rendered moot by having three simple elements:</p>

<ol>
<li>Full disclosure of the identity of whoever completes and signs the survey. Call it the Academic Sarbanes–Oxley element.</li>
<li>Make the ENTIRE survey and ENTIRE CDS public.</li>
<li>Have USNews share which surveys they threw out … as they pretend to do.</li>
</ol>

<p>Why don’t you ask your friend to share the last three surveys he completed and let the world see how many schools got “measured” and how many received an N/A. </p>

<p>I would be more than happy to apologize for my comments. All it would take is, say, 50 percent of the schools on the survey receiving a DO NOT KNOW mark.</p>

<p>In the meantime, would you mind commenting on the grading of your school by your friends in Madison?</p>

<p><a href="http://insidehighered.com/media/news_documents_and_files/2009/08/madisonform">http://insidehighered.com/media/news_documents_and_files/2009/08/madisonform</a></p>

<p>Or, I may save you the time. Except for Wisconsin and a New School in New York, which got the highest possible rating, “distinguished,” every school listed is marked as “adequate.” Those 260 “adequate” institutions included Harvard, Yale and the rest of the Ivy League, the University of California at Berkeley, the Massachusetts Institute of Technology, and Stanford University. Only Arizona State University scored below all the rest, given the lowest rating of “marginal.”</p>

<p>I am sure the person who filled out the survey spent sleepless nights reviewing his reams of data before answering. I wrote “filled out” the survey and NOT “signed” the survey because:</p>


<p>Seriously!</p>

<p>I’m still curious why that Inside Higher Ed survey supposedly from Wisconsin has a dailycal.org link?!</p>

<p>Note: Schools are sent 3 surveys. This is one survey out of ~2000 returned. Any strategic voting is washed out with the other results…besides, they throw out high and low scores.</p>
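<p>For what it’s worth, the “throw out high and low scores” step described above is essentially a trimmed mean. A minimal sketch of what that could look like (the scores here are made up for illustration; USN doesn’t publish its exact procedure):</p>

```python
def trimmed_mean(scores, trim=1):
    """Average survey scores after dropping the `trim` highest and lowest values."""
    if len(scores) <= 2 * trim:
        raise ValueError("not enough scores to trim")
    ordered = sorted(scores)
    kept = ordered[trim:len(ordered) - trim]
    return sum(kept) / len(kept)

# One strategic lowball (the 1.0) among otherwise ~4.0 ratings is discarded
# entirely, so it barely moves the result:
ratings = [4.2, 4.0, 3.9, 4.1, 1.0, 4.0, 4.3]
print(round(trimmed_mean(ratings), 2))  # → 4.04
```

<p>Whether one Madison-style ballot actually gets washed out this way depends, of course, on how aggressively the trimming is done and how many of the ~2,000 responses are strategic to begin with.</p>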


<p>Yeah, right! And how would that work? Why would anyone believe that the entire voting is not strategic? </p>

<p>UCB, when we sit down to smoke our peace pipe, we can also help ourselves to a few bottles of the firewater that Morse has been peddling. We can place his story about throwing out extreme results on the same level as that attempt to “sell” his five-step integrity process. It’s nothing but blah-blah in the face of mounting and untenable criticism. </p>

<p>Why would anyone believe that Morse’s skeleton staff spends any time analyzing the accuracy of the inputs when it is so easy to uncover that misrepresentations go unchallenged? For instance, someone such as you, with his finger on the pulse of everything Cal, should know that the reported acceptance rates of Cal do NOT correspond to the data published by UCOP. That is there in black and white. Of course, I would not expect you to acknowledge that the report of close to 100 percent of enrolled students emerging from the top ten percent belongs in Anaheim’s Fantasyland! </p>

<p>Your friends who love to defend the PA by challenging the accuracy of the objective data were onto something. Lee Stetson used to joke that his office could never overcome an audit of the numbers Penn disclosed to reporting agencies. It is becoming obvious that the race to the USNews top involves playing games and stretching the truth when convenient and feasible. </p>

<p>In plain English, that amounts to … lying through your teeth for as many categories as you can!</p>

<p>Xiggi, I’m just curious to see Wisconsin’s other two survey results for that year… Perhaps the other survey takers filled out the survey in a way that better matches your subjective opinion.</p>

<p>And why should the survey results be vetted with CDS numbers?! USNWR has numerous objective measures that capture the CDS data. Peer assessment is measuring subjective opinion…it doesn’t have to correspond to some objective data.</p>