New Forbes college rankings: some surprises

<p>^^You just said it: college choice is a subjective thing. One school is perfect for person A but an awful place for person B. The earth revolving around the sun, by contrast, is a fact that is true for person A, person B, you, and me. If you saw a college ranking that placed Harvard at #100, Williams at #293, and a Penn State branch campus in the top 10, would you think the metrics used to rank schools were good, or plain awful? If Forbes had used schools’ internal professor-rating systems in addition to Rate My Professors, and career services surveys instead of (or in addition to) PayScale, maybe the rankings would have come closer to a mainstream view on some of those outrageous outliers.</p>

<p>Does anyone have the link to the website that was circulating a while ago that let you build your own rankings (some of the info was out of date, nonexistent, or unavailable) and apply your own weights to the criteria?</p>

<p>Modest:</p>

<p>B = “Your college might not be as good as you think it is”
N = “Your logic isn’t very good”</p>

<p>I’m saying: “If N, then B.”
You are saying that I’m saying: “If B, then N.”</p>
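
<p>(To spell the distinction out, here it is as a standard propositional-logic aside, not something from either post: a conditional and its converse are not equivalent.)</p>

<pre><code>
% A conditional is not equivalent to its converse:
\[ (N \to B) \not\equiv (B \to N) \]
% Counterexample: let N be false and B be true. Then (N -> B) is
% (vacuously) true while (B -> N) is false. Treating "if N, then B"
% as if it meant "if B, then N" is the classic converse error.
</code></pre>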

<p>And regarding your other point, I’m saying that the objective stats that various rankings might use to choose the “best” colleges (like % of alumni who contribute, classes under 20, and average indebtedness after graduation) might be useful to people who place a lot of weight on certain criteria. Other data, which is subjective, like peer assessment and happiness with faculty, might also be useful to people who trust the methodology used to get the info (even if you and I do not). What I’m saying is that I think it’s fine for groups to publish their lists of how schools perform in specific areas, as long as they make it very clear how they gathered the info. In other words, if someone wants to publish a list of which schools did well on Rate My Professor or whatever, that’s fine, as long as they make it abundantly clear just how imperfect the data is. If a person has immigrant parents and a lazy guidance counselor, such info might be better than nothing.</p>

<p>My main gripes are (1) that USNews and Forbes and others gather a bunch of quantitative data (some objective, some subjective) and then package it as a definitive “Best” list; and (2) that people who should know better put so much faith in such lists that they actually think a school ranked #12 is “better” than a school ranked #25.</p>
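
<p>(To make the weighting gripe concrete, here is a minimal sketch, with entirely made-up schools and metric values, nothing from USNews or Forbes, showing that the same data yields different “Best” lists under different, equally defensible weights.)</p>

<pre><code>
# Hypothetical, made-up metric values for three fictional schools;
# each metric is already normalized to a 0-1 scale.
schools = {
    "School A": {"selectivity": 0.95, "class_size": 0.60, "alumni_giving": 0.40},
    "School B": {"selectivity": 0.75, "class_size": 0.90, "alumni_giving": 0.80},
    "School C": {"selectivity": 0.85, "class_size": 0.70, "alumni_giving": 0.70},
}

def rank(weights):
    """Return school names sorted by weighted score, 'best' first."""
    scores = {
        name: sum(weights[m] * value for m, value in metrics.items())
        for name, metrics in schools.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Two defensible weightings, two different "definitive" orderings:
print(rank({"selectivity": 0.7, "class_size": 0.2, "alumni_giving": 0.1}))
# -> ['School A', 'School C', 'School B']
print(rank({"selectivity": 0.1, "class_size": 0.4, "alumni_giving": 0.5}))
# -> ['School B', 'School C', 'School A']
</code></pre>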


<p>It’s pretty funny that you chose WUSTL (12) and UCLA (25) because many people on CC think WUSTL is worse than UCLA. IDK why some people hate WUSTL so much just because it is ranked higher than Cornell, Brown, and Dartmouth on USNWR.</p>

<p>I agree with everything you said, Schmaltz, with one important extension. Flawed as both of these rankings are, and I really do despise the concept of ranking in general, one can still weigh the two lists against each other. It’s actually quite clear that the data collection process USNews goes through has far more integrity and heft, even if I disagree with how they prioritize and interpret that data. Setting aside PA, which has serious collection and integrity issues of its own, USNews is at least using metrics that most schools have and can report in a uniform way, and those metrics are clear measures of something specific. The Forbes list uses data that is inconsistently, I would even say erratically, collected across schools, and that data does not clearly represent what Forbes claims it does, even if it were collected uniformly and accurately.</p>

<p>Therefore, while I don’t think the end result of any ranking system is all that fruitful, at least USNews draws its conclusions from data that is not inherently useless, whereas Forbes fails a step further back in the process.</p>

<p>Hey guys, quit your complaining. Didn’t you read the title? AMERICA’S BEST COLLEGES. There it is … in print. It MUST be true! The only thing that surprises me is that there hasn’t been a tidal wave of transfers from Michigan, Carnegie Mellon and Southern Cal to higher ranked schools like Cedar Crest College, Mississippi State and Drury University. </p>

<p>BTW, Congrats to the latter three on your excellent rankings!</p>

<p>Maybe this has been brought up in the 17 preceding pages, but we have been somewhat amused in our office to find an article in Forbes from 10 years back complaining about what a ridiculous publication “Who’s Who in America” is. Now they publish a college ranking that relies on it. Who on the editorial board missed that one?</p>

<p>For those complaining about the subjectivity of the Forbes ranking: </p>

<p>25% of the almighty USNWR ranking is based on peer assessment, which is totally subjective. No college president can have in-depth knowledge of the strengths and weaknesses of every other college he/she is asked to evaluate. Inevitably, the peer assessments are based on reputation, name recognition, and incomplete information – in other words, subjective and faulty criteria. And yet this is the cornerstone of the ranking that so many take so seriously.</p>

<p>Really, the whole business of ranking is absurd: the notion that an entire institution can be reduced to a single number, the idea that every student is looking for the same thing in a college – ridiculous. You people do realize that you’ve fallen for a marketing technique whose only purpose is to sell magazines, don’t you?</p>


<p>LasMa,
I agree with your broader critique of rankings, but I must say I don’t quite understand the criticism of the US News Peer Assessment rating. In a certain sense, all you say about it is true—but so what? It’s just what it purports to be, a reputational survey among industry insiders. </p>

<p>We do this sort of thing all the time. We rely on the views of restaurant critics, movie critics, theater critics, car critics—all subjective and imperfectly informed. We look to consumer satisfaction surveys to help us buy automobiles—subjective and imperfectly informed. We ask subjective and imperfectly-informed students to rate the effectiveness of their professors’ teaching—and sometimes careers may hang in the balance. In making difficult national security decisions, the government relies on National Intelligence Estimates, a kind of summary of the information, insights, and best guesses of a range of experts, all subjective, all with incomplete information. Businesses, governments, universities, and other complex organizations often rely on “expert judgment elicitation” methods to “assess products, systems, and situations for which measurements or test results are sparse or non-existent.” In none of these circumstances do we complain that the expert lacks complete information or that the judgment is infected by “subjective” elements—though all are fairly so characterized. Sometimes we get good results from these approaches, sometimes not—but more often good than not, which is why they’re so widely used, however imperfect they may be. Expert panel approaches can be infected with “groupthink,” “information cascades,” and other distortions. But even so, these approaches often give us far better information than any other approach available given the hard-or-impossible-to-measure nature of what we’re trying to assess. Not perfect information, to be sure, but better than no information, which is where we’d otherwise be. </p>

<p>What’s curious about the US News peer assessment rating is that it is, almost by definition, accurate. That is, all it’s trying to determine is the relative reputational standing of various colleges and universities among their peers. The only way we could possibly know that is to ask their peers. And that’s exactly what the PA does. So if school A scores a 4.0 in PA and school B scores a 3.0, then we can pretty safely say that school A is more highly regarded by its peers. (Of course, individual schools may try to overrate themselves or lowball their competitors, but across the full sample these distortions should cancel each other out). That doesn’t necessarily mean school A is objectively “better” than school B; only that it’s more highly regarded by its peers. And that tells us something.</p>
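
<p>(As an aside, the “distortions should cancel each other out” claim is just averaging over roughly symmetric noise. A quick, purely illustrative sketch with made-up numbers, not actual PA data:)</p>

<pre><code>
import random

random.seed(42)

TRUE_STANDING = 3.5   # hypothetical "true" peer standing on the 1-5 PA scale
N_RATERS = 300        # roughly the scale of a national peer survey

ratings = []
for _ in range(N_RATERS):
    bias = random.choice([-0.5, 0.0, 0.5])  # lowball a rival / honest / overrate
    noise = random.gauss(0, 0.3)            # ordinary disagreement among raters
    ratings.append(TRUE_STANDING + bias + noise)

average = sum(ratings) / len(ratings)
print(f"average PA score: {average:.2f}")   # lands very near 3.5
</code></pre>

<p>(If the biases were not symmetric, say, if everyone lowballed one particular school, the average would shift; the cancellation argument assumes the strategic over- and under-rating roughly balances out across the sample.)</p>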

<p>Where we might differ is on whether the PA rating tells us anything of value that we might want to consider in evaluating colleges. To my mind, it is a relevant factor, mainly because nothing else in US News comes even remotely close to telling us anything about the quality of the faculty. Based on what I know about what college presidents and provosts spend time worrying about, what they spend money on, and how they evaluate their own schools against their peers, faculty quality is their principal preoccupation, and consequently (I suspect) the biggest determinant of the PA rating. This is something they do know about and track very closely. It is something I want to know about when evaluating schools, and PA gives me at least a crude proxy for how a school’s faculty quality is perceived across the industry.</p>

<p>I’d also love to know which schools are held in high regard by employers who regularly hire college graduates. US News doesn’t give us that information because it would be too costly to assemble, and too controversial as to which employers to include in the expert panel; but if it could provide that information, many people would think it valuable and relevant, even though it would be just as “subjective” and based just as much on incomplete information as the PA rating. In some of its other rankings, US News does attempt to measure this. For example, in its law school ranking it uses not only PA but also a quality assessment rating by expert panels of lawyers and judges. In its medical school ranking, it uses not only a PA rating but also an expert panel rating by hospital residency directors, in effect the entry-level medical employers. It ranks MBA and graduate engineering programs not only by PA rating but also by an expert panel assessment by corporate recruiters—again, the entry-level hiring people. It ranks education schools using not only PA but also an expert panel assessment by school superintendents—again, entry-level hiring experts. Subjective? Sure. Based on incomplete information? Of course. Valuable? You bet.</p>

<p>^ That post should be stickied to the top of this board as the definitive explanation of Peer Assessment and why it’s valuable.</p>

<p>Don’t know about you guys, but it seems like we’ve worn out this topic. I heard next week the “17 Magazine” rankings come out. 90% of the scores are based on the stats for male profs from hotforteacher.com, and the other 10% are based on stats from howmyfemaleteacherdresses.com. See you back here in a week to discuss it. Modestmelody, you better be ready to defend the pantsuit-wearing professorettes at Barnard.</p>

<p>“Where we might differ is on whether the PA rating tells us anything of value that we might want to consider in evaluating colleges. To my mind, it is a relevant factor, mainly because nothing else in US News comes even remotely close to telling us anything about the quality of the faculty. Based on what I know about what college presidents and provosts spend time worrying about, what they spend money on, and how they evaluate their own schools against their peers, faculty quality is their principal preoccupation, and consequently (I suspect) the biggest determinant of the PA rating.”</p>

<p>Many posters here on CC feel that student selectivity and quality is the most important factor in determining which school they would or do attend. They see any school that does not have the highest-quality students in the country as inferior in some way. In a way, a lot of USNWR’s ranking reflects this attitude.</p>

<p>Haha Caltech Number 3???</p>

<p>Caltech is a FANTASTIC place to be a professor and/or do grad work, but for undergrad that seems ridiculous…</p>

<p>^ Care to justify?</p>

<p>Wasn’t Caltech #1 on USNWR a few years back?</p>

<p>^Yeah, before HYP complained.</p>

<p>I know this thread is old, but after combing through it you people make me sick. Honestly, who cares? Getting so ****ed off about a list really shows your maturity. It is pretty obvious that this list does not base its ratings solely on academics. For all we know, it is basing its list on the overall quality of the people these colleges produce. It seems a lot of you are angry because, for once, the godlike schools are not placing in the top 10 on a list. Maybe it is because you are all applying to those schools not for the people they produce, not because you like the people there, but for the prestige, so you can say, “Yeah, I go to Harvard.” That is shallow. A lot of you will deny this, but deep down it is the only reason anyone applies to Harvard or Yale, etc. One should judge a college by the kind of people it produces, whether the students there seem to mesh with you, and whether the professors are truly nice. My teacher was a professor at Harvard and all her children went there. She tells me it is, indeed, a great school. However, she also says the kids who go there are so self-righteous and cut-throat it is disgusting. She told me how kids would do ANYTHING for that extra point (for example, stealing notes). I realize this problem exists at all colleges; however, it is more prevalent at top-tier schools filled with shallow-minded students who are the product of their parents’ wealth. I find it hilarious that colleges like the Ivies weigh SAT scores so heavily despite score-buffing programs like Kaplan and Princeton Review that cost thousands of dollars, something not everyone can afford.</p>

<p>West Point is a school that offers brotherhood, growth in character, and a free education if you get accepted. The people who go to West Point are tested not only mentally but also physically. And when they graduate, they have to serve in the military, so all students at this school are going to risk their lives fighting for all of you, while students graduating from Harvard Business School sit behind desks fattening their wallets. (I only used West Point as an example.)</p>

<p>In the words of the inspirational Martin Luther King, Jr. “Intelligence plus character - that is the goal of true education.”</p>

<p>Cracks me up how people on this board get so upset about rankings of colleges that don’t fit USNWR’s… which is just a proxy for some ‘prestige’ that is self-perpetuated by the USNWR rankings themselves. To those who write things like, “But I’ve never even heard of X college,” I wanna say: and you are who, exactly? An employer? An academic? Who cares if Mr. Smith down the street hasn’t heard of a college? Seriously, do you really think one should choose a school because one has heard of it? Because it appeared in movies or sportscasts? Because you’ve seen the sweatshirt?</p>

<p>As others have aptly clarified, ‘best’ is however it’s defined. There is no absolute means by which to determine it. Every ranking system has its own unique criteria, and so long as those doing the ranking make their criteria clear, one can judge whether those criteria match one’s own priorities; but to snub one system over another is absurd! They are all bogus in their own way.</p>

<p>Those who give a hoot about ‘prestige’ and reputation can memorize the USNWR lists or any other ranking system’s. For those looking for something OTHER THAN prestige or whatever USNWR weights, these other ranking systems can be extremely valuable.</p>

<p>I frankly value having more and more different ranking systems. It shows how absurd blindly following one set of criteria or some news magazine really is. Maybe it’s time people started putting more emphasis on factors that are important to THEM, based on specific RESEARCH they have done related to their educational goals, rather than following like sheep toward a brand that, I assure you, is largely the same as the next brand… then again, I won’t get my hopes up. So far the US university system has managed to make many a student and their family think it’s a bargain to pay $50,000 a year for a degree.</p>

<p>To anyone who wants to justify this list, I tell you this: write a 100-page research paper in which 25 pages (25%) are cited entirely from Wikipedia (the equivalent of ratemyprofessor.com… probably worse). If you get a B (the grade a ranking like this would need to earn to be acceptable), you can come back and argue.</p>

<p>I would argue that it is at least as valid as the annual Princeton Review rankings, which are based on even less research than ratemyprofessor.</p>