<p>One could make any argument – isn’t that what debate teams do? – but I would not make that one.</p>
<p>IMO, internationals are much more prestige-focused, primarily on colleges that offer strong research-based STEM programs (sorry, Mudders), and of course, Wharton for the “commerce” wannabes.</p>
<p>I did not know Stanford was so full of deadwood. In the US I think use of such rankings is FAR more popular among higher quality students. Many of the rest just go some place near home.</p>
<p>And some internationals love the attraction of sunny beaches, proximity to Rodeo Drive, not too hard admissions for full paying customers, and plenty of fancy cars to buy and drive. That brings UCLA and USC right in the middle of the action! Not to mention UCI and much lower ranked schools. Not all experiences have to be anchored by … academics. </p>
<p>Ask Alexandre about how applicants think in his world!</p>
<p>International includes more than China. There are other countries that are less obsessed with academic prowess and whose citizens possess beaucoup dollars. </p>
<p>By the way, I also included … ease of admissions in my description!</p>
<p>USNWR is prone to making simple mistakes. They got Berkeley’s OOS tuition wrong by over $10,000. That calls the integrity of their rankings into question. </p>
<p>On the USNWR website, the OOS tuition for Berkeley was listed as $25,056. As two Cal websites show, total tuition costs for an OOS student are around $35,000 (tuition plus the OOS supplement). </p>
<p>The mistake they made was adding the tuition/fees for one year (as determined by website 1) to the OOS supplement for one semester. They then added the student service fee and campus fee (as determined by website 2). </p>
<p>The total tuition cost from those mistaken calculations comes out to $25,055, within a dollar of the figure USNWR lists. </p>
<p>They did not include the other costs as determined on either website, such as the document management fee.</p>
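<p>To make the structure of that error concrete, here is a minimal sketch. The dollar amounts below are hypothetical placeholders, not Berkeley’s actual fees; only the shape of the calculation (a full year of tuition/fees added to a single semester’s OOS supplement plus the smaller fees) follows the description above.</p>

```python
# Sketch of the calculation error described above, using HYPOTHETICAL placeholder
# amounts -- these are not Berkeley's actual fees, only an illustration of the
# structure of the mistake.

annual_tuition_and_fees = 13_000   # placeholder: one year of base tuition/fees (website 1)
oos_supplement_annual   = 22_000   # placeholder: full-year out-of-state supplement
student_services_fee    = 1_000    # placeholder (website 2)
campus_fee              = 500      # placeholder (website 2)

# Mistaken total: a full year of tuition/fees plus only one SEMESTER (half) of the
# OOS supplement, plus the two smaller fees.
mistaken_total = (annual_tuition_and_fees
                  + oos_supplement_annual / 2
                  + student_services_fee
                  + campus_fee)

# Correct total: a full year of tuition/fees plus the full-year OOS supplement.
correct_total = annual_tuition_and_fees + oos_supplement_annual

print(f"Mistaken total: ${mistaken_total:,.0f}")   # lands well below the correct figure
print(f"Correct total:  ${correct_total:,.0f}")
```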
<ol>
<li><p>Tinkering isn’t good in longitudinal studies; you have to measure things in the same way every year, otherwise you can’t make meaningful comparisons across years. But USNWR has never claimed to be a longitudinal study, so I think this is moot.</p></li>
<li><p>Let’s not be obtuse and pretend that USN doesn’t bank on an assumption of precision and rigor that’s not there. People aren’t simply misinterpreting; USN is marketing themselves as rigorous and precise, which is why they publish their methodology, too.</p></li>
<li><p>The point is, it makes the rankings less than reliable because they’re based on faulty information.</p></li>
<li><p>No, but judging a school in large part on the amount of spending per student is - especially if there’s no indication of what that spending is FOR. Spending more money per student to attract proven professors or build new study spaces is a great thing. Spending more money per student to build a rock-climbing wall or re-plant the flowers every 2 weeks has no bearing on academic achievement.</p></li>
<li><p>It’s not a stupid idea just because it’s not currently practicable. It may be impractical, but it’s not stupid. And it’s still a valid criticism of the USN rankings. They’re all about input and very little is about output. But job placement after college would be an easy metric to add, since most colleges keep records on that.</p></li>
<li><p>I would argue that peer assessment is actually probably less important to students who desire an academic career than it is to students who want to go to work immediately, especially in prestige-driven fields like banking and finance. Graduate schools put less weight on where you went and more weight on what you did there. Besides, at this point nobody cares where I did my BA. But even if that were true, I would imagine that the number of students who intend an academic career is vanishingly small, so this is not a useful point.</p></li>
<li><p>Again, the author is using this as a reason to <em>ignore</em> the rankings, not a reason why the rankings are not good.</p></li>
</ol>
<p>And just to be clear - I don’t agree 100% with the author that we should completely ignore the U.S. News rankings. I think they are useful, as long as one doesn’t take the precise numbers too seriously and instead thinks about relative groups of schools/programs together. I just think that they are indeed flawed, and that exposing the flaws in the methodology is a useful exercise. Still, ANY ranking system (just like any methodology in the world) is going to be flawed in one way or another. I just think USN rankings should be used along with other information to make decisions about college choices, and should never be used in isolation.</p>
<p>The article in the Yale Daily News says “Yale is still on the podium.”</p>
<p>I’ve just added a revision to the prestigiosity scale in the old thread. Last year (or so) we discussed whether a separate scale was needed for STEM schools, but I haven’t overcome all the challenges involved in doing that, including caring enough.</p>
<p>Happened to be in Palo Alto last weekend, and driving down the street I came across a McLaren dealership! (At $250k per, you too can drive Formula 1 technology.)</p>
<p>Of course, McLaren also has a dealership in Beverly Hills. :)</p>
<p>How can it be a valid criticism when the data doesn’t exist (and may not for a decade)? How is it fair to criticize Morse for not collecting something which only the feds have the authority to obtain (but have chosen not to)?</p>
<p>More importantly, even if the outcomes data were available, it is pretty much meaningless without context. (Yes, vocational majors can get jobs. Yes, liberal arts majors struggle to get jobs.) But does that suggest that an unemployed Lit/Art [fill-in-the-blank] major at a school was not “educated” well? Should traditional liberal arts colleges be encouraged to open/focus on vocational majors to boost their job numbers?</p>
<p>That’s like saying the selectivity in choosing the pizza dough going into the oven is a valid means of measuring the quality of the oven. </p>
<p>Clearly, measuring the quality of ovens by looking at which ones have the most selective pizza dough going into them is ridiculous. It’s irrelevant. As such, the basis of your prejudice is flawed in the context of college rankings. We are ranking the schools, not the entrance stats… A good school will offer the tools one needs to succeed. Whether or not that person chooses to use those tools is a matter of personal choice.</p>
<p>“Clearly, measuring the quality of ovens by looking at which ones have the most selective pizza dough going into them is ridiculous.”</p>
<p>Yet I can easily imagine a correlation: Cheap dough could be found going into cheap ovens, and quality dough into quality ovens, because there is an external influence affecting both dough and oven.</p>
<p>The lack of relevance comes from offering a rather poor analogy, even if one could dispute the claim that the oven makes no difference. For instance, an oven that does not keep a uniform temperature might yield uneven pizzas. Further, the magnificent ovens that come from Italy and reach high temps are clearly MUCH better than the run-of-the-mill ovens you’d find at a cheap pizza parlor. </p>
<p>And, most importantly, if you have to compare ovens, you also need to ascertain the qualifications of the operators. You could compare the poorly trained and inexperienced “cook” who throws your pizza in the oven at Pixxa Hut to a TA or GSI, and then make a similar comparison to Pietro, the chef from Florence who has cooked gourmet pizzas for 30 years! That would be your experienced teacher who is paid to teach!</p>
<p>And lastly, you could add the quality of the dough. Some use cheap frozen dough; others use the finest flour from the best mills. The difference is the price! And you could add that the consumer who ends up eating the pizza might be different. After all, someone who can afford Spago will not go sit at a smoke-filled pizza joint, and will fully expect a gourmet pizza to be quite different. Here your educated customer can be the discerning and well-prepared student WHO can afford the choices provided by a superior qualification! </p>
<p>The reason I don’t think this is a good analogy is that for a college, getting better “dough” makes the “ovens” better over time as well–that is, getting better students leads to more prestige, which leads to recruitment of better faculty, and so on. I understand Xiggi’s views about TAs, etc., and share them to some extent–but honestly, is there any college with mostly really top-notch students that isn’t a very good school? I don’t think so. Obviously, some educate better than others.</p>