Sarah Lawrence & US News - another monopoly


<p>A very cogent argument, a useful parallel, I think. I only refer to Consumer Reports for things about which I am completely ignorant, and don't care about. When it comes to cars or cameras or computers I find that CR's criteria are only vaguely similar to mine, and they have little value to me. The good thing is that they publish their criteria so I can at least decide to ignore some or all of their evaluation. When I was looking into major appliances -- about which I know absolutely nothing -- they were a good starting point.</p>

<p>When WashDadJr started the college process last year I referred to USN&WR quite a bit. It's really good for factual information, for example. As I've learned more about the process, the schools, and my son's profile as a college candidate, I worry much less about USN&WR's ratings.</p>

<p>What would be really handy would be a spreadsheet with all of the US News data, with the ability to modify the weightings of the different criteria. I suppose I could do it myself, but I just don't feel like it. :)</p>

<p>d:</p>

<p>I also would like accurate data. But what does one do in the absence of accurate data? Well, you could drop the school entirely off the list. I guarantee you that would lead to a huge amount of screaming. You could give zero points for any item for which you have no data. Another huge amount of screaming. You could give average points, which will incent every college that is below average on one measure or another to stop reporting that number. You can take historic numbers and use them year after year, which is also not accurate.</p>

<p>Or, you can give partial points in a way that won't incent many schools to go to non-reporting of data.</p>
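The trade-off among these options can be sketched numerically. Below is a toy scoring function comparing "drop the school," "give zero," and "give partial credit" for a withheld metric; the weights, the 0-100 metric scale, and the half-credit factor are invented for illustration and have nothing to do with US News's actual formula:

```python
# Toy illustration of scoring policies when a school withholds a metric.
# Weights, the 0-100 scale, and the 0.5 partial-credit factor are
# made-up assumptions for this sketch, not US News's real method.

def score(metrics, weights, missing_policy="partial"):
    """Weighted score; each metric is assumed to be on a 0-100 scale."""
    total = 0.0
    for name, weight in weights.items():
        value = metrics.get(name)
        if value is None:
            if missing_policy == "zero":
                value = 0.0
            elif missing_policy == "partial":
                value = 50.0 * 0.5  # half the credit of an average metric
            elif missing_policy == "drop":
                return None  # school falls out of the ranking entirely
        total += weight * value
    return total

weights = {"sat": 0.5, "retention": 0.5}
reported = {"sat": 80.0, "retention": 90.0}
withheld = {"sat": None, "retention": 90.0}   # SAT data not reported

print(score(reported, weights))               # 85.0
print(score(withheld, weights, "zero"))       # 45.0
print(score(withheld, weights, "partial"))    # 57.5
print(score(withheld, weights, "drop"))       # None
```

The point of the sketch: "zero" punishes non-reporting harshly, "drop" removes the school from view, and "partial" lands in between, which is the option being argued for above.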

<p>Hey, I would like independent audits. I don't believe certain numbers (like Berkeley's 99% of students in the top 10% of their class). But I'll take what I can get.</p>

<p>mini:</p>

<p>Yes, I'm very aware of the NSSE and very aware that many schools either don't participate or keep the results to themselves. Are you seriously suggesting that US News forego data that are available for the vast majority of schools in favor of data that are not? </p>

<p>As for your assertion that making up questions (sic) and administering a questionnaire is somehow "wrong," thank you! You've just blown practically all social sciences right out of the water. I can go home now. Nothing to be done here.</p>

<p>"I also would like accurate data. But what does one do in the absence of accurate data?"</p>

<p>You don't make up inaccurate data.</p>

<p>I hope that is not what researchers, professors, and students do when they don't have accurate data. I hope they don't make up the data.</p>

<p>I also think there should be a big disclaimer from USNWR: "The data we use to rank schools have never been proven to actually measure the quality of an education. So the rankings we use are fiction. And most of the data is true, and we hope you find it useful, but some of it we made up."</p>

<p>Oh please. Researchers use proxy data all the time when real data aren't available. Proxies are always based on assumptions, which are given in the section on methodology, most often with some discussion of the strengths and weaknesses of the various approaches.</p>

<p>But, hey, if dstark thinks that US News should drop its entire system because dstark disagrees with it, there's not much I can do to counter that.</p>

<p>Personally, since SLC has not left me with the choice of getting reported data, I would either drop them from the rankings entirely or give them a zero for the non-reported data, but I'd prefer dropping them.</p>

<p>"But, hey, if dstark thinks that US News should drop its entire system because dstark disagrees with it, there's not much I can do to counter that."</p>

<p>I didn't say that. Don't put words in my mouth.</p>

<p>I like that USNWR publishes data. I don't want the data made up.</p>

<p>dstark:</p>

<p>Then recommend an option in the absence of data. Drop SLC? Give a zero for missing data? What?</p>

<p>dstark:</p>

<p>How would you like SLC listed? Should USN&WR go for alphabetical listing and include only data that was reported by colleges? In the case of SLC, the SAT score data would be omitted. Would that be better?</p>

<p>To all:</p>

<p>How is peer assessment conducted? Is Harvard's dean supposed to assess Albertson College and vice versa, or is the peer assessment conducted by self-reported peers (i.e., Yale, Princeton et al. assessing Harvard but not Albertson)?</p>

<p>My last comments on CU or cars, since I've always been told that if you argue with pigs in the mud you both get dirty but the pig enjoys it. And we're obfuscating the real issues just to hear ourselves talk.</p>

<p>But that is the issue: CU doesn't report how their results are compiled. And the average consumer obviously doesn't want a car "much worse than average" in electrical problems. If CU had said that 90% of those problems were broken antennas (and remember, they won't divulge even how many people rated the Jag versus how many chimed in on their Camrys, and even ardent statistical apologists like Tarhunt will tell you an insufficient sample size would disqualify much of the results), maybe people could have come up with reasonable conclusions. But if US News isn't telling people that it simply made up figures for SLC's SATs, that's just being disingenuous.</p>

<p>And we've dumbed down cars the same way we've dumbed down the college selection process. "Better" cars are self-selected, like some LACs, simply because a handful of thinking, breathing humans might actually enjoy the experience of driving a vehicle that rewards them with superior performance and handling. (That's not snobby; many of those cars work just fine at 100,000 miles and cost less than a new Kia.) But the appliance driver doesn't care. So the compromise driven by JD Power surveys and CU surveys is that manufacturers dumb down service intervals so owners don't have to come in for service and have reasons to complain. No one can tell me lifetime transmission lubrication is good for cars, or that 100,000 miles between spark-plug changes is better for fuel economy or performance. If you agree with those posits, then you know nothing about mechanics. Crashworthiness is a different issue, a moving target, and has caused Korean manufacturers to design cars around how they'll be tested by the IIHS, then slap a 10-year warranty on them knowing they'll just void it for lack of service by then. But it doesn't improve the driving dynamics, or durability, or increase active safety; just passive, belt-yourself-in-and-crash-into-a-wall safety. That's what statistics give you. Sure, they may drive the worse products to improve, but those makers are "teaching to the test," as WUSTL is now accused of doing when it provides merit aid to boost its student-body stats. It's still up to the Banfieldian "upper class" to determine which among the other schools suits them best: do they value horsepower over handling? Choose the Dodge Magnum school. Bling in facilities over all else? The Chrysler 300 school. Long-term quality of life and an experience that meets the demands of a real driver? Go with the European makes. Appliance? Toyota, of course.</p>

<p>Done.</p>

<p>I would provide a list of the schools, with known data, without a ranking. </p>

<p>So SLC would have a blank where SAT scores would normally be listed. </p>

<p>It seems pretty obvious which schools have wealthy student bodies. Many students want to go to schools with wealthy student bodies. I like that category, and would like to see it added along with percentage of kids on Pell Grants.</p>

<p>I like PA. Why do I like it? Because my old school and my daughter's school do well in that category. I am glad somebody has high opinions of the schools. I don't know who, but somebody. :)</p>

<p>If those schools drop in that category, then PA should go. ;)</p>

<p>dstark:</p>

<p>There's a lot of merit in such a listing. But the downside is that it is most helpful to readers who already know the names of the schools whose data they want to look up, and least helpful to applicants who would like to compile a list of colleges they might not have thought of.</p>

<p>Anyone can look up the data on HYPSM even if they are listed very far apart--and be confirmed in the impression that these are great schools. But it would take serendipity for a reader not from the Midwest to look up St Olaf or Grinnell unless that reader was already pretty knowledgeable about colleges, let alone identify colleges to which these should be compared.</p>

<p>If it hasn't already been answered: USNWR reports scores from test-optional schools and adds a footnote indicating that the test is optional. You can go to the Common Data Set for the school to find out the percentage reporting.</p>

<p>No school can hide behind the "we don't use them, so we can't report them" tactic.
They have complete authority to ask students to report their scores after admission but before matriculation. I would be curious how many of the parents or students following this debate have personal knowledge of a student who applied to selective colleges but took neither the SAT nor the ACT. I know lots of folks don't submit them, but how many never took them?</p>

<p>dstark,
Your comments on PA are right on the money. Finally, I agree with one of YOUR posts!! :)</p>

<p>You can still group the schools without rankings.</p>

<p>You can use the size of the schools, geography, the wealth of the student body, the number of applications, etc. There are many ways to group the schools.
You can group the schools using SAT scores. ;)</p>

<p>d:</p>

<p>So, what you're suggesting is that US News simply provide a listing of colleges along with the data from those colleges? And this would be different from any other thick book with data organized by state or whatever in what way? And why would I buy that book over one of the other books?</p>

<p>I will repeat. As long as I know the methodology, the ranking is useful. I can quickly see that, say, Northwestern must be fairly high in the factors used for the methodology and that, say, the University of Alaska Fairbanks is not so high. Then, I can look at the numbers.</p>

<p>The organization is useful.</p>

<p>"And why would I buy that book over one of the other books?"</p>

<p>You won't. That's why the rankings exist.</p>

<p>Bingo! They exist because people want them organized that way. Supply and demand.</p>

<p>But that doesn't make the rankings accurate.</p>

<p>When did I say people don't want the rankings?</p>

<p>They do.</p>

<p>I used to love college football rankings.</p>

<p>But then I grew up. ;)</p>

<p>I think USNWR should add a feature to its online version of the rankings that permits users to assign their own weights (including zero) or even to add their own criteria with quantitative measures so that folks could create their own rankings. They could even let the user decide how much to "punish" non-reporting or partial-reporting schools. They would still have the USNWR rankings, but you could see how much difference your own preferences made.</p>
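A minimal sketch of what such a re-weighting feature could compute. The schools, metric names, and every number below are invented placeholders (not actual USNWR data); the point is just that the ordering can flip when a user zeroes out a criterion they distrust:

```python
# Re-rank schools under user-chosen weights. All figures below are
# invented placeholders for illustration, not actual USNWR data.

schools = {
    "College A": {"peer_assessment": 90, "sat": 85, "graduation": 95},
    "College B": {"peer_assessment": 70, "sat": 95, "graduation": 90},
    "College C": {"peer_assessment": 80, "sat": 75, "graduation": 85},
}

def rank(schools, weights):
    """Return (name, score) pairs sorted best-first under `weights`.

    Weights need not sum to 1; they are normalized here, so a user
    can simply set a criterion's weight to zero to ignore it.
    """
    total_w = sum(weights.values())
    scored = {
        name: sum(weights[m] * metrics[m] for m in weights) / total_w
        for name, metrics in schools.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Equal weights vs. a user who zeroes out peer assessment:
print(rank(schools, {"peer_assessment": 1, "sat": 1, "graduation": 1}))
print(rank(schools, {"peer_assessment": 0, "sat": 1, "graduation": 1}))
```

With equal weights, College A comes out on top; drop peer assessment and College B moves ahead, which is exactly the kind of sensitivity a do-it-yourself weighting tool would expose.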

<p>I believe Business Week is doing this with their online undergrad business school rankings.</p>