<p>For the top 50 or so schools, all they have to do is publish the % of full-payers and their median income, and they can junk the rest of the rating system (they'd get virtually the same result).</p>
<p>Stickershock: My point is that I completely believe that the "20-100" schools deliver an excellent education -- but with the abnormal fixation on the raw rank, you'd think that school "99" is pretty terrible or that "25" is many leagues below a top-five school.</p>
<p>Padad: when I was a HS senior in the 80s, I rec'd an unsolicited Yale app due to my high PSAT scores (I believe). I don't believe it's new that Yale (and others) have targeted the reliable pools of potential applicants. The fact that they use lists seems pretty basic to me. Yale actively searches for excellent students in underrepresented populations too (rural, URMs, etc.). They have the resources to do that, and I find no incompatibility with its mission -- if anything, it enhances it (IMHO). What has changed is what I perceive as society's abnormal adulation of the top-ranked schools.</p>
<p>Bluebayou: I agree with you. It's easy for me to just point out the biggest "bogeyman" in the room. And for a single source of good info, the USN&WR is pretty good. It just pains me to see the fixation on "lists" and "rankings" -- especially in the students and families I speak to. Like I said, I believe I'm tilting at windmills...</p>
<p>The answer to post #34 is that they all represent people conspiring to protect their own interests at the expense of competition which benefits the general public.</p>
<p>This forum has had countless entries knocking the SATs and the USNWR rankings. IMO neither is perfect, but both are useful. My D did not do very well on the SATs, so I guess I should have a tendency to discount the importance of standardized exams and point out their deficiencies. Actually, I think her scores did approximately reflect her level of academic achievement and potential. Maybe she should have spent more time preparing, and she probably should have taken the SATs more than once. Would it have made a difference? Probably not; data from the College Board indicate that students taking the SATs a second time score about the same. I look at the rankings in the same way. The methods may not be perfect, but it is very useful to have a means of comparing the relative academic level of the huge numbers of colleges and universities.</p>
<p>It is also important to remember that both the SATs and the USNWR rankings are approximate. I consider the old 1600-point SAT to be "accurate" within plus or minus maybe 50 or, at worst, 100 points. Almost every school has a 25-75 range of 200 points or more, so the SATs are a useful, approximate guide both for students trying to select colleges and for colleges trying to select students. I am not sure about the "accuracy" of the USNWR rankings. For the top 100 LACs or universities, I would guess plus or minus 10 or 20 places. Again, as long as we do not overinterpret these measures, they seem useful.</p>
<p>I'm reminded of Churchill's statement about democracy - that it's the worst form of government except for everything else. Rankings may have their problems, but how else do we poor parents find these "unnamed" but excellent schools? Not everyone uses the rankings to discover the "top 10" - most of us know who they are without them. Without USNews, I would never have heard of some of these wonderful colleges touted on this board (this was before I found this board). It is information.</p>
<p>Even the Fiske Guide and other guides "rank" - they give a number of stars to various items of interest, including academic rigor. Yes, I used those stars to compare schools. And they list the "best" values, the "best" public schools, etc.</p>
<p>We all rank things to some degree, and so do your children - "What's your first choice school?" As long as the rankings are not taken as carved in stone or some Holy Grail, they serve a useful purpose. Eliminate them, and what information do you propose replacing them with? Especially in selective college admissions, nature abhors an information vacuum; something else will pop up to give comparative data.</p>
<p>What did we all do in the days before USNWR began coming out with these rankings? I don't want to hearken back to "the good old days" when I attended college, which truly represented the "Dark Ages" because of the relative lack of information available. But there must be something filling the void between misleadingly specific ranking systems, on the one hand, and ignorance, on the other. </p>
<p>I think it might be useful to identify exactly which bits of information are relevant to students' and parents' decision-making, rather than those which describe the overall supposed "quality" of an institution in an absolute sense. Most of these are available in the Common Data Sets, but not all. School websites provide quite a few more of these datapoints (e.g., depth and number of courses offered in each department, faculty size and research interests, variety of residential arrangements, etc.), but they have to be "mined" for the information and are inconsistent in what they make available. Additional information that would be useful is available to enrolled students, but not usually to prospective ones (e.g., number of undergraduates majoring/minoring/taking classes in each department, students' ratings and comments about various professors and courses, scuttlebutt about which professors are coming and going and which departments/programs are slated for expansion and which for elimination, activities affecting esprit de corps among students of various majors, programs, living arrangements, etc.). Subjective data providing students' and parents' anecdotal impressions has the potential to be helpful in formulating thoughts about the "feel" of various schools. However, to be of much value the subjective information would need to be obtained through random sampling, rather than through the self-selecting means used by the several websites (this one and others) currently serving as sources for this kind of data.</p>
<p>A compilation of information of interest from the perspective of prospective undergraduate students (rather than that of administrators, magazine publishers, status seekers, and other number-crunchers) -- one that goes beyond getting in or attempting to summarize qualitative data in quantitative form -- would be very helpful. I, for one, would be very appreciative if colleges banded together and agreed to create a standardized format directing interested parties to this information on each of their websites. Then, if some commercial entity wanted to collect this information, it would provide a useful service. We need to change the question from "How does everyone feel about the way USNWR et al. have been operating?" to "What do students and parents need/want to know, and how can they best obtain this information?"</p>
<p>I hope those of you who buy into the rants against college rankings know that you are being manipulated. Information is essential in creating a competitive marketplace for colleges and universities. Merit aid, another target of these rants, is a key way in which colleges buy a crucial element of a good education: other bright students. If these colleges want to create a competitive source of data, that's great. If they combine to attempt to prevent colleges from participating in the USNWR data collection, they should and probably will be sued for illegal restraint of trade.</p>
<p>I don't think that most critics of USNews rankings have any problem with the free flow of information. You don't see much criticism of the National Center for Education Statistics, which provides at least as much information as the online version of USNews. The critique, mainly from within higher education, is that a single data point, the USNews ranking--that pesky left-hand column--has become so important to application numbers, yield, bond ratings, alumni satisfaction, etc. that colleges feel they have to play the game or suffer. Similarly, colleges look around and see peers allowing the rankings methodology to influence decisions on educational policy (SAT-optional policies and ED admit rates are two obvious ones), and wonder what the costs of not competing on that front are.</p>
<p>It's hard to fault USNews for publishing a hugely successful special issue; USNews has no obligation to do anything but sell magazines. But I also find it hard to fault college administrators for pushing back to the extent that they can: by deciding collectively (the only way it's going to happen) not to volunteer information to aid an enterprise that they believe is flawed and works against their institutional goals (to put it generously) or against their self-interest (to put it more cynically).</p>
<p>It would be interesting to hear someone with legal expertise weigh in on the similarities between this and the antitrust action against the Overlap Group in the early 90s. It seems like apples and oranges to me.</p>
<p>I don't see colleges doing much to provide any detailed information for prospective applicants as an alternative to the rankings. It seems that most college recruitment brochures are more like car ads. There are a lot of pictures of kids playing frisbee on the quad - even for those campuses which are covered with snow for the vast majority of the school year. At most they provide some of the favorable common data set information and feature some alumni. I also don't trust those colleges who want to avoid SAT ranges. Maybe the numbers don't matter for some of the quirky LACs. I do think they are generally important. The SAT ranges help to give an idea of the academic level of the student population. That can be very helpful in judging the level of competition an individual student will face and also gives an indication of the level of instruction.</p>
<p>Yale gives quite a bit of information- everything from legacies to state origins to popular majors to post-graduate activities.</p>
<p><a href="http://www.yale.edu/oir/open/index.html">http://www.yale.edu/oir/open/index.html</a></p>
<p>There are also schools like the College of the Atlantic, with its own message board, and <a href="http://www.mitadmissions.org/blogs.shtml">MIT</a> and <a href="http://uncommonapplication07-08.blogspot.com/">Chicago</a>, with admissions blogs, for more subjective views.</p>
<p>Most USNews critics wouldn't have any problem with a USNews ranking for "highest SAT 25-75% range," or even "most selective." It's the overall ranking coupled with the title "America's Best Colleges" that's problematic.</p>
<p>An aside: surely one of USNews's statisticians could come up with a more sophisticated formula that would stop a lot of the SAT-optional rank-steering. The idea would be basically that an SAT-optional school that reports 80% of their scores gets a slight downward adjustment based on the lower scores that you can impute to the 20% of non-submitters. A college that submits only 50% gets their scores adjusted downward by a larger amount, and so on.</p>
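<p>To make the idea concrete, here's a rough back-of-the-envelope sketch of what such an adjustment might look like. The 100-point gap imputed to non-submitters is purely an assumption for illustration -- nothing USNews publishes -- and the function name is mine:</p>
<pre>
# Hypothetical sketch of the proposed adjustment for SAT-optional schools.
# Assumption (not USNews methodology): non-submitters are imputed to score
# a fixed amount below the school's reported median.

def adjusted_median_sat(reported_median: float,
                        fraction_reporting: float,
                        imputed_gap: float = 100.0) -> float:
    """Blend the reported median with an imputed score for non-submitters.

    reported_median    -- median SAT among students who submitted scores
    fraction_reporting -- share of the class that submitted scores (0..1)
    imputed_gap        -- assumed points by which non-submitters trail the
                          reported median (a guess, purely for illustration)
    """
    imputed_score = reported_median - imputed_gap
    return (fraction_reporting * reported_median
            + (1 - fraction_reporting) * imputed_score)

# A school reporting 80% of scores takes a slight hit...
print(adjusted_median_sat(1350, 0.80))  # 1330.0
# ...while one reporting only 50% is adjusted downward by more.
print(adjusted_median_sat(1350, 0.50))  # 1300.0
</pre>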
<p>Here's more from SLC's Prez:</p>
<p>
[quote]
By Michele Tolela Myers
Sunday, March 11, 2007; Page B07</p>
<p>Like most college presidents, I have seen many prospective students and their parents show up on campus in recent months, clutching their well-worn copies of U.S. News & World Report's rankings issue. U.S. News has smartly tapped into students' need to sort out colleges and universities in a rational way. Parents, who face increasing college costs, understandably want to know where best to make that expensive investment.</p>
<p>U.S. News benefits from our appetite for shortcuts, sound bites and top-10 lists. The magazine has parlayed the appearance of unbiased measurements into a profitable bottom line.</p>
<p>The problem is that the U.S. News college rankings are far from reliable.</p>
<p>Turns out that some of their numbers are made up. I know that firsthand. Two years ago, we at Sarah Lawrence College decided to stop using SAT scores in our admission process. We didn't make them optional, as some schools do. We simply told our prospective students not to bother sending them. We determined that the best predictors of success at Sarah Lawrence are high school grades in rigorous college-prep courses, teachers' recommendations and extensive writing samples. We are a writing-intensive school, and the information produced by SAT scores added little to our ability to predict how a student would do at our college; it did, however, do much to bias admission in favor of those who could afford expensive coaching sessions.</p>
<p>Since we dropped the SAT altogether, we no longer provide SAT information to U.S. News & World Report. Our two years' experience with this practice has been very good. Faculty members report that our students continue to be terrific. Their average high school grades, high school ranks and grades in Advanced Placement courses have not changed.
[/quote]
</p>
<p>Here's the BEST way for US News to deal with this bunch of malcontents and iconoclasts: dump all of them in a new category called the Thacker Index -- where they'll be ranked with all the remaining institutions that play games. Led by Middlebury, SLC, and other SAT- and correct-data-hating schools, the TI category should be quite impressive, in its own way. Over time, that new category, which according to the EC snake-oil salesman might add up to 570 schools, could become the largest category in US News.</p>
<p>Right now, the rebels want it both ways: provide incomplete information or "Vermont" data, aka manipulated statistics, in an attempt to boost their rankings AND do so with total impunity. The esteemed Prez of SLC has it right: "Turns out that some of their numbers are made up. I know that firsthand." Only problem is that the numbers are made up by the schools! The question that begs for an answer is, "Why would a school that plays fast and loose with the rules be listed next to the fair players?"</p>
<p>The solution of separating schools that do not play fair is quite effective. Tell the schools they'll be listed in a separate category and listed alphabetically. After all, the schools should be happy to be mixed with their true peers.</p>
<p>However, the chances of that happening are slim to none. The schools that so criticize USNews are way too obsessed with the IMPACT of the rankings. Placing them where they belong is NOT what they are after!</p>
<p>xiggi, you know darn well that Myers would object to the alphabetical listing, as S falls rather late in the alphabet. She'd probably do what many businesses do to get first listing in the Yellow Pages: Just add a few As to the name. Pretty soon it will be a war to see who gets listed first. AAAAAAAAAA Sarah Lawrence? or AAAAAAAAAAAA Middlebury. Next year it's AAAAAAAAAAAAA Sarah Lawrence. And so on.</p>
<p>Good point xiggi. The data that schools submit to bond rating agencies is usually accurate because there's a legal penalty for being disingenuous there. Now there's an interesting source of data that doesn't get mentioned much on CC: the Chronicle of Higher Ed bond rating updates. They're interesting because they tell you what factors led to changes in bond ratings. </p>
<p><a href="http://chronicle.com/money/bondupdate.htm">http://chronicle.com/money/bondupdate.htm</a></p>
<p>On a side point: I think it's wrong to conflate schools that find the rankings troublesome with those that manipulate the data. There are schools that are quite scrupulous in their reporting and who still condemn the rankings.</p>
<p>Stickershock, HAAAAAAAA HAAAAAAAA </p>
<p>You probably would see a lot of subtle renaming such as College of xxx instead of XXX College. </p>
<p>And, fwiw, I have to agree that "it's wrong to conflate schools that find the rankings troublesome with those that manipulate the data. There are schools that are quite scrupulous in their reporting and who still condemn the rankings."</p>
<p>Schools that have discovered the weaknesses of the USNews methodology aren't interested in criticizing the model anymore. The fact that LOWER SAT scores might increase the overall ranking -- by lowering the predicted grad rate and thus boosting the school's apparent overperformance -- has probably escaped the SLC officers. In addition to analyzing the impact of Middlebury's yo-yo reporting, SLC should study the reports of Wellesley, Smith, and other non-coed schools in greater detail, and then see what stratospheric SAT scores do to Pomona and Harvey Mudd. Dropping 200 SAT points might be the best thing that happened to SLC. :)</p>
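<p>For anyone who wants to see that mechanism spelled out, here's a toy calculation. The slope and intercept are invented -- USNews does not publish its graduation-rate regression -- but the direction of the effect is the point:</p>
<pre>
# Toy illustration of the "predicted vs. actual graduation rate" component.
# Invented coefficients, NOT the actual USNews model. The point: lower
# reported SATs lower the *predicted* grad rate, so an unchanged actual
# rate looks like bigger overperformance, which helps the overall score.

def predicted_grad_rate(median_sat: float) -> float:
    """Hypothetical linear model: higher SATs imply a higher predicted rate."""
    return 0.05 * median_sat + 10.0  # made-up slope and intercept

actual_rate = 85.0  # the school's real six-year grad rate (percent)
for median_sat in (1400, 1200):  # same school, reporting 200 fewer points
    gap = actual_rate - predicted_grad_rate(median_sat)
    print(f"median SAT {median_sat}: predicted {predicted_grad_rate(median_sat):.0f}%, "
          f"overperformance {gap:+.0f} points")
</pre>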
<p>"An aside: surely one of USNews's statisticians could come up with a more sophisticated formula that would stop a lot of the SAT-optional rank-steering. The idea would be basically that an SAT-optional school that reports 80% of their scores gets a slight downward adjustment based on the lower scores that you can impute to the 20% of non-submitters. A college that submits only 50% gets their scores adjusted downward by a larger amount, and so on."</p>
<p>That is exactly the kind of misconception that SLC was trying to correct. My D applied to SLC last year and did not submit her SAT score (1580). She was attracted by the writing emphasis of the school. The score was irrelevant, and SLC never bothered to ask.</p>
<p>The theory of adverse selection would indicate that, on average, the opposite would occur.</p>
<p>Marathonman88,</p>
<p>I like your adjustment procedure suggestion for USNWR. Less punitive and more accurate.</p>
<p>
[quote]
That is exactly the kind of misconception that SLC was trying to correct. My D applied to SLC last year and did not submit her SAT score (1580).
[/quote]
</p>
<p>PaDad. I'd agree: Sarah Lawrence really is a special case, since they're not SAT-optional. It's hard to come up with a fair way to treat SLC that neither rewards nor punishes their lack of data. Either using old data or eventually dropping SAT scores from their selectivity index altogether seems like it wouldn't skew things by nearly as much as when an SAT-optional school gets away with effectively withholding the scores of the lowest 49% of their cohort!</p>
<p>Talk about the "good old days...." Let's see, I had a college advisor who knew basically nothing about any college except the local city colleges and SUNY schools, who thought that because I wanted to be a math major I should go to an engineering school (I wanted to be a theoretical mathematician for goodness sakes!), who completely ignored the fact that I wrote poetry and edited the HS literary magazine and who told my parents that letting me apply to MIT would be a "waste of the $25 application fee." There was no internet to use to learn about colleges, no money to use for visits ... just some books to leaf through in the college advisor's office while he glared at you and the public library where you could hand copy addresses and hand write a letter asking a college for more information.</p>
<p>I use the USNWR rankings all the time as a quick way to find the information I want all in one place ... but of course everyone is right that it is the one-dimensional ranking that rankles....</p>
<p>I know the Princeton Review Counselor-O-Matic gives very, very broad ranges of matches, but I always loved how you could keep slicing and dicing the matching different ways. I would LOVE it if USNWR just left off the single "overall" ranking number and let you pick different ways of ranking based on the data... just as you can now rank by SAT score or retention rate or even peer ranking... and they could set it up so you could use some or all of those factors... and if you clicked all of them maybe you'd get the same overall ranking, but it wouldn't be so OMNISCIENT!</p>