Another article about US News Rankings

<p>OK - saw a link to it in the Huffington Post and thought you might be interested. It was in Inside Higher Ed, and hopefully this is the correct link!</p>

<p>'U.S. News' Adds Surveys That Could Alter Methodology :: Inside Higher Ed :: Higher Education's Source for News, Views and Jobs.</p>

<p>Read!! Enjoy!!</p>

<p>I like the standard response suggested for the HS counselors. </p>

<p>I'm glad that my 2 daughters didn't give a hoot about rankings - it made the process much less stressful for our family. </p>

<p>I glanced at them out of curiosity, but the more I read about the methodology, the less interested I became. It would be great if they would just fade away...</p>

<p>Arizona: 24
Indiana: 27
Massachusetts: 53
Tennessee: 42
Washington: 43</p>

<p>The numbers above show how many of the schools ranked in the USNWR Top 1600 High Schools are found in each of these states. Why these states? Because they have roughly similar populations. My question: does anybody really expect the high school guidance counselors in Massachusetts to have much knowledge about colleges in Indiana or Arizona, and actually be willing to rank those colleges over the local institutions they know so well? Or vice versa? So the states with the highest numbers on the Top High School list are going to have overwhelming influence in the voting. New York has 96 high schools on the list, and California has 178.</p>
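<p>To put that imbalance in perspective, here is a minimal back-of-the-envelope sketch (in Python, purely illustrative) that uses only the counts quoted above; the assumption that each school on the Top 1600 list casts exactly one counselor ballot is mine, for illustration only:</p>

<pre><code># Rough tally of how the USNWR "Top 1600 High Schools" list - and therefore
# the counselor survey pool - is spread across states, using the counts
# cited in the post above. One ballot per listed school is assumed here.

top_1600_counts = {
    "Arizona": 24,
    "Indiana": 27,
    "Massachusetts": 53,
    "Tennessee": 42,
    "Washington": 43,
    "New York": 96,
    "California": 178,
}

TOTAL_SCHOOLS = 1600  # size of the list mentioned above

for state, count in top_1600_counts.items():
    share = 100 * count / TOTAL_SCHOOLS
    print(f"{state:15s} {count:4d} schools  ({share:4.1f}% of the ballots)")

big_two = top_1600_counts["New York"] + top_1600_counts["California"]
print(f"\nNY + CA alone: {big_two} schools, "
      f"{100 * big_two / TOTAL_SCHOOLS:.1f}% of all counselor ballots")
</code></pre>

<p>Under that (admittedly crude) assumption, New York and California counselors alone would cast roughly 17% of the ballots, while Arizona's would cast about 1.5%.</p>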

<p>Peer Assessment scoring by college academics is bad enough and should be discontinued or at least broken off from the rankings, but subjective rankings by high school counselors may be an even worse idea.</p>

<p>"Total endowment per student" & "admission yield" are two categories being considered as additions to the USNews rating & ranking system which are of interest to me, although I suspect that they will, in large part, support the current rankings. But they would be useful in reducing the weight (25%) accorded the Peer Assessment (PA) component of the USNews methodology. The concern with the PA ratings is that they are subjective & unverifiable, and may be based on a university's graduate school successes & proliferations rather than on the qualities of the undergraduate programs.</p>

<p>Some would argue that an SAT score is highly subjective as it reflects family education and income as well as test prep.</p>

<p>What wonderful news!</p>

<p>Adding a group of mostly clueless individuals to the thousands who have admitted lacking the knowledge to offer valid opinions will obviously "save" the peer assessment. </p>

<p>What's next? Lining up 12 apes with darts in front of last year's report?</p>

<p>^^^ Lol!.......</p>

<p>I like this response.</p>

<p>"Thought You’d Want To Know
Here at the Manley Institute, we are very enthusiastic about this wonderful innovation by U.S. News and World Report ... and we have made a special note of it. The Manley Institute ranks news magazines once each year, mostly on the basis of (1) “reputation,” (2) several measures that are neither valid nor reliable indicators of readers’ knowledge of current world affairs, and (3) a mind-bogglingly naive objective function. This year’s top ten are ...</p>

<ol>
<li><p>Economist</p></li>
<li><p>Mother Jones</p></li>
<li><p>Maclean’s</p></li>
<li><p>Newsweek</p></li>
<li><p>Reason</p></li>
<li><p>Time</p></li>
<li><p>Insight</p></li>
<li><p>FrontPage Magazine</p></li>
<li><p>U.S. News and World Report</p></li>
<li><p>The National Enquirer</p></li>
</ol>

<p>Just thought you’d want to know."</p>

<p>"Total endowment per student" & "admission yield" are two categories being considered as additions to the USNews rating & ranking system which are of interest to me, although I suspect that they will, in large part, support the current rankings."</p>

<p>The first one supports the school that has the big arboretum to keep up and land in urban (read: eastern) areas, and may have little or no value-added to the student whatsoever. Or, put another way, as endowment per student increases, the value-added to the student from additional endowment decreases. (It's how you spend the endowment that counts.) At research-heavy state universities, for a student seeking to do research, value-added does not come from endowment, but from the ability of faculty to obtain outside grants that are specifically NOT endowment-additive. </p>

<p>Beyond the top 50-100 colleges and universities, admissions yield has mostly to do with the ability of a student/family to afford attendance, and almost nothing to do with quality. It's why the yield at some second-rate, non-flagship state universities is very, very high.</p>

<p>It sounds like a lot of high school counselors, especially at spendy private high schools, are embarrassed by how badly graduates of their high schools fare in getting into competitive colleges.</p>

<p>USNWR wants us all to believe that the information they solicit and compile is worth paying for. Much of what USNWR already publishes is available for free on sites such as collegedata.com, Princeton Review, College Board, College Navigator (the National Center for Education Statistics site), and U-CAN (the University and College Accountability Network). Since consumers are far less likely to pay for information that can easily be obtained for free, USNWR relies on the creation of proprietary information. The peer assessment data is the best example.</p>

<p>So now it seems USNWR is looking for other sources of proprietary information. It doesn't need to have real value; it only needs to have perceived value. USNWR is tapping another pool of perceived experts: HS college guidance counselors. I don't mean to disparage the work of GCs, but the insights they can provide are of limited value. What do they know about the outcomes of the students they send to different schools? (Are they even working at the same HS 4 years later?) That is the area where information is scarce.</p>

<p>They should use the cross-admit preference data... at least that is real money being played with. Rankings would not change much, except that a few (like WashU, Boston College and Emory) would drop at least twenty places, and some, like Amherst, would come in ahead of Penn, Duke, Cornell and Dartmouth.</p>

<p>Haha... just as bad as the opinion survey conducted in 1995 to assess "faculty dedication to teaching", which you happen to quote so often?</p>

<p>I gotta admit, I found USNWR pretty helpful for D #2. She was your average HS student and was applying to a bunch of the SUNY colleges. As D didn't want to spend 4 years in upstate NY, we did a search of 2nd-tier (for lack of a better term) public U's in the mid-Atlantic area. The magazine format made it quick and easy to identify schools that fit the bill.<br>
The magazine allowed us to pick out schools like Towson, West Chester, and Virginia Commonwealth that may not have been on the radar of New York parents.
So I have always seen an upside to the USNWR college edition.</p>

<p>But what drives a lot of us crazy is when we nitpick the difference between a Cornell and a U Penn, or Williams and Amherst. Come on - do we really need the magazine to inform us about each of these schools? There is enough information on these schools to make an informed decision.</p>

<p>And I wouldn't totally rule out the views of HS guidance counselors. A lot of families do stay in contact with the counselors, as there is often a younger sibling still in the school system. GCs do get feedback from their former students. That info is probably as useful as a lot of the other stuff US News uses to make an assessment.</p>

<p>
[quote]
In addition, the magazine is asking a series of new questions of college presidents, having them identify “up and coming” colleges, inviting them to offer suggestions on changing the relative weights the magazine uses in its rankings, and giving them a series of possible additional measures and asking which should be added to the methodology.

[/quote]
</p>

<p>This should be good news for you critics of the PA score...</p>

<p>
[quote]
thousands who have admitted lacking the knowledge to offer valid opinions will obviously "save" the peer assessment.

[/quote]

It's a fallacy to assume "thousands" admitted lacking the knowledge to offer valid opinions. If they didn't know, they were asked to indicate that. Of the more than 4,000 surveys sent, 49% were not returned. I'd hardly say they didn't complete the survey because they didn't know what was being asked... And it isn't as if those opinions were erroneously included in the results - they didn't return the survey, so no results were tabulated.</p>

<p>Let me rephrase my pejorative commentary: </p>

<p>"thousands who had the courage of admitting lacking the knowledge to offer valid opinions by ignoring the survey ... ."</p>

<p>And, UCB, I would not mind seeing a greatly expanded USNews PA with countless columns for everything under the sun ... all the opinions of the provosts, the presidents, the long-nailed secretaries who really complete the forms, the professors, the thousands of teaching fellows, even the chefs. As far as I am concerned, the first twenty pages of the annual USNews report could be devoted exclusively to a gigantic PA, and the mighty Berkeley could be printed in 14-point italics!</p>

<p>Then, of course, people who care about objective and verifiable assessments could start reading the pages that follow the main section. The PA zealots, on the other hand, could stop reading and simply bask in their newly created delusional grandeur. </p>

<p>At least, we could stop bickering about what matters or not, as we would be allowed to enjoy rankings that do NOT have to be rigged for the sole purpose of aiding public schools that might be dumped out of the top-30, or the first page altogether.</p>

<p>^ Well said and very poignant coming from a biased, pro-private school poster.</p>

<p>
[quote]
and the mighty Berkeley could be printed in 14 points italics!

[/quote]
</p>

<p>Berkeley</p>

<p>Just testing to see how it looks...:D</p>

<p>Ha, ha,</p>

<p>My son did not look at rankings at all. He read guide books, he visited and decided. He did not pick the highest ranked USN school he got into--didn't even know what the rankings were--he picked the one he liked the best. I'm glad he didn't drink the Kool-Aid. All of his schools were great, but why should he pick based on a list in a magazine?</p>