USNWR May Change Methodology to Include High School Counselors in Reputation Index

<p>Ratings Changes by Daniel Luzer | Washington Monthly</p>

<p>What May Change in Upcoming College Rankings - Morse Code: Inside the College Rankings (usnews.com)</p>

<p>
[quote]
In addition, I discussed methodology changes that U.S. News is considering:</p>

<p>—We may add high school counselors' rankings of colleges as part of the academic reputation component, which is now 25 percent of the America's Best Colleges rankings. To do this, we would reduce the weight of the regular peer assessment survey for the National Universities and Liberal Arts Colleges categories.</p>

<p>—We are considering combining the scores from the current peer assessment survey rating done by college academics with the scores and high school counselors' ranking of colleges. That combination of scores could be called the "undergraduate academic reputation index."</p>

<p>—We are considering adding the admit yield—the percentage of students that the school accepts that enroll at that school in the fall—back into the rankings. Yield is a very good proxy for student views, because it's how much students value their acceptance from that particular college. If yield is added back into the rankings, it will be part of the undergraduate academic reputation index variable.</p>

<p>—We may slightly increase the weight of the "predicted graduation rate" that currently accounts for 5 percent of the National Universities and Liberal Arts Colleges rankings. The predicted graduation rate has been a well-received variable by some higher education researchers, because it measures outcome and rewards schools for graduating at-risk students, many of whom are receiving federal Pell grants.</p>

<p>—We are contemplating eliminating the Third Tier from all the National Universities, Liberal Arts Colleges, Master's Universities, and Baccalaureate Colleges rankings tables in print and online. We would extend the numerical rankings to the top 75 percent of all schools in each category, up from the top 50 percent now. The bottom 25 percent of each category would still be listed alphabetically, and that group might be renamed to something like the 4th Quartile. We believe that the data is strong enough to numerically rank more schools, and the public is asking for more sequential rankings, since they're less confusing than listing schools in tiers.

[/quote]
</p>

<p>With all the folks fighting about (who would have guessed it) Michigan and Berkeley, etc., in the thread started about the metrics in USNWR, I'm surprised no one seems to have posted about these proposed methodology shifts.</p>

<p>If they ask the counselors at our high school, the UCs should zoom up in the rankings. Our counselors’ definition of success in college admissions is getting offered a spot at a UC, any UC. </p>

<p>At our school if you want to shoot for HYPSM you are going to have to figure most of that out for yourself, since the Guidance staff has only general notions about how to pull that off. In fact one year one of the guidance counselors asked <em>us</em> for advice on how to get her high-performing kid into an Ivy League school (since she knew we had previously succeeded). Wasn’t it supposed to work the other way around?</p>

<p>Signs of desperation are starting to emanate from the Morse Code. </p>

<p>The more information becomes available from a variety of sources, the more it becomes obvious that while USNews did a laudable job in presenting the information to Joe Six-Pack, it also relied on a flawed methodology and encouraged the outright manipulation of data. </p>

<p>Morse knows he can’t continue to rely on a much-maligned instrument such as the PA. Considering all the (well-deserved) negative publicity (read: rebellion of schools and fraudsters being caught), Morse has to find new “blood.” The saddest part is that the added contribution of the GCs will probably make the index … better. And we all know how true Coureur’s account is. The one-eyed leading the blind!</p>

<p>As far as the other changes, how interesting it is to see the return of the yield. Why did they drop it in the first place? Was it not because of hints of manipulation? Should we believe that the colleges have abandoned their schemes to boost the yield figure? </p>

<p>And how about raising the expected graduation rate index? A well-received metric? Yes, for sure! And also one of the most flawed and poorly designed in the entire table. Of course, rewarding lower admission standards and easier graduation paths offers such a great correlation to … quality of education. Starting with lesser students and graduating them through low standards and cheap education is such a contribution to better education!</p>

<p>How pathetic!</p>

<p>Well, I think the argument could be made that graduating students above the expected rate indicates good performance by the university. In my opinion, it’s one of the few ways one could actually evaluate a university as a whole.</p>


<p>This would almost certainly render Tufts Syndrome more problematic.</p>

<p>This idea is ridiculous. HS counselors mostly assist with college applications but are not actually involved with universities themselves. I think this is an attempt to make the rankings harder to game for schools that are trying to move up, because counselors are less familiar with many schools than college faculty are, and thus their idea of what is prestigious or of quality will be more fixed.</p>


<p>It will also boost BYU’s ranking, since it recently overtook even Harvard in yield.</p>

<p>^ Really? I suspected that BYU would have a high yield, but I didn’t know it was that high.</p>

<p>^^Yes, according to the current figures BYU has the highest yield of any national university, slightly ahead of Harvard. Of course they also have a very self-selecting applicant pool, which is one of the weaknesses of using yield.</p>


<p>Noimagination, may I ask you if you have researched how this index is developed?</p>

<p>I’m not defending this precise implementation of an expected graduation index. I just don’t agree that it’s a bad idea in theory.</p>

<p>Rant warning…</p>

<p>In the last USNWR, they asked the high school counselors which colleges they thought offered the student the best education. When will USNWR wake up and ask the people with the most at stake? The students, of course. </p>

<p>And if they want to make it relevant to outcomes, ask employers. They can give you an earful on how well (poorly!) most colleges prepare their graduates for post-college life. </p>

<p>Academia costs so much and spends so much time patting itself on the back. How in the world are these people so unaccountable? And how do they perpetuate grading systems where they’re the graders of which institution is great while another is merely good? Great? Good? For whom? For them or for the student? </p>

<p>USNWR — just give us the data and get out of the way.</p>

<p>I actually like both of these ideas, but they would have to be done well. To my knowledge, there are no existing legitimate implementations of these ideas.</p>

<p>Although I am not much concerned with rankings, I sincerely hope they do not include counselors’ opinions. I’m sure there are many knowledgeable guidance counselors, but they cannot possibly be aware of all top colleges outside of their region. Heck, I live in Pennsylvania and my counselor had never heard of Carnegie Mellon and thought that only the valedictorian could get into Penn State-UP. I feel this addition would affect the rankings poorly.</p>

<p>In terms of yield, I think it may be a good addition. CMU’s ranking would go down, however.</p>

<p>How about only counting the votes of counselors of schools that send at least 25 percent of their graduating classes to Tier 1 Unis+LACs/a select group of schools?</p>

<p>Isn’t picking “a select group of schools” the theoretical purpose of rankings?</p>


<p>Problem is, that group would be disproportionately drawn from well-to-do public and private schools, as well as from areas without a strong state flagship that “siphons off” lots of top students. For example, in much of the Midwest, the state flagships are seen as excellent choices, and they siphon off students who would otherwise be going to the top schools. A graduating class that sends most kids to Michigan or Illinois or Wisconsin might be just as strong as the graduating class from a public school in New Jersey, but under your metric their counselors wouldn’t be counted. </p>

<p>I don’t see how most HS counselors are in any position to judge the quality of universities anyway.</p>


<p>I agree. The guidance counselor idea should not make it to the actual rankings.</p>

<p>It would work in the sense that these rankings are targeted toward fairly well-to-do people who live in fairly well-off areas. And, really, one would have to be relatively privileged to entertain oneself with such size-measuring contests.</p>

<p>I agree. How are they qualified?</p>