Education Sector: New Ranking System on the table

<p>From the Education Sector webpage:</p>

<p>"Higher education is a complex endeavor. A rankings system can only succeed if it can reflect that complexity accurately and fairly, by combining information from a variety of sources. With the advent of NSSE, the Collegiate Learning Assessment, outcomes-based accreditation, and new data about graduation, employment, and life outcomes, that critical mass of data now exists. There is now enough information to create sophisticated rankings of higher education quality to replace the wealth-exclusivity-fame paradigm of the U.S. News rankings."</p>

<p>This new ranking paradigm proposes to put a bright spotlight on the issues of affordability and transparency in higher education and to provide students and parents with "more useful information for choosing colleges".</p>

<p>"The new rankings would help shift the market dynamics from price to value. Value measures compare benefits to price. But students currently have little or no information about real benefits in terms of learning outcomes, and prices—particularly among private colleges that can charge what they like—tend to be about the same. The U.S. News college guide perfectly illustrates the current lack of real value measures in higher education. Under the heading of “Great Schools, Great Prices,” U.S. News lists the top five “best values” among national universities as Cal Tech, Harvard, Princeton, Yale, and MIT—five of the top seven overall universities absent price. The top five “best value” liberal arts colleges are Williams, Amherst, Wellesley, Pomona, and Swarthmore—also five of the top seven on the main list."</p>

<p>From what I read on CC, USNWR rankings are both revered and reviled. So the question is, do we need yet another ranking system, and if we do, is this the "one" we need?</p>

<p><a href="http://www.educationsector.org/supplementary/supplementary_show.htm?doc_id=404246">http://www.educationsector.org/supplementary/supplementary_show.htm?doc_id=404246</a></p>

<p>I can't help wondering at what point all this information just gets to be too much - and if a new project to reform the USNWR rankings will make any kind of dent in the prevailing system. In any case, I culled the following article on the new ranking system from the ES site (a great deal of the funding for this project is from the Bill and Melinda Gates Foundation).</p>

<p>"College Rankings Reformed:The Case for a New Order in Higher Education"</p>

<p>"The failure of the U.S. News rankings to provide colleges with incentives to improve the quality of their teaching is one reason why studies have found that many American collegians aren't learning what they need to know. In a recent report on college-student literacy, for example, the Washington, D.C.-based American Institutes for Research revealed that only 38 percent of graduating seniors could successfully perform tasks like comparing viewpoints in two newspaper editorials.</p>

<p>What the U.S. News rankings do, in effect, is confirm the status of colleges and universities that by virtue of their prestige are valuable to students irrespective of the quality of the education they provide. Students could get a rotten education at Harvard and Yale and they would still be ahead of the game because Ivy League degrees have so much cachet.</p>

<p>But the vast majority of college students—almost 90 percent—don't attend selective colleges and universities. They attend institutions that don't have the status to open doors for their graduates on the basis of name alone. Instead, what matters to these students is the quality of the education that they receive.</p>

<p>Reinforcing the status of the nation's wealthiest, most famous, and most exclusive institutions has been lucrative for U.S. News and other organizations that rank colleges and universities. But they have not deliberately excluded measures that shed light on the quality of college teaching and learning. Rather, they exclude such measures because information that answers questions that would be most helpful to the most students—Where are students taught the best? Where do students learn the most? Where do students have the best chance of earning a degree? Where are students best prepared to succeed in their lives and careers?—simply hasn't been available.
... New research and advances in technology in the last several years have led to a host of new metrics and data sources that together offer an unprecedented opportunity to measure how well colleges and universities are preparing their undergraduate students. The new measures provide information about a range of important factors like teaching quality, student learning, graduation rates, and success after college. Many of them are eye-opening, suggesting that existing rankings badly mislead students and parents about the “best” colleges and universities. Some institutions currently mired in the lower reaches of the U.S. News rankings show outstanding results, while some of the exclusive institutions so prized by striving students don't live up to their reputations for excellence.</p>

<p>The wealth of valuable new information provides the possibility of replacing existing college rankings with a vastly improved ranking system. This report explains what the new measures can show, how those measures can be combined into new college rankings, and why the new rankings would benefit both students and colleges.</p>

<p>The new rankings would give students and their parents far more useful information for choosing colleges. They would create strong incentives for colleges and universities to take steps to improve their undergraduate instruction and reward institutions that have excelled at that task. They would bring two-year institutions more fully into the mainstream conversation about higher education quality. And they would even help address the problem of rising college costs.</p>

<p>In the long run, higher education would greatly benefit from the new rankings. They would give colleges and universities fair terms under which to compete and excel. They would help justify new public investments in higher education. And they would create a more dynamic, efficient market by giving students the ability to pick and choose the institutions that will actually serve them best."</p>

<p>So, where are the rankings?</p>

<p>Yes, I've already decided I can't possibly comment until I see whether they got my personal bouquet of schools "right."</p>

<p>LOL. Ain't that the truth. </p>

<p>When I read the proposed methodology and how it varied from USNEWS, I was thinking that most of us were already doing that - or well, I at least sort of did a very minor version of that. </p>

<p>I would subtract a USNEWS column I didn't think meaningful to me (for whatever reason) and "recalibrate" the scale. In my head, not always on paper. Two similar schools. One ranked much higher. Oh, I see it's grad rate that pulls them down, and I know that they lose 10% of the freshman class to party-hearty 'tudes first year. That doesn't bother me (academically), or at least I can live with it. Well, if I drop that out, it looks like they rank about the same.</p>
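<p>(A minimal sketch of that "drop a column and recalibrate" arithmetic, in Python. The category names, weights, and school scores below are invented placeholders, not the actual USNWR weights or data, and renormalizing the remaining weights is just one reasonable way to do what I do in my head.)</p>

[code]
# Rough sketch of the "drop a column and recalibrate" idea.
# Category weights and school scores are made up for illustration;
# they are NOT the actual USNWR weights or data.

weights = {
    "peer_assessment": 0.25,
    "retention_grad_rate": 0.20,
    "faculty_resources": 0.20,
    "selectivity": 0.15,
    "financial_resources": 0.10,
    "alumni_giving": 0.10,
}

schools = {
    "School A": {"peer_assessment": 90, "retention_grad_rate": 95,
                 "faculty_resources": 85, "selectivity": 88,
                 "financial_resources": 80, "alumni_giving": 60},
    "School B": {"peer_assessment": 88, "retention_grad_rate": 82,
                 "faculty_resources": 86, "selectivity": 87,
                 "financial_resources": 78, "alumni_giving": 65},
}

def weighted_score(scores, weights, drop=()):
    """Weighted average after dropping the named categories and renormalizing."""
    kept = {k: w for k, w in weights.items() if k not in drop}
    total = sum(kept.values())
    return sum(scores[k] * w for k, w in kept.items()) / total

for name, scores in schools.items():
    original = weighted_score(scores, weights)
    recalibrated = weighted_score(scores, weights, drop=("retention_grad_rate",))
    print(f"{name}: original {original:.1f}, grad rate dropped {recalibrated:.1f}")
[/code]

<p>With the grad-rate column dropped and the remaining weights scaled back up to 100%, two schools that looked far apart can come out scoring about the same - which is exactly the effect I was eyeballing.</p>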

<p>As to those categories not just "improperly weighted" but missing from USNEWS, I would use NSSE data from those schools that fully released it (Centre) or try to find some other source. </p>

<p>I applaud their effort. Now, where's the rankings? ;)</p>

<p>
[quote]
I would subtract a USNEWS column I didn't think meaningful to me (for whatever reason) and "recalibrate" the scale.

[/quote]
</p>

<p>Graduation rates are somewhat informative as a distinguishing characteristic between tiers of colleges. Certainly, a difference between 97% and 57% is worth pondering (although it's often just a reflection of student wealth and initial selectivity).</p>

<p>Where USNEWS misses the boat is in trying to use it to make fine distinctions in a narrow tier of schools. If anything, when I look at grad rates in the 97% to 100% range, I start wondering if the school is simply pushing everyone thru to graduation no matter what. I cannot imagine a college that didn't have a few students who, for academic or behavioral issues, would be better served by leaving. The rapid increase in grad rates at elite colleges over the last couple of decades suggests this may be the case.</p>

<p>Two columns that I would add to the calculations are some measure of diversity in the student body and the surveyed binge drinking rates. Like anything else, you have to view these datapoints in context, but they would both provide a useful piece of the puzzle about campus culture -- something that is missing from the USNEWS rankings today.</p>

<p>At first glance, this sounds great, but I have some serious doubts. First, how is it possible to measure outcomes? Where would the ranking organization get information on jobs and salaries of graduates? How would they be able to measure value added? The only reliable measure would be standardized tests like the GREs, but only a small number of students take them. I also don't like the high emphasis on graduation and retention rates. Those measures largely reflect the quality of the incoming students and not the quality of the school. I doubt any of these innovative ideas will be applied to generate new rankings. Who is going to do these rankings and how will they be able to generate income to pay for the endeavor?</p>

<p>
[quote]
Where would the ranking organization get information on jobs and salaries of graduates?

[/quote]
</p>

<p>Not to mention that focusing strictly on the vocational training aspects of college would serve to undermine the core values that have made American higher education so successful. It seems to me that a college where few, if any, graduates go off on unpredictable tracks would be a dull place indeed. We get to see quite a few graduates of one of the highest-ranked universities in the country take off to China or Africa or the far corners of the globe after graduation -- for experiences that would likely not register on some quantitative "outcomes" analysis. But those experiences are, in large part, what make the graduates of that university so interesting.</p>

<p>Seems to me that the two things that really matter are 1) student perception of academic quality and quality of life; and 2) value added. NSSE gets closer to that than anything else that I know of.</p>

<p>All rankings are nonsense, and as jmmom suggests, it's just a question of whose nonsense one most agrees with.</p>

<p>I think that there is some value in the kind of information rankings provide, but it has to be used carefully. The USNWR rankings for super-elite colleges are a bunch of hogwash. The kind of data it collects are useless for distinguishing between Williams and Amherst, or Harvard and Yale, and trying to make such distinctions is a fool's errand anyway. (We actually had to stage an intervention last year to get a close friend to stop pressuring his daughter to choose Williams over Amherst mainly because it was ranked higher in USNWR.) On the other hand, the data does alert people to anomalies: schools with lower retention rates than their peers (which can mean failures, or can mean net transfers out), or schools with lower or higher endowments. That is not a bad thing, and aggregating such information into rankings can help consumers, and help universities, too, by stimulating improvement. The sizzle is in seeing whether Harvard or Princeton won the phony race this year; the steak is in identifying real distinctions much farther down the list.</p>

<p>Some kind of outcomes measurement could also be valuable, but even if you could get complete, meaningful data, a lot of what you would be looking at was a picture of what kind of student a school attracted 20 years ago. Do Harvard grads make more money than Penn State grads? Almost certainly, but the difference will be a lot less if you look at Penn State grads who were accepted at Harvard, or even Penn State grads with Harvardian SAT scores and GPAs. So does Harvard do a better job of educating kids than Penn State? I think so, but I doubt you could prove it with outcomes data.</p>

<p>And then there's what I might call the Penn-Brown issue. I would confidently predict that Penn graduates do much better in earnings than Brown graduates, on average, and I would also confidently predict that the differences (if any) between the two schools would be a lot smaller if one disregarded the third of Penn students at the Wharton School. Of course, if your goal is to be a real estate mogul, you may well want to pay attention to the fact that Penn has a great undergraduate business program, but the glossy earnings of Wharton grads shouldn't mean much to a prospective linguistics major deciding between the two schools.</p>

<p>The funding for this project came from Lumina Foundation in Indianapolis.</p>

<p>
[quote]
which can mean failures, or can mean net transfers out

[/quote]
</p>

<p>Not net transfers. Transfers out count against the grad rate. Transfers in do not count for it.</p>

<p>Really? Never focused on that . . . In other words, if a school starts with a class of 100, and 10 kids transfer out and 10 transfer in, and the whole class graduates, that's a 90% graduation rate? Seems a little off to me.</p>

<p>
[quote]
In other words, if a school starts with a class of 100, and 10 kids transfer out and 10 transfer in, and the whole class graduates, that's a 90% graduation rate?

[/quote]
</p>

<p>Correct. You take the students entering as freshmen and count the number of those specific students who graduate.</p>

<p>There are limited adjustments for students who die, enter the armed services, etc.</p>

<p>Here are the questions from the Common Data Set:</p>

<p>B4. Initial 1999 cohort of first-time, full-time bachelor’s (or equivalent) degree-seeking undergraduate students; total all students:</p>

<p>B5. Of the initial 1999 cohort, how many did not persist and did not graduate for the following reasons: deceased, permanently disabled, armed forces, foreign aid service of the federal government, or official church missions; total allowable exclusions: </p>

<p>B6. Final 1999 cohort, after adjusting for allowable exclusions: (Subtract question B5 from question B4)</p>

<p>B7. Of the initial 1999 cohort, how many completed the program in four years or less (by August 31, 2003): </p>

<p>B8. Of the initial 1999 cohort, how many completed the program in more than four years but in five years or less (after August 31, 2003 and by August 31,2004): </p>

<p>B9. Of the initial 1999 cohort, how many completed the program in more than five years but in six years or less (after August 31, 2004 and by August 31, 2005): </p>

<p>B10. Total graduating within six years (sum of questions B7, B8, and B9):</p>

<p>B11. Six-year graduation rate for 1999 cohort (question B10 divided by question B6): _____%</p>
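<p>Putting those Common Data Set items together, here is a minimal sketch (in Python) of why the hypothetical scenario above - a cohort of 100, 10 transfers out, 10 transfers in, everyone else graduating - works out to a 90% six-year graduation rate. The numbers are just the ones from the example, not any real school's data.</p>

[code]
# Minimal sketch of the Common Data Set six-year graduation rate (items B4-B11).
# Numbers are the hypothetical example from the thread: a cohort of 100,
# 10 students transfer out, 10 transfer in, and everyone remaining graduates.

b4_initial_cohort = 100       # first-time, full-time freshmen in the fall 1999 cohort
b5_allowable_exclusions = 0   # deceased, armed forces, foreign aid service, church missions
transfers_out = 10            # leave and finish elsewhere: still counted as non-completers
transfers_in = 10             # arrive later: never part of the cohort, so they don't count at all

b6_final_cohort = b4_initial_cohort - b5_allowable_exclusions
b10_graduated_within_6yrs = b6_final_cohort - transfers_out   # B7 + B8 + B9 combined

b11_grad_rate = b10_graduated_within_6yrs / b6_final_cohort
print(f"Six-year graduation rate: {b11_grad_rate:.0%}")      # -> 90%
[/code]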

<p>Education Sector:
Core Operating Support
The Bill and Melinda Gates Foundation
The William and Flora Hewlett Foundation
The Ewing Marion Kauffman Foundation</p>

<p>Project-Specific Support
The Broad Foundation
The Carnegie Corporation of New York
The Annie E. Casey Foundation
The Joyce Foundation
Lumina Foundation for Education
Pisces Foundation</p>

<p>So, is this ranking project solely funded by the Lumina Foundation and is there any connection with The Washington Monthly? Seems there might be from the website...</p>

<p>What I always wanted to see in the online version of USNews rankings was the ability for me, the user, to apply my own weighting of the categories and have the application re-sort the rankings based upon my preferred weightings.</p>

<p>Some people may think high SATs are important and weight those higher. Some peer assessment, others alumni giving. But most people probably would not apply the same weighting as USNews. USNews can still publish their magazine but we get a better tool for the sorting.</p>

<p>Then, over time, they can add more categories and data for all of us "consumers" to use in our sorting and discovering.</p>
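<p>For what it's worth, the re-sorting tool described above would be straightforward to build once the underlying category scores were available. Here is a rough sketch, assuming made-up school names and already-normalized scores rather than anything USNews actually publishes:</p>

[code]
# Sketch of a "pick your own weights" re-ranking tool.
# School names and category scores are invented placeholders; a real tool
# would use the published category scores behind the magazine's rankings.

from typing import Dict

schools: Dict[str, Dict[str, float]] = {
    "College X": {"sat": 92, "peer_assessment": 80, "alumni_giving": 55, "grad_rate": 90},
    "College Y": {"sat": 85, "peer_assessment": 90, "alumni_giving": 70, "grad_rate": 88},
    "College Z": {"sat": 88, "peer_assessment": 75, "alumni_giving": 40, "grad_rate": 93},
}

def rerank(schools, user_weights):
    """Return schools sorted by a weighted score using the user's own weights."""
    total = sum(user_weights.values())
    scored = {
        name: sum(scores.get(cat, 0) * w for cat, w in user_weights.items()) / total
        for name, scores in schools.items()
    }
    return sorted(scored.items(), key=lambda item: item[1], reverse=True)

# A user who cares mostly about SATs and graduation rate, not alumni giving:
for name, score in rerank(schools, {"sat": 0.5, "grad_rate": 0.4, "peer_assessment": 0.1}):
    print(f"{name}: {score:.1f}")
[/code]

<p>The sort itself is trivial; the real work is deciding how each raw category gets normalized before the weights are applied, which is exactly where any ranking's judgment calls hide.</p>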

<p>Eagle79, I was thinking the same thing.;)</p>

<p>The USNWR rankings have always bothered me. They're sort of arbitrary. I've never taken the time to remove a column and re-calibrate (and the fact that some of you have done that makes me feel better! My son thinks I'm OBSESSED with this college thing, but some of you are even more into it than I am, thank heavens). </p>

<p>With USNWR, the colleges have started playing games to make their numbers look better. When they included "% of alumni donating", suddenly my alma mater started emphasizing "every gift counts, no matter how small!" There are some who think that schools that make the SATs optional are just trying to raise their average SAT score, since the kids with low SATs won't submit them. On the other hand, I've heard from kids who would NEVER apply to a school that doesn't require SATs, since it's "obviously not very selective."</p>

<p>Rankings are only valid to me in how they relate to MY kid. I know it's more important to find the school that fits my child best, rather than send my kid to the "best" school. I have a friend who transferred OUT of Harvard because she was miserable there! And yet at the same time, as much as I try to remember to find the best fit, there's a snobby part of me that wants him to go to a "top" school.</p>

<p>That said, I'd love to see someone come up with a recognized alternative to USNWR.</p>

<p>Not only are they arbitrary, but USNWR will intentionally give schools that snub them really low ranks as a way of "punishing" them, a la Reed a couple of years ago. And I also happen to know somebody who left Harvard because she hated it there. Funny.</p>

<p>I think that ultimately rankings by their very nature will be counterproductive to people (students and parents both) making informed decisions about which college to attend. Whatever rankings systems are devised, they will inevitably bog down into a ****ing contest. Like somebody noted, the top schools are going to be good whether or not there is a ranking system to "prove" it to people. But in the long run, people are going to use any ranking system as a crutch to justify picking one college over another, when what they really should be doing is visiting the schools, talking to students, sitting in on classes, etc. Real research like that is the only way to really get to know the character of a school and how good a fit it will be for you.</p>