USNews Peer Assessment

<p>In terms of theories about why graduation rates are rising, I'd like to throw in the stupendous rise in tuition costs and increasing parental pressures to finish in four years.</p>

<p>But the question in terms of rankings is: why does it matter that a college misses or exceeds some arbitrary expectation set up by USNWR?
The information I would want is whether students do not graduate in six years because classes are so overcrowded that they can't get into the ones they need.</p>

<p>The peer assessment ranking is the reptilian brain of the USNews poll; it hasn't changed in 20 years and, in fact, was the original germ of the poll in the first place. The fact that the peer assessment numbers have been so static (and so infinitesimally close) over the years is yet another reminder of why the poll has become so complicated: without the bells and whistles that get tweaked every year, the editors would have less control over the results, the rankings would never change, and the magazine would sell fewer copies.</p>

<p>I wonder why we're spending so much time trying to make sense of the hocus-pocus that the USNews rankings represent. You've identified some of the key "fudge factors" that they've built into their unscientific formulas in order to engineer the rankings to their liking.</p>

<p>There are some other instances where they take reported data at face value when they have no reason to do so. One is the reported SATs for schools that don't require them. Another is the reported student/faculty ratios, which are sometimes absurdly low.</p>

<p>The ignoring of "outcome" data (other than graduation rate -- accompanied by its own fudge factor) remains a huge problem. The very fact that this is run as an "annual" horserace rather than looking at longer-term stats and outcomes is also totally out of sync with what a college, a faculty, and a program represents.</p>

<p>The reason is that we are smart folks who want to be able to shriek about this methodology and to defend our opinions to the legion of the uninformed who quote this ranking to us: our family, friends, school counselors, kids. We want to be able to understand and know what we are dealing with. Knowledge is power.</p>


<p>Yeah, the fact that they rank the schools so often makes me question any big jumps or drops. When some school drops 6 places, as UC Davis did this year, how did that happen? Short of having half the campus burn down or the school losing a large chunk of its endowment in a Ponzi scheme, how much can things change in a single year?</p>

<p>My ideal data-driven college guide system would have no overall rank whatsoever. Instead it would expand the good part of the USNEWS On-line edition approach -- sortable columns for individual data points such as median SATs, graduation rate, etc.</p>

<p>I would significantly add to the number of sortable columns to include relevant data points like:</p>

<p>Per student endowment</p>

<p>Number of undergrads</p>

<p>% of classes in each of the 7 sizes on the Common Data Set</p>

<p>% of white/US students</p>

<p>% of students in each non-white category</p>

<p>% Pell Grant</p>

<p>% qualifying for financial aid</p>

<p>% of grads getting PhDs</p>

<p>% of grads in law, biz, and MD prof schools</p>

<p>Binge drinking rates</p>

<p>% frats</p>

<p>% varsity athletes</p>

<p>% public versus private high school</p>

<p>Provide the data in as many areas as possible (inc. the name-brand-recognition survey currently known as Peer Assessment), let people see where colleges lie, and let them decide how to prioritize the various data points.</p>

<p>All colleges will be high on some of the sortable lists, low on others. It forces the consumer to confront the real choice -- what "style" of college am I looking for? These lists would cut through a LOT of the glossy viewbook BS and cut down on the extreme lack of self-selection that occurs when students pick their colleges from some arbitrary overall ranking.</p>
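<p>The sortable-columns idea above can be sketched in a few lines. Note that the college names and numbers below are invented purely for illustration, not real data:</p>

```python
# A minimal sketch of the "sortable columns" idea: every school appears in
# every list, and clicking a column header re-sorts by that one data point.
# All names and figures here are hypothetical placeholders.
colleges = [
    {"name": "College A", "median_sat": 1380, "grad_rate": 0.92, "pct_frats": 0.30},
    {"name": "College B", "median_sat": 1450, "grad_rate": 0.88, "pct_frats": 0.05},
    {"name": "College C", "median_sat": 1300, "grad_rate": 0.95, "pct_frats": 0.15},
]

def sort_by(column, descending=True):
    """Order colleges by a single data point, like clicking a column header."""
    return sorted(colleges, key=lambda c: c[column], reverse=descending)

for c in sort_by("grad_rate"):
    print(c["name"], c["grad_rate"])
```

<p>Each choice of column produces a different ordering, which is the point: no single "overall" list exists, so the consumer has to decide which data points matter.</p>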

<p>For example, it is just ridiculous that nearly half of Harvard's applications come from students with SATs below 1400 and that more than 85% of Harvard's apps come from students who weren't even ranked #1 in their high school class. This is de facto proof that large numbers of students aren't even bothering to research their college choices. The overall ranking systems are a huge culprit in this.</p>

<p>I like interesteddad's idea for additional sortable columns. I would add the f/m admissions rates.</p>

<p>Interesteddad wrote:
[quote]
Binge drinking rates<br>
% frats<br>
% varsity athletes<br>
% public versus private high school
[/quote]
</p>

<p>Wow. Just wow.</p>

<p>
[quote]
I would add the f/m admissions rates.
[/quote]
</p>

<p>Definitely. Actually, grad rates broken down by gender and race are very interesting, too.</p>

<p>Wow.</p>

<p>The majority of the categories I mentioned are already reported on the Common Data Set, including the % frats and % varsity athletes. The rest of the data is available to the schools, although they don't always want it publicly known.</p>

<p>Many of the categories I listed would illuminate clear differences between schools that may appear superficially similar on some overall ranking list.</p>

<p>I would include a faculty/student ratio using only tenured faculty who teach at least two undergraduate courses per semester. Might even give a few extra points for faculty who publish no more than once a year.</p>

<p>
[quote]
I would include a faculty/student ratio using only tenured faculty who teach at least two undergraduate courses per semester.

[/quote]
While I would look carefully at the reported faculty-student ratios, I think this approach would be too extreme. There are faculty at some major universities who do not teach two courses of any kind per semester; their teaching loads are lower than that. I also wouldn't penalize a school for having highly research-productive faculty. But I agree with the sentiment that it's very misleading to use faculty-student ratios if a fair number of faculty mainly or (in some cases) exclusively teach graduate students.</p>

<p>As much as I dislike the "rankings", I need to recognize that US News provides a wealth of information. The online reports are a bargain at $14.95. The searchable columns that I-Dad would like to see expanded are indeed remarkable and effective tools. I'd also like to mention that some of the information discussed by I-Dad is actually available to students who spend the time to select schools and arrange the data with the comparative tools. In theory, someone could run the selection (limited to 6) and have a superb comparison spreadsheet.</p>

<p>In fact, I probably should stop decrying the US News reports. The success of the magazine at selling millions of copies of the printed version is what finances the tremendous tools offered online. I wonder what the ratio of online members to print purchasers is, but I'd guess it may be in the range of the HYPS admit rate. :) So, in the end, they have a product that sells because of a small dose of sensationalism and caters to America's passion for rankings. The US News rankings may not be much better than People's "Sexiest Man Alive," but the fortunate result is that we have gained access to information that would not be as easily available without the rankings. In this regard, I believe that schools that insist on limiting access to their admission data will earn scorn and a healthy dose of scrutiny.</p>

<p>In fact, US News SHOULD add a category ranking the transparency of the schools.</p>

<p>The data and sorting at the US News site are extremely valuable -- well worth the $14.95. Back when my son was applying to colleges - and the sortable table was available for free - that is ALL we looked at. His top-choice colleges were nowhere near the top of the "rankings" - but they kept coming up at the top of the sort criteria that were important to him.</p>

<p>US News has a "compare" feature that allows a printout of comparable data for up to 6 colleges. D. just asked me to print out the info for the dozen or so colleges now on her short list. Certainly saves on communication time between mom & d. - I brought her the printout & explained what to look for in the financial aid data, and now I know that d. and I are on the same page about what to expect from colleges. One of her safeties fares surprisingly well on the data comparison charts in terms of financial aid, class size, student/prof ratio, etc. -- and also happens to have programs ranging from adequate to strong in her areas of interest. So I think that as long as you are knowledgeable enough to TOTALLY IGNORE the "rank" AND the "peer assessment" score -- and focus mostly on the hard data - it is a wonderful tool.</p>

<p>The fact is that "peer assessment" is just the cheat mechanism that US News needs to make the "rankings" come out as expected -- if they relied on hard, statistical data alone, Harvard & Princeton wouldn't come out on top. The more they included data related to actual experience on campus, the more the deck would shuffle with unexpected results. So it's a big fudge factor -- adding in subjective makeweights to keep the objective data from dictating results you don't like.</p>

<p>That's not to say that one school isn't "better" than another - no one doubts that Princeton is a better college overall than Drexel -- but the problem is that we don't need US News to tell us that. We know where the colleges fall generally along the spectrum - it doesn't matter whether Harvard or Princeton or Yale is the 'best' -- they are THE SAME in terms of overall quality no matter how you cut it. Just as there is no qualitative difference between UC Berkeley & U of Michigan. But the bottom line is whether a particular college is a good fit for a particular student -- and for that the data is important. </p>

<p>I've found another site that is a great resource for data and "rankings" - but very subjective, almost entirely from student surveys. It's called <a href="http://campusdirt.com">http://campusdirt.com</a>. Their salary predictor along with ROI (return on investment) figures are a real eye-opener. It actually gives very specific college "report cards" and ranking figures. I'd take it somewhat with a grain of salt ... but I do see a value in throwing in factors like availability of parking and quality of dorm food, along with quality of curriculum, in evaluating a college. After all, the kid has to live there. And personally, I think the opinions of students (they have surveyed 80,000) may mean a little more to my kid than "peer assessment." What does my kid care what the president of Swarthmore thinks of her college? Isn't it more important what a recent grad thinks?</p>

<p>
[quote]
Might even give a few extra points for faculty who publish no more than once a year.

[/quote]
</p>

<p>It's an interesting idea, but I think you'd find a lot of people arguing against this. It's not as if there is a proven correlation between publishing more than once a year and poor teaching. Infrequent publishing is not necessarily a sign of great teaching or dedication to students.</p>

<p>What I would like to see is for a student/subscriber to be able to apply their own percentages to each of the criteria and have the system spit out a personalized list. It should not be that hard for USNews to add this feature because they already have the data.</p>
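<p>The personalized-weighting idea is mechanically simple: an overall score is just a weighted average of per-criterion scores, so changing the weights changes the ranking. A minimal sketch, using made-up school names, criteria, and normalized 0-1 scores (none of these figures are real):</p>

```python
# Hypothetical per-criterion scores, normalized to 0-1. These values are
# invented for illustration only.
scores = {
    "College A": {"peer_assessment": 0.95, "grad_rate": 0.90, "class_size": 0.60},
    "College B": {"peer_assessment": 0.70, "grad_rate": 0.95, "class_size": 0.90},
    "College C": {"peer_assessment": 0.80, "grad_rate": 0.85, "class_size": 0.80},
}

def personalized_ranking(weights):
    """Rank schools by the weighted average of their criterion scores."""
    total = sum(weights.values())
    def overall(school):
        return sum(weights[k] * scores[school][k] for k in weights) / total
    return sorted(scores, key=overall, reverse=True)

# A subscriber who weights peer assessment heavily (like the printed poll)...
print(personalized_ranking({"peer_assessment": 0.60, "grad_rate": 0.25, "class_size": 0.15}))
# ...versus one who cares mostly about class size and graduation rate.
print(personalized_ranking({"peer_assessment": 0.05, "grad_rate": 0.45, "class_size": 0.50}))
```

<p>With these sample numbers the two weightings produce different orderings, which is exactly the effect described below when peer assessment drops to 5% of the formula.</p>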

<p>There was a study done by Carnegie Communications that actually polled students on what percentages they would apply to each of the categories. The rankings changed a good amount for some of the schools, primarily because peer assessment became only 5% of the equation.</p>

<p>Here is a link to the free download PDF of their study:</p>

<p><a href="http://www.carnegiecomm.com/resources/pcform.html">http://www.carnegiecomm.com/resources/pcform.html</a></p>

<p>Another ranking:

[quote]
The first 10 schools on the Washington Monthly's list of 200 national universities are MIT, UCLA, UC-Berkeley, Cornell, Stanford, Penn State, Texas A&M, UC-San Diego, Pennsylvania and Michigan in that order.</p>

<p>On a separate Washington Monthly list of 200 liberal arts colleges, the first 10 schools are Wellesley, Wesleyan (Connecticut), Bryn Mawr, Harvey Mudd, Fisk, Amherst, Haverford, Wofford, Colby and Spelman.</p>

<p>Their list uses the percentage of students in Army or Navy Reserve Officer Training Corps (ROTC), the percentage of graduates in the Peace Corps, the percentage of federal work-study grants used for community service projects, the total amount of research spending, the number of doctorates granted in the hard sciences and, as a measure of social mobility, the percentage of students on Pell Grants, with a bonus for schools whose graduation rates are higher than expected for having so many low-income students.</p>

<p>"What really did in Princeton were mediocre scores on national service and social mobility, categories in which it should have excelled," the article says of the new list's results. It says "Harvard has the lowest percentage of Pell Grant recipients in its student body of any school in the country."</p>

<p>

[/quote]

<a href="http://www.washingtonpost.com/wp-dyn/content/article/2005/08/21/AR2005082101009.html">http://www.washingtonpost.com/wp-dyn/content/article/2005/08/21/AR2005082101009.html</a></p>

<p>This is an interesting ranking and should be viewed alongside all the others as another piece of the whole picture. But, with respect to the comment:</p>

<p>"What really did in Princeton were mediocre scores on national service and social mobility, categories in which it should have excelled," the article says of the new list's results. It says "Harvard has the lowest percentage of Pell Grant recipients in its student body of any school in the country."</p>

<p>I wonder if the explanation might be, in part, that the Princetons and Harvards are in a position to be so much more generous than most schools with outright grants.</p>

<p>
[quote]
I wonder if explanation might be, in part, that the Princetons and Harvards are in a position to be so much more generous than most schools with outright grants.

[/quote]
</p>

<p>Possible, but unlikely. The institutions would be foolish to turn down federal money to replace it with their own institutional dollars. In Harvard and Princeton's case, they could afford to, but why would they? The Pell Grant is the U.S. government's way of helping make college possible for the poorest families, and the dollars follow the student wherever they choose to go. Also, given that "% of Pell grant recipients" is used as a measure of economic diversity by outsiders, the colleges would be hurting themselves if they deflated those percentages by having their poor students turn them down.</p>

<p>The Pell Grant is so low right now; yes, it is a "marker," but the amount awarded is so much less than the private grants from the college that it just won't work without the endowment funds. Pell has fallen behind in its ability to help low-income students participate. For high-achieving low-income students who can get private grants, all is well, but for the rest it is not so easy.</p>