Are public universities hurt or helped by USNWR methodology?

<p>USNWR Rankings are laughable.</p>

<p>A higher graduation rate determines how good a college is? Is that a joke? That might as well be a measure of how much grade inflation is going on.</p>

<p>Only a couple things are needed.</p>

<p>Peer Assessment (25%)
Graduating Student Assessment (25%) - Their opinion of the school.
Initial Student Quality (25%)
Cost (25%) - Absolutely massive factor that no one looks at. There is zero price competition right now at top ranked schools.</p>

<p>There are actually several other major college ratings, and there are differences amongst them. Some schools do better and some do worse. My university actually does the worst in USNWR. In all the other ratings, it tends to break into the top 100, but in USNWR, it is a third-tier school. My school does poorly in USNWR because of its lack of grade inflation, to some extent student opinion, and low alumni giving.</p>

<p>Regarding post #41</p>

<p>"A higher graduation rate...might as well be a term for how much grade inflation is going on"</p>

<p>While that might be true at some top-ranked universities where over half the kids are in the honors programs, that is definitely not the case at slightly lower-ranking schools, where graduation rates are a good indicator of the programs the school has in place to help kids who are having problems and who might be potential dropouts as a result.</p>

<p>For example, at my son's university (ranked in the 50-100 national university range), the graduation rate is higher than the norm primarily because of the programs in place: writing workshops, tutoring centers, requirements that include writing-intensive courses at the very beginning of the college experience, freshman induction programs (to help the student adapt to the new environment), and smaller-than-the-norm freshman English composition classes (average freshman class size is 50 students on an overall basis, but all freshman English classes are limited to 21 students). There is also a counseling program that allows walk-in immediate attention, on-site medical service 24/7 with a major hospital within close access, and separate counseling on an anonymous basis for alcohol/drug concerns, financial issues, or problems with roommates--as well as many other individualized-attention programs, safety kiosks across campus for summoning immediate police response (we all know how important that can be now, don't we), etc.</p>

<p>I think that a graduation rate can be an indicator of these types of programs--programs which are a positive for many students (and their parents), when considering possible schools to attend.</p>

<p>You don't expect your child to have problems--but you want to know that the school will have programs available the child can turn to in the event such problems arise.</p>

<p>bsb2007,
Smart kids and not so smart kids don't make decisions based on the USNWR rankings. Frankly, I've yet to meet the student or family who has. But they do influence how schools are seen from various perspectives (students, families, faculty, alumni, employers, the public at large), so I don't think it is appropriate to just ignore them. I try to understand the methodology used and how it may help or hurt a college or a group of colleges. As I posted earlier, I think that publics are net beneficiaries of how the rankings are presently being performed.</p>

<p>calcruzer,
Great posts. USNWR does perform a good service of providing a ton of data in one place, and it is generally easy to retrieve and sort according to the variables that might be important to you. It can be helpful in figuring out what schools are potential Reach/Match/Safety schools. But as we both know, USNWR is a first step in the process. Once a dozen or two schools have been put on an initial list, then the real evaluation begins and the issues of fit predominate.</p>

<p>Mr.Payne,
Loved your suggestions on cost and student input. Hated your comment on Peer Assessment. I think that PA is the single most corrupt number in the entire rankings. </p>

<p>Re alumni giving, I agree that this hurts publics generally but some state universities still do a pretty good job with this (Georgia Tech is 24th, U Virginia and W&M are 33rd, North Carolina is 45th). Still, even if it were removed from the rankings, I think that some overrate its impact as it is only 5% of the rankings. It would take a lot more than the removal of this to move the needle and push publics up 10-15 spots in the rankings. I don't think that eliminating it would even move them 2-3 places.</p>

<p>
[quote]
Mr.Payne,
Loved your suggestions on cost and student input. Hated your comment on Peer Assessment. I think that PA is the single most corrupt number in the entire rankings.

[/quote]
When I say peer assessment, I think this number needs to be tailored to a department or specific college within the university. I'm not sure this can be done accurately. If you don't measure peer assessment - how do you measure the quality of the graduates? No other method seems very good at that (other than requiring graduates to take a standardized test upon graduating).</p>

<p>
[quote]

When I say peer assessment, I think this number needs to be tailored to a department or specific college within the university. I'm not sure this can be done accurately.

[/quote]

We do have an example of what that might look like. USNWR uses peer assessment in its law school rankings, which focus on a specific college in the university. The PA is tailored to law schools and in addition there is an assessment from legal professionals. Still, not everyone is pleased and there are some that feel the methodology does not accurately reflect the strengths of their school.</p>

<p>IMO, the combination of Financial Resources (10%) and Alumni Giving (5%) should be worth less than half of the combination of Standardized Test Scores (7.5%), Top 10% (6%) and Acceptance Rate (1.5%). Right now both bundles come to 15%.</p>

<p>I would reallocate as follows (see the sketch after this list for how the weights would combine):
Peer Assessment (20%)
Standardized Test Scores (12%)
Top 10% (10%)
Acceptance Rate (5%)
Financial Resources (3%)
Alumni Giving (0%)
Others - not sure (partly because of lack of understanding)</p>
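<p>To make that concrete, here's a rough sketch of how those reallocated weights would combine into a composite score. This is not actual USNWR math: the weights follow the list above, but the assumption that each factor is pre-normalized to a 0-100 scale and the sample numbers are made up for illustration.</p>

<p>
[code]
# Illustrative only: weights follow the reallocation above; the component
# scores are hypothetical, each assumed to be pre-normalized to 0-100.
PROPOSED_WEIGHTS = {
    "peer_assessment": 0.20,
    "test_scores": 0.12,
    "top_10_percent": 0.10,
    "acceptance_rate": 0.05,
    "financial_resources": 0.03,
    "alumni_giving": 0.00,
    # the remaining 50% would go to the "others" left unspecified above
}

def composite_score(components, weights):
    """Weighted sum of normalized (0-100) component scores."""
    return sum(weights.get(name, 0.0) * score for name, score in components.items())

# Hypothetical school: strong peer assessment, weak alumni giving.
example = {
    "peer_assessment": 90,
    "test_scores": 80,
    "top_10_percent": 75,
    "acceptance_rate": 60,
    "financial_resources": 50,
    "alumni_giving": 10,
}

# roughly 39.6 out of the 50 points covered by the factors listed here
print(round(composite_score(example, PROPOSED_WEIGHTS), 1))
[/code]
</p>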

<p>IMO, this whole discussion of public vs private is going to change over the next few years as tuition fees become more and more beyond the reach of most middle income families. We're probably going to see state universities become increasingly competitive as more and more bright students opt to attend them. At this time, the USNWR rankings are interesting to read, but should not be the final factor in college selection. IMO, the rankings should be considered with some skepticism....</p>

<p>I have some thoughts about PA but here's the easy things first:</p>

<p>Alumni Giving Rate: "The percent of alumni giving serves as a proxy for how satisfied students are with the school." I have no idea why they ever decided to use "Alumni Giving Rate" as a measure of student satisfaction rather than some direct measure of satisfaction. That's a terrible proxy for satisfaction. That's a measure of charity and wealth, not satisfaction.</p>

<p>Faculty Resources: Have you guys ever looked at what's behind this rank? It's silly: the biggest chunk of it, a whole 35%, is based on faculty salary. The name "faculty resources" misleads casual readers into thinking that it relates to the resources available to faculty for academic purposes. But no, this is just salary.</p>

<p>You could argue that higher salaries go to more talented people, but I don't buy that in the academic world. Especially when you consider declining marginal product... the extra salary boost that some schools might give is trivial. I'd rather compare Nobel Prizes and real output rather than input. Analogy: Just because the Yankees have a massive payroll doesn't mean that they're outperforming other teams by a proportional amount. In fact they've been mediocre. This is a huge bias against Publics, as we all know that top privates are always able to offer more money. Here's some average annual salary data from the American Association of University Professors:</p>

<p>(School, Professors, Associate Profs, Assistant Profs; figures in thousands of dollars)</p>

<p>Harvard, 177.4, 100.0, 91.3
Princeton, 163.7, 105.0, 79.1
Yale, 157.6, 87.1, 77.9
Stanford, 164.3, 114.7, 91.0
Berkeley, 131.3, 86.8, 76.2
UVa, 128.0, 87.7, 71.1</p>

<p>Do you guys really think that these salary variations should determine 7% of a university's ranking?</p>

<p>The Alumni Giving Rate and the Faculty Compensation components make up 12% of the total score (5% for alumni giving plus the 7% salary piece of Faculty Resources). That's absurd, and it's definitely enough to keep publics down at least 5 spots, especially when you realize that these two components are ones where publics are lagging the most.</p>

<p>Freshman retention rate and alumni giving rate are indirect measures of satisfaction. One group is voting with their feet, the other is voting with their pocketbook. I'm not sure how good either measure is. They seem to be included because they are easy numbers for universities to provide and say "something" about the institution. What I'd like to know is why are freshmen leaving and why are alumni giving?</p>

<p>When I graduated prep school many years ago the headmaster gave us a little speech saying how important it was to contribute a little something to the annual fund, even if it was only $10. He explained that the alumni giving rate was an important measure for outside groups evaluating the school. This was a few years before USNWR started their rankings. I suspect the historical use of alumni giving rate was adopted by USNWR for similar reasons. I don't know how relevant it really is since an institution has 50 year's worth of alumni whose view of the school varies greatly. Some alumni are quite engaged and others just send in their check every year. As for recent graduates (newer alumni) the giving rate is rather low. This is the group that has the most recent experience with the university, but with loans to pay off combined with a starting salary, or loans still accumulating while in grad school, there are not many "for" votes coming in.</p>

<p>ucbhi,
Can you or anyone else provide the information behind the components of the Faculty Resources and Financial Resources of the USNWR survey? </p>

<p>Re alumni giving, it probably is a decent proxy for graduate satisfaction from LACs and many private schools. However, as applied to state schools, and particularly those with large student bodies like UC Berkeley and U Michigan, this is a penalizing number despite the fact that graduates of those and many other state schools feel passionately and positively about their school. The question is, if the measure (alumni giving) has some validity for a large number of private schools, and publics wish to be compared to privates, should all schools be subjected to the same measurements? </p>

<p>I suspect that if USNWR ever breaks the publics out into a separate ranking, the alumni giving measure would be eliminated. Still, I think the animus directed toward alumni giving is probably excessive as AG is the only USNWR variable that hurts publics and its 5% weighting is not that injurious. By contrast, many privates (and particularly those outside the Education Establishment in the South and Midwest) are hurt by more heavily weighted and/or poorly designed considerations (Peer Assessment, Top 10% students, 6-year vs 4-year graduation rates).</p>

<p>
[quote]
Smart kids and not so smart kids don't make decisions based on the USNWR rankings. Frankly, I've yet to meet the student or family who has.

[/quote]
</p>

<p>Have you a sense of how representative the families you've met are? I believe there is increasing concern over the weight of rankings in student decision-making--based not just on anecdotal data, but on actual student responses to surveys like CIRP. </p>

<p>
[quote]

Can you or anyone else provide the information behind the components of the Faculty Resources and Financial Resources of the USNWR survey?

[/quote]
</p>

<p>It's on page 79 of the 2007 issue.</p>

<p>Faculty resources counts for 20% overall. Making up that metric are: </p>

<p>(1) Class size, as measured by the proportion of classes with fewer than 20 students (30%) and the proportion with 50 or more (10% of the score; obviously figured negatively)</p>

<p>(2) Faculty Salaries (average pay plus benefits, adjusted for regional COL differences) (35%)</p>

<p>(3) Proportion of professors with highest degree in field (15%)</p>

<p>(4) Student-faculty ratio (5%)</p>

<p>(5) Proportion of faculty who are full time (5%)</p>

<p>Financial resources (10% overall) is less detailed in description; US News says that it measures the average spending for the two prior fiscal years on instruction, research, student services, and related educational expenditures.</p>
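<p>For what it's worth, here's a rough sketch of how the Faculty Resources subscore (20% of the total) could be assembled from those subweights. Normalizing each raw input to a 0-100 scale is my own assumption for illustration; US News doesn't publish its exact scaling, and the sample numbers below are made up.</p>

<p>
[code]
# Subweights as listed above; the 0-100 normalization and example inputs are assumptions.
FACULTY_RESOURCE_WEIGHTS = {
    "classes_under_20": 0.30,      # share of classes with fewer than 20 students
    "classes_50_plus": 0.10,       # share with 50 or more (counted negatively)
    "faculty_salary": 0.35,        # average pay plus benefits, COL-adjusted
    "terminal_degree": 0.15,       # professors with the highest degree in their field
    "student_faculty_ratio": 0.05,
    "full_time_faculty": 0.05,
}

def faculty_resources_subscore(normalized):
    """Weighted combination of 0-100 normalized inputs; the large-class share counts against the score."""
    total = 0.0
    for name, weight in FACULTY_RESOURCE_WEIGHTS.items():
        value = normalized[name]
        if name == "classes_50_plus":   # "obviously figured negatively"
            value = 100 - value
        total += weight * value
    return total

# Hypothetical public flagship: bigger classes and lower salaries than a top private.
public_example = {
    "classes_under_20": 45, "classes_50_plus": 20, "faculty_salary": 60,
    "terminal_degree": 95, "student_faculty_ratio": 55, "full_time_faculty": 90,
}
print(round(faculty_resources_subscore(public_example), 1))
[/code]
</p>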

<p>
[quote]
AG is the only USNWR variable that hurts publics

[/quote]
</p>

<p>Has that been established?</p>

<p>hoedown,
Thanks for the info. </p>

<p>Re AG as the only variable that hurts publics, that is my opinion. But the point of the thread is to unearth such thoughts from a variety of posters. If you or others see other areas of the methodology that you believe favor or hurt publics, then please raise them. Increasing understanding is the objective.</p>

<p>As for families and their views on rankings, my perception is that the academic community hates the rankings and wants to demonize them generally, if not specifically. However, I think that media portraits (and academic viewpoints) underrate the intelligence of the consumers. I also think that the media and the academics represent the status quo and they are scared to death of any widely used rankings that might upset that.</p>

<p>Actually, alumni giving was added to the rankings criteria, which bolstered the rankings of the private schools. While it may be a tangential proxy for "graduate satisfaction," it is a better proxy for the wealth of the student body who matriculate as frosh. </p>

<p>Cal and all the UCs purposely accept ~33% low-income kids (Pell Grantees), so expecting their families to donate money is ludicrous, regardless of their passion for the school. (In contrast, UVa only recently found out that it has fewer than 10% Pell Grantees.)</p>

<p>Not only do the Pell Grantees hurt the alumni giving portion for the UCs, but they also negatively impact the graduation rates. The UCs are not particularly generous with finaid, so Pell Grantees need to work part-time during school. As a result, they matriculate under the assumption of a five-year plan, if not longer. OTOH, the UCs are extremely generous with AP credit, so any financially able student who wants to graduate in 4 years can easily do so, with the possible exceptions of engineers who want to take an elective or two, and architecture students. Also, unlike privates where finaid only lasts four years (better earn a degree and get out), the UCs will continue finaid for more than four....</p>

<p>btw: less than 50% of Calif high schools rank their students, so the UC Top 10% number is "estimated." I once asked a UC admissions counselor about it and she just smiled.</p>

<p>I cannot presume to speak for all academics, but it is certainly my perception from discussions on academic forums and at conferences that rankings are not well-liked, especially rankings from "outside" for-profit organizations like US News. To that extent I agree with you. I am less confident attributing emotional motivations such as fear, desire for the status quo, etc., to that reaction.</p>

<p>As for students and parents, unfortunately I don't see the same evidence you do that they aren't used. I don't feel this is a matter of underrating intelligence. Nationally, over 35% of freshmen enrolling at highly selective private institutions say that rankings in national magazines were "very important" in their choice. That number drops to a quarter for highly selective public universities, but I still think this is meaningful.</p>

<p>hoedown,
My post may have come off as more definitive than I intended as I agree that rankings play a role, perhaps even an important one, during the college process. My point is that I think that the vast majority of consumers intelligently use the information that makes up the rankings. I have not seen the student or family who really says that X school is ranked 20th and Y school is ranked 25th and thus X must be better than Y. Generally, I have seen/heard the rankings used as a way to compare across different types of schools and geographies and make statistical comparisons and for identifying schools that are comparable (or not) to one another. </p>

<p>Frankly, the only way in which I have seen the USNWR rankings abused is that some will campaign for Peer Assessment scores as the true determinant of academic strength and they will then promote that score as the way to choose between schools. Ugh!</p>

<p>One big difference between private schools and public universities is that the publics tend to be more variable across their different programs. The university's College or Department of X may be extremely good and well-known, while its College of Y may be, well, eh. The Department of X has all the high SAT scorers, while the College of Y is full of slackers. The university's average SAT score tells you nothing in this case.</p>

<p>So the USNWR info has some use, but you have to do your own research in the areas you are interested in to discover the real bargains in public schools.</p>

<p>Quoted from comment by LBP: "IMO, this whole discussion of public vs private is going to change over the next few years as tuition fees become more and more beyond the reach of most middle income families. We're probably going to see state universities become increasingly competitive as more and more bright students opt to attend them."</p>

<p>I am worried about any ranking that relies mostly on the quality of the students attending a school. This tells me nothing about the quality of the TEACHERS or how accessible they are to students. IMO, this is what is wrong with the public high schools in my area. The teaching and classwork are not particularly good and the competition for grades is intense, and yet my son's high school consistently ranks highly because of standardized testing. I realize that high-quality students do allow for presentation of material at a higher level, so this measure can't be completely eliminated, but it should not be allowed to dominate as it currently does in high school rankings.</p>

<p>I didn't address that piece of it (why I may not agree that alumni giving is the only ranking factor that isn't beneficial to publics) but I'll chime in now--I think mommusic is right about SATs and other academic rank factors. </p>

<p>Public universities may feel compelled to offer a range of programs in service to their state, some of which may be less selective or which recruit and admit students based on talent in addition to academic factors. </p>

<p>Also a factor in SAT and GPAs is that public institutions are not as free to shape their class the way they wish, or in ways that maximize quality measures like those USNews uses. They may have legislative controls (or unwritten agreements) that they will fill a certain percentage of the class with residents. Access matters, too. I recall Berkeley's list of the below-1000 SAT people they admitted several years ago. I'm sure they didn't do Berkeley's stats any favors, but I had no doubt they deserved their shot at the state's flagship.</p>

<p>for the most part i think they are in their just rankings
i think the top publics that are on it are deserving of there place except for maybe Cal should be a bit higher</p>