A Review of the USNWR Approach: What Is Valuable?

<p>That is a good point. While USNWR is completely inaccurate at determining precise rankings, because the data it uses is flawed, it is reasonably accurate to within 10 or 20 places in terms of telling you which schools are best. The SAT ranges and selectivity rank are fairly accurate, and you can use those alone to rank schools. The problem lies in the financial/faculty resources categories and the student/faculty ratios, which are poorly measured.</p>

<p>"I'm not convinced that a school ranked 40th is a poorer investment than one ranked 30th."</p>

<p>That's because it may not be. The ranking is not valid to that degree of precision. If you look at the raw scores, there are three schools with near-perfect scores (HYP), then a gap, and then the lower-ranked programs clustered around raw scores of 91, 90, 89, and so on. Those are statistical ties. The difference between a school ranked 30th and one ranked 45th might be the difference between a raw score of 85 and 83. There's really no difference there. I wish USNWR would communicate this more clearly in its rankings, perhaps by using a tier system instead of a numerical rank system.</p>
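<p>To make the "statistical ties" point concrete, here is a minimal sketch in Python, using made-up raw scores rather than actual USNWR data: it groups schools into tiers and only starts a new tier when the gap between adjacent raw scores is larger than a couple of points.</p>

<p>
[code]
# Hypothetical raw scores -- illustrative only, not actual USNWR data.
raw_scores = {
    "School A": 100, "School B": 99, "School C": 98,   # near-perfect cluster
    "School D": 91, "School E": 90, "School F": 89,
    "School G": 85, "School H": 84, "School I": 83,
}

def tiers(scores, gap_threshold=3):
    """Group schools into tiers: a new tier starts only when the raw-score
    gap to the previous school exceeds the threshold."""
    ordered = sorted(scores.items(), key=lambda kv: -kv[1])
    groups, current = [], [ordered[0]]
    for prev, cur in zip(ordered, ordered[1:]):
        if prev[1] - cur[1] > gap_threshold:
            groups.append(current)
            current = []
        current.append(cur)
    groups.append(current)
    return groups

for i, tier in enumerate(tiers(raw_scores), start=1):
    print(f"Tier {i}: {[name for name, _ in tier]}")
# Tier 1: A, B, C   Tier 2: D, E, F   Tier 3: G, H, I
[/code]
</p>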

<p>
[quote]
Macalester, Colby, Kenyon, Bates, Bowdoin, Furman: the list of outstanding colleges that don't make the top 50 colleges in the United States is almost endless.

[/quote]
</p>

<p>Friedokra, are you sure that the schools you listed are NOT on the list of the 50 best COLLEGES? </p>

<p>While I don't disagree with your sentiments about smaller schools, USNews does not start and end with the first-page listing of universities that offer doctoral degrees. Actually, I think that for many readers of the magazine, and especially subscribers to the online version, the first pages provoke a yearly yawn and a mere glossing over of information that tends to be a quasi-repeat of prior years. On the other hand, the magazine is a great resource for understanding or learning about the changes at schools such as Reed, Sarah Lawrence, or Richmond. </p>

<p>The value of USNews remains the clear presentation of a massive amount of data that may or may not be available from other sources. The least important parts are the rankings themselves and the pretension that the data reveals the best QUALITY in education. Astute readers have learned to recognize when the magazine is listing objective data and when it borrows more from People magazine or the National Enquirer than from IPEDS. </p>

<p>USNews also remains a work in progress. It is obvious that the methodology has changed a lot since its inception: gone are the days when the magazine relied solely on its "reputational peer assessment," and gone are the days when a change of director resulted in the yo-yo years for Caltech. Now we see a magazine that seems to have (re)discovered a bit of a backbone, as its delisting of Sarah Lawrence shows. </p>

<p>One has to hope that Morse and his cohorts might pay more attention to the discontent about the excessive use of subjective and poorly defined elements, especially since it can be heard from such disparate sources as this forum and the otherwise rudderless group of rebels "led" by the Education Conservancy.</p>

<p>I have a few comments about the USNWR ranking:
1. There is a difference between the quality of an education and the value of an education. The ambiguity of the PA (peer assessment) mixes these two ideas--is the PA really an evaluation of the quality of instruction, or is it just a proxy for "name recognition," i.e., value? I would also suggest that alumni giving relates to perceived value, not quality.
2. The rating system weights the various factors the same through all tiers of schools. This is not realistic: the factors a person would use to decide whether Princeton is "better" than Yale are not the same factors one would use to compare two large state universities (see the sketch after this list). For example, it would be nonsensical to choose one Ivy over another because of graduation rates.
3. Faculty salaries are probably overstated as an important factor. How much will the salary influence an academic's decision on whether to teach at Harvard or at the University of Alabama? Thus, for example, while UT's great wealth may enable it to buy the best rare books, it will be less effective in luring top academics away from schools with better name recognition.
4. Finally, here's a suggestion: USNWR should develop a "College Exit Test" that it would get college graduates to take, and it could rate the colleges by how well the graduates do on the test.</p>
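<p>As a minimal sketch of point 2, with entirely hypothetical factor scores and weights (not the actual USNWR inputs), here is how a fixed weighting collapses different profiles into one composite number, and how the "winner" flips as soon as the weights change:</p>

<p>
[code]
# Hypothetical schools, factor scores (0-100) and weights -- illustrative only.
schools = {
    "School X": {"selectivity": 95, "graduation_rate": 97, "resources": 80},
    "School Y": {"selectivity": 88, "graduation_rate": 90, "resources": 95},
}

def composite(factors, weights):
    """Weighted sum of factor scores; weights should sum to 1."""
    return sum(weights[k] * factors[k] for k in weights)

weights_a = {"selectivity": 0.5, "graduation_rate": 0.3, "resources": 0.2}
weights_b = {"selectivity": 0.2, "graduation_rate": 0.2, "resources": 0.6}

for name, factors in schools.items():
    print(name,
          round(composite(factors, weights_a), 1),   # X wins under weights_a
          round(composite(factors, weights_b), 1))   # Y wins under weights_b
# Which school "wins" depends entirely on the weighting, yet one fixed
# weighting is applied to every school in every tier.
[/code]
</p>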

<p>I agree hunt - see my post from another thread:</p>

<p>The U.S. News rankings have been fatally flawed for quite some time now. The rankings reward certain types of institutions by using statistical metrics that may not have any relevance to educational quality. People who study this have pointed out many fundamental concerns, including institutions going so far as to boost their rankings in the short term (e.g., by taking out loans to build new facilities) even if doing so might bankrupt them in the long run.</p>

<p>A few concerns (note: a small portion of the text below is taken from informal personal discussions with university provosts, not directly from me):</p>

<p>-- Awarding ranking points for the percentage of faculty who are full-time and/or who have Ph.D.s penalizes universities and colleges that have particularly strong or large arts or music programs, since faculty in those areas often have special term appointments and often lack Ph.D. or equivalent degrees. In one case, a Midwestern institution developed strong partnership agreements with several local school districts whose teachers participated in the design and delivery of teacher-preparation programs. While U.S. News considers them "adjuncts," they can be seen as a strong asset for the program, even though they are part-time. They bring real-world applications to the table to balance the theorists. </p>

<p>-- The point system measuring student-to-faculty ratios fails to take certain factors into consideration, even at large universities like Harvard and Yale, such as cases where faculty in various professional schools (e.g., business, law, medicine, architecture) teach undergraduates yet are not counted as full-time "faculty resources" or toward the student-to-faculty ratio because their full-time appointments are not within arts and sciences. This is one of those areas that institutions report very inconsistently when providing data to U.S. News. </p>

<p>-- Measures of class sizes using percentages and particular cut-off points are easy to manipulate. More meaningful measures could easily be developed. Statisticians know that a school can have 10 classes with 1 student each and 10 classes with 39 students each, for an average class size of 20, which in no way reflects the experience of any of the students. </p>
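<p>A quick sketch of that arithmetic, using the hypothetical 10-and-10 example above: the per-class average is 20, but the average class size an actual student sits in is about 38.</p>

<p>
[code]
# 10 classes of 1 student and 10 classes of 39 -- the example above.
class_sizes = [1] * 10 + [39] * 10

# Average over classes: what a "class size" statistic typically reports.
per_class_avg = sum(class_sizes) / len(class_sizes)

# Average over students: weight each class by how many students sit in it.
total_students = sum(class_sizes)
student_experienced_avg = sum(s * s for s in class_sizes) / total_students

print(per_class_avg)                       # 20.0
print(round(student_experienced_avg, 1))   # 38.05 -- most students are in the big classes
[/code]
</p>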

<p>-- Awarding points based on faculty salaries punishes colleges and universities with strong arts, language, or humanities programs, because science and business professors tend to draw significantly higher salaries on average simply because of their value in the marketplace. Schools like Johns Hopkins, MIT, Cornell, or Purdue that have a relatively high(er) percentage of their faculty teaching in the sciences get an enormous boost while other universities suffer. This system actually PUNISHES schools for hiring an additional English professor versus an additional biology professor. It also punishes schools with strong religious traditions, particularly Catholic schools, where members of the order may take very low pay as part of their vocation. The whole idea that the more you spend, the better the students learn is pretty absurd.</p>
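<p>To see that mix effect in isolation, here is a minimal sketch with hypothetical salaries and headcounts: both schools pay identical salaries within each discipline, yet the science-heavy school reports a much higher average simply because of its faculty composition.</p>

<p>
[code]
# Hypothetical average salaries by discipline -- identical at both schools.
salary = {"science": 140_000, "business": 150_000, "humanities": 85_000}

# Hypothetical faculty headcounts: one school is science/business-heavy,
# the other is humanities-heavy.
headcount = {
    "Science-heavy U":    {"science": 600, "business": 200, "humanities": 200},
    "Humanities-heavy U": {"science": 200, "business": 100, "humanities": 700},
}

for school, counts in headcount.items():
    total_pay = sum(salary[d] * n for d, n in counts.items())
    total_faculty = sum(counts.values())
    print(school, round(total_pay / total_faculty))
# Science-heavy U: 131000    Humanities-heavy U: 102500
# Identical pay within each field, yet the averages differ by tens of
# thousands of dollars: rewarding average salary rewards the faculty mix,
# not the generosity of the pay.
[/code]
</p>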

<p>"-- Awarding points based on faculty salaries punishes colleges and universities with strong arts, language or humanities programs, because science and business professors tend to draw significantly higher salaries on average just based on their virtues in the marketplace. "</p>

<p>Not to mention law professors and medical and dental school professors. I wonder who is policing which professors are or are not included in these numbers. And even where some discipline is exercised, I'm sure there are many individuals who hold appointments in a university's arts & sciences college as well as its med school et al., yet spend no time with undergrads.</p>

<p>Moreover, costs of living, and prevailing wage rates for nearly all professions, differ significantly across the country. US News does have some sort of cost-of-living adjustment, I believe, but I don't know how well its adjustment corresponds to what my adjustment would be. I recently moved from the Midwest to the Northeast, and a salary 2x higher than before does not even yield break-even lifestyle purchasing power. Not when real estate costs are higher still, and every other item is also higher. This is the reality.</p>
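<p>A minimal sketch of that break-even arithmetic, with hypothetical salaries and cost-of-living multipliers (not any official index): deflate each salary by a local cost index before comparing.</p>

<p>
[code]
# Hypothetical cost-of-living indices (Midwest city = 1.0 baseline) -- illustrative only.
cost_index = {"midwest_city": 1.0, "northeast_city": 2.3}

old_offer = {"salary": 80_000,  "location": "midwest_city"}
new_offer = {"salary": 160_000, "location": "northeast_city"}   # nominally 2x

def purchasing_power(offer):
    """Salary deflated by the local cost index."""
    return offer["salary"] / cost_index[offer["location"]]

print(round(purchasing_power(old_offer)))   # 80000
print(round(purchasing_power(new_offer)))   # ~69565 -- 2x the salary, less real purchasing power
[/code]
</p>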

<p>Monydad, very few, if any, of a school's dentistry/law/med professors would be included in USNWR's figures from what I can gather, but you still raise a valid point. Overall, the figures are just not calculated consistently, and as I explain above, certain types of universities are rewarded at the expense of others. Faculty salaries may be adjusted by cost-of-living indices, but the indices used may not be accurate either. No ranking is ever going to be 100% perfect, but some schools are certainly being punished because of USNWR's flawed methodology... so take the rankings for what they are worth. Look at the raw scores, not the numerical rankings, and realize that a difference of 2 or 3 points in the raw score is completely insignificant. Use the individual pieces of data by themselves -- especially the useful ones, such as the selectivity index -- and take with a grain of salt the student-faculty ratios and the "financial resources" and "faculty" ranks, which are probably only accurate to within 10 or 20 rank places. </p>

<p>For something like financial resources, a simple list showing endowment per student is going to be much more useful. You'll see there are only a couple of institutions with more than $1,500,000 per student in endowment - Yale and Princeton - and that most of the "elite" schools hover around $300,000 per student or less. If one has $310,000 per student and another has $290,000, that's definitely not a deciding factor -- consider those tied. But if one has $310,000 and another has $110,000, that's a fairly significant difference that will likely be reflected in the quality of programs at the institution. For something like "faculty resources," instead of relying on a student-faculty ratio number that may be very inaccurate, calculate for yourself the number of students in the most popular departments and compare that to the number of faculty or the number of courses offered... or better yet, talk with a couple dozen students and faculty about the quality of education and types of instruction at a college you are considering. Obviously there is a huge gap between a 20:1 ratio and a 6:1 ratio, but the difference between 6:1 and 7:1 or 8:1 might be a rounding error, or the result of certain faculty not being counted even though they teach undergraduates.</p>
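<p>A minimal sketch of that do-it-yourself comparison, with hypothetical endowment and enrollment figures: compute endowment per student yourself and treat small gaps as ties, flagging only the large differences.</p>

<p>
[code]
# Hypothetical endowment (USD) and enrollment figures -- illustrative only.
schools = {
    "School A": {"endowment": 6_200_000_000, "students": 20_000},
    "School B": {"endowment": 5_800_000_000, "students": 20_000},
    "School C": {"endowment": 2_200_000_000, "students": 20_000},
}

per_student = {name: d["endowment"] / d["students"] for name, d in schools.items()}

def compare(a, b, tie_band=0.15):
    """Treat two schools as tied if their per-student figures are within ~15%."""
    hi, lo = max(per_student[a], per_student[b]), min(per_student[a], per_student[b])
    return "effectively tied" if (hi - lo) / hi <= tie_band else "a real gap"

print({k: round(v) for k, v in per_student.items()})
print("A vs B:", compare("School A", "School B"))   # ~310k vs ~290k -> effectively tied
print("A vs C:", compare("School A", "School C"))   # ~310k vs ~110k -> a real gap
[/code]
</p>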

<p>
[quote]
4. Finally, here's a suggestion: USNWR should develop a "College Exit Test" that it would get college graduates to take, and it could rate the colleges by how well the graduates do on the test.

[/quote]
</p>

<p>Why have USNews do this? I'm a little leery of letting USNews decide what's important for a college graduate to know. There are instruments out there already, carefully developed.</p>

<p>As far as I can tell, USNews does not adjust for the proportion of science faculty. So it is possible for places with large science (or perhaps economics) faculties to be rewarded even if those professors are not particularly well paid within their fields.</p>

<p>However, faculty salaries are published elsewhere in surveys that account for department, and I have seen confidential data in detail, so I can say that the relative ranking on faculty salaries matches the USNews faculty resources ranks pretty well. This implies most universities are not sneaking professional-school faculty into their reports.</p>

<p>There is no way to objectively quantify what is essentially a subjective assessment of a quality education... and thus "rankings," no matter what statistical analysis you apply, are always fatally flawed. Even if you look at the student body alone, in SAT scores and GPAs, and even if it's clear that Princeton has the top 10% of kids and Mississippi State something significantly lower, the "quality of education" is not just a measure of how smart the kids in class are.</p>

<p>Talk to 100 kids at 100 different colleges and you will rarely find someone disenchanted to the point of beating up on the school they attend over the quality of classes. If they are unhappy, it's usually for some other reason. Most kids say, "My school is amazing. The profs are incredible," and other superlatives.</p>

<p>Elitists will always separate themselves from the rest of society. I am not suggesting that Mississippi State is on a par with Princeton, so before anyone jumps on me for that one, I was merely making an anecdotal reference and analogy to suggest that "amazing" professors can be found almost everywhere, regardless of the SAT scores of the incoming classes. Indeed, a lot of Ivy League graduates become those professors at some off-the-beaten-path schools.</p>

<p>Prestige is often a measure of the ultra-selectivity of a school, the cost of tuition, and sometimes the smaller size of the students (and thus a low student/teacher ratio, which presumably means a better classroom environment for learning... but not necessarily so). </p>

<p>Some people go to Ohio State, Michigan, Oklahoma, Texas, UCLA, or other prestigious state schools despite their gargantuan size... and would not consider going anywhere else. And those schools also have amazing professors and programs and a LOT of kids who scored VERY high on the SAT and were valedictorians or in the top 10%.</p>

<p>So to answer the question, "What is the best school in the country?" you really have to say, "Which one?"</p>

<p>I meant smaller size of the student body... not the students themselves. Sorry, my bad. lol.</p>

<p>"Why have USNews do this? I'm a little leery of letting USNews decide what's important for a college graduate to know. There are instruments out there already, carefully developed."</p>

<p>I was kidding about this, mostly.</p>

<p>Sorry, I'm far too literal sometimes. :)</p>