What would your ranking system be?

<p>A lot of us take issue with US News & World Report's rankings because we don't think their methodology is sound. Perhaps if we created our own ranking system (and only at the very last step actually ranked the universities), the results would be more interesting.</p>

<p>The way that USNWR does it:</p>

<p>Ranking category / subfactor: weight</p>

<p>Peer assessment survey: 25%</p>

<p>Student selectivity total: 15%
Acceptance rate: 10%
High school class standing, top 10%: 0%
High school class standing, top 25%: 40%
SAT/ACT scores: 50%</p>

<p>Faculty resources total: 20%
Faculty compensation: 35%
Percent faculty with top terminal degree: 15%
Percent full-time faculty: 5%
Student/faculty ratio: 5%
Class size, 1-19 students: 30%
Class size, 50+ students: 10%</p>

<p>Graduation and retention rate total: 25%
Average graduation rate: 80%
Average freshman retention rate: 20%</p>

<p>Average educational expenditures per student: 10%</p>

<p>Average alumni giving rate: 5%</p>

<p>(Subfactor weights are percentages of their category, not of the whole ranking.)</p>

<p>Source: <a href="http://www.usnews.com/usnews/edu/college/rankings/about/weight_brief.php">http://www.usnews.com/usnews/edu/college/rankings/about/weight_brief.php</a></p>

<p>I'm curious to see how different we would make it.</p>

<p>A quarter of the whole thing is the peer assessment survey. I wish they gave more information on it, but just the fact that "about 57 percent of those surveyed responded" is disappointing, because that's a pretty small percentage.</p>

<p>I don't understand why they give a percentage for "Class size, 50+ students", and I think the weight given to that should be added to the student/faculty ratio. </p>

<p>Why is faculty compensation weighted so much more heavily than "Percent faculty with top terminal degree"?</p>

<p>I'm not sure what I think about the alumni giving factor. They say "The percent of alumni giving serves as a proxy for how satisfied students are with the school," but there are so many assumptions built into that: the rich tend to donate more, and those with MBAs, master's degrees, or other graduate degrees usually have more money to give. So should they count only alumni with undergraduate degrees who didn't continue their education? That wouldn't be fair either to colleges where a lot of students go on to graduate school. Given the complications, I wouldn't give much weight to alumni giving.</p>

<p>I think there should definitely be consideration for high school class standing, top 10%, as that would help separate the more competitive students and make the ranking more precise.</p>

<p>What do you guys think?</p>

<p>and silence. I would have thought that all those people bashing rankings would have had more to say.</p>

<p>I've been thinking about it - the faculty resources section seems pretty oddly configured.</p>

<p>While a few things may be weighted differently than many people, myself included, would weight them, the USNWR is in fact the standard -- and part of that is because it is a very good ranking system, imo. There are a few schools that people would place differently, but as a general rule the tiers are correct, and the "misplaced" schools are very close to one another. Taken with a grain of salt, like anything else, it's a good methodology -- especially for the school "overall": not just undergraduate education or graduate education and the like, but also university impact.</p>

<p>To borrow some from others:</p>

<p>Rankings would include the school's dedication to research, creation of new knowledge, dedication to student education, intellectual commitment and atmosphere, and how it values ideas.</p>

<p>It is the intense, strenuous, and constant intellectual activity of the place that would be an essential component of the weighting. The rigorousness and comprehensiveness of the curriculum would also have to be considered as well as the quality of, and access to, faculty.</p>

<p>I think that the debate really ought to be about what we want the rankings to show. If we want it to reflect the education students are getting, then certainly factors like student-faculty ratio and class sizes ought to be weighed more heavily, but if we want to be judging the institution itself, then I see factors like salaries and endowments playing a larger role. I will look more closely at it and post later on what I feel a fair and balanced (no pun intended) ranking system would be.</p>

<p>Endowments and such are a tricky business to evaluate. Is it per student or overall? Unrestricted vs. restricted dollars? Does it account for whether the school is public (where the money an endowment would cover is typically provided by the state), or for cost of operation per student in an area vs. endowment per student? And those are just a few questions off the top of my head. Salaries would also have to reflect cost of living; I'm not sure one can pay a Stanford faculty member enough to outweigh the cost of living in Palo Alto (though there is university-subsidized housing). Others have looked at faculty membership in organizations such as the American Academy of Sciences as an indicator, which would put schools like the University of Washington far ahead of most of the Ivy League. If it is the number of CEOs hired per year by the Fortune 500, the University of Texas and U of C would be tied for the current number 1. Is it surveys of student satisfaction with academic and social life? A recent survey put Harvard at the bottom of the current so-called top 30 schools (talk about a place in need of reform). If it is name recognition, then Harvard, UTexas, UMichigan, Duke, and UCLA probably are top ranked.</p>

<p>All rankings are going to be flawed and should be discontinued altogether. Let each school explain itself and why it might be the fit for the student, and let it go at that.</p>

<p>The USNWR weightings for the rankings are completely arbitrary. In an environment focused on real research questions, one would approach the issue in a very different way (although when I describe that way, you may realize what USNWR really does...).</p>

<p>Essentially, one takes a measured outcome value, such as, for discussion purposes, the four-year graduation rate. This is the dependent variable. Then one uses the rest of the data as predictor, or independent, variables. Through statistical analysis (multiple regression, factor analysis, etc.), one determines which predictors matter, and how much. The "how much" is the weights. The book "The Early Admissions Game" made good use of this approach, and discusses it a bit.</p>
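<p>For anyone curious what that looks like mechanically, here is a minimal sketch in Python. The file name and column names are made up for illustration; any per-school table with an outcome column and a few predictor columns would work the same way.</p>

<pre><code>
# Sketch of the "derive the weights from the data" idea described above.
# schools.csv and its column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression

schools = pd.read_csv("schools.csv")

# Dependent variable: the measured outcome (here, four-year graduation rate).
outcome = schools["four_year_grad_rate"]

# Independent variables: everything we might treat as a predictor.
predictor_cols = [
    "acceptance_rate",
    "sat_act_percentile",
    "student_faculty_ratio",
    "spending_per_student",
]
predictors = schools[predictor_cols]

model = LinearRegression().fit(predictors, outcome)

# The fitted coefficients play the role of the "weights": how much each
# predictor moves the outcome, holding the others constant.
for name, coef in zip(predictor_cols, model.coef_):
    print(f"{name}: {coef:+.3f}")
</code></pre>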

<p>So, here's what I think USNWR did. They had rankings in mind, and then developed predictor weights to give them the rankings they wanted. They really had (and have) no other choice. If their rankings deviated too far from popular perception, the rankings would have been dismissed as flawed, irrelevant, etc., especially w/r/t the top schools.</p>

<p>Chicago? it's just caught up in the whole process.</p>

<p>One of the items that would be interesting to see would be hospitalizations for substance ODs.</p>

<p>The HS class rank factor is muddied by the fact that many schools (especially prep and magnet schools) don't report rank.</p>

<p>I believe there is an article published by two USNWR staffers that described the methodology used, and it was pretty close to newmassdad's guess at how they did it. </p>

<p>I'm opposed to rankings even though Chicago probably benefits from them more than it is hurt.</p>

<p>How about a student happiness poll, or something like that? Wouldn't that be potentially important in evaluating an undergraduate institution? Something simple like: if you had to choose again, would you have gone to ....</p>

<p>The real killer in the rating business is that data is collected that would really matter in decision making, but is either kept private or released too selectively to be useful. Examples:</p>

<p>Surveys of student satisfaction. USNWR published some of this data a few years ago. Sorry I don't remember the title. But it was only for about 30 colleges, and none of the elites (no surprise...)</p>

<p>Alcohol abuse. A researcher at the Harvard School of Public Health collects and studies alcohol abuse data in detail for a number of colleges. The problem is that the schools will only cooperate with him if he keeps the identity of each school confidential. They will only allow him to publish aggregate data, even though he generates reports for individual campuses.</p>

<p>The last issue above I found most disappointing. It was my first introduction to the crass disinformation used by many schools in their control of messaging.</p>

<p>We parents and our kids are the losers.</p>

<p>Why doesn't it change? For the same reason a lot of folks buy Mercedes-Benzes in spite of their dismal repair record and horrendous cost of ownership. Far too many people are buying prestige, image, and bragging rights, not educations. And the rankings are great for them.</p>

<p>This survey (posted elsewhere as well), taken by the Consortium on Financing Higher Education, was leaked rather than voluntarily made public; one can see why.</p>

<p><a href="http://www.wtopnews.com/index.php?nid=104&sid=459269#%5B/url%5D"&gt;http://www.wtopnews.com/index.php?nid=104&sid=459269#&lt;/a&gt;&lt;/p>

<p>To be the unhappiest of these says something: </p>

<p>Amherst College | Barnard College | Brown University | Bryn Mawr College | Carleton College | Columbia University | Cornell University | Dartmouth College | Duke University | Georgetown University | Harvard University | Johns Hopkins University | Massachusetts Institute of Technology | Mount Holyoke College | Northwestern University | Oberlin College | Pomona College | Princeton University | Rice University | Smith College | Stanford University | Swarthmore College | Trinity College | University of Chicago | University of Pennsylvania | University of Rochester | Washington University in St. Louis | Wellesley College | Wesleyan University | Williams College | Yale University</p>

<p>The best way to evaluate a college is to subscribe to the student newspaper. If a school does not have a "free press," that is, an independent student newspaper, that's a very bad sign. It means the message is entirely controlled by the administration!</p>

<p>The Maroon provides all kinds of details about college life that the administration may not want prospective students to know -- crime stats, things to do, information on racial incidents, etc. By reading the paper over a period of time, you get a feel for the school, the relative importance of scholarship and athletics, the everyday concerns of the students (mold in the bathrooms, coffee shop closures, etc.) as well as the major issues on campus (military recruiters, views of the new president, etc.)</p>

<p>Most campus newspapers accept advertising -- check out who competes for the students' attention: frats with keg parties, community events, and so on.</p>

<p>As I make a decision about the right school for me, I find the "primary data" in the school newspaper far more valuable than the secondary data in "insiders guides."</p>

<p>This is an interesting observation that has some merit. One must be aware, however, that editorial views, which may also help determine what counts as news and where it is placed, do change from time to time. It might be best to do this over a period of a few years.</p>

<p>After thinking about it, I guess I made my own system.</p>

<p>Instead of having the peer assessment survey be worth 25%, I think it should be 15% at most, since a large portion of people don't respond to it and we don't really know what's asked in it. Is it related to students' relative happiness with the college? idad, I agree that it's an important factor. Do you think it should be included in the rankings, and if so, how?</p>

<p>I agree with the student selectivity for the most part. I think it should be worth 25% in total, but I think the HS class standing percentages should be like this:</p>

<p>top 10%: 100
top 25%: 75
top 50%: 50</p>

<p>So if 70% of the class was in the top 10%, 20% was in the top 10-25%, and 10% was between the top 25% and 50%, the school would get 70 + 15 + 5 = 90% of the allocated percentage. On the allocation itself I agree with USNWR: class standing should be worth 40% of the student selectivity total, which in my system is 25% of the whole ranking.</p>
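<p>In case the arithmetic is unclear, here is a small sketch of that class-standing score; the function name is just for illustration.</p>

<pre><code>
# Class-standing score as proposed above: each band earns a credit factor
# (top 10% = 100, top 10-25% = 75, top 25-50% = 50), and the score is the
# share of the class in each band times that factor.
def class_standing_score(pct_top_10, pct_10_to_25, pct_25_to_50):
    """Arguments are whole-number percentages of the class, e.g. 70 for 70%."""
    return (pct_top_10 * 100 + pct_10_to_25 * 75 + pct_25_to_50 * 50) / 100

# The worked example from the post: 70 + 15 + 5 = 90.
print(class_standing_score(70, 20, 10))  # 90.0
</code></pre>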

<p>I think the weight for % faculty with a top terminal degree should be greater than the weight for faculty compensation, but that's only because I'm still not quite sure of the importance of faculty compensation. Anyway, for now I think it should be:
Faculty resources total: 15%
Faculty compensation: 25% (I guess...)
% faculty w/top terminal degree: 50%
% full-time faculty: 25%</p>

<p>As for class sizes...I made this silly thing:
Total worth: 10%
1-5: 100
5-10: 90
10-15: 80
15-20: 70
20-25: 60
25-30: 50
30-40: 40
40-50: 30
50+: 20</p>

<p>Same goes for student/faculty ratio total.</p>
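<p>Here is a quick sketch of how that class-size table (and, by extension, the same table for student/faculty ratio) could be applied as a lookup. The function name and the handling of bucket boundaries are just for illustration.</p>

<pre><code>
# Map an average class size to the 0-100 credit proposed above.
def class_size_score(avg_class_size):
    # Buckets are (lower bound, score), checked from the largest sizes down.
    buckets = [(50, 20), (40, 30), (30, 40), (25, 50), (20, 60),
               (15, 70), (10, 80), (5, 90)]
    for lower_bound, score in buckets:
        if avg_class_size > lower_bound:
            return score
    return 100  # classes of 1-5 students

print(class_size_score(12))  # 80
print(class_size_score(60))  # 20
</code></pre>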

<p>I think class size and student/faculty ratio should each get 10% of the whole thing. However, I'm not sure. Is class size more important? The important point is that the student can access the professor during and outside of class. If the class is small, it's more possible to do so. If there is a high student/faculty ratio, I guess the professors are stretched thinner and are less likely to spend as much time with individual students. BUT, students would usually use individual help during class rather than outside it, so perhaps class size deserves a bigger % than the student/faculty ratio.</p>

<p>I think average graduation rate is important, but I'm not sure how much weight we should give it. 20% seems very big, though. What does this measure? Isn't it more dependent on the student than on the institution? Regardless, I don't think it should be worth more than class size and student/faculty ratio put together, as it is with USNWR. At most, I'd say 10%, but I don't know. Why do you guys think it's important?</p>

<p>I'm satisfied with the freshman retention rate and average educational expenditure per student values (5% and 10%, respectively), but I give 0% to alumni giving.</p>
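<p>Putting the pieces together, here is a rough sketch of how the weights I've floated in this post would combine into an overall score. The component scores in the example are made-up numbers, and the weights are my proposal, not USNWR's (with graduation rate at the 10% I suggested above).</p>

<pre><code>
# Proposed overall weighting (sums to 100%): each component is scored 0-100,
# and the overall score is the weighted sum.
WEIGHTS = {
    "peer_assessment":         0.15,
    "student_selectivity":     0.25,
    "faculty_resources":       0.15,
    "class_size":              0.10,
    "student_faculty_ratio":   0.10,
    "graduation_rate":         0.10,
    "freshman_retention":      0.05,
    "expenditure_per_student": 0.10,
    "alumni_giving":           0.00,
}

def overall_score(component_scores):
    return sum(WEIGHTS[name] * component_scores.get(name, 0) for name in WEIGHTS)

# Made-up component scores for a hypothetical school.
example = {
    "peer_assessment": 80, "student_selectivity": 90, "faculty_resources": 85,
    "class_size": 70, "student_faculty_ratio": 75, "graduation_rate": 95,
    "freshman_retention": 98, "expenditure_per_student": 60, "alumni_giving": 40,
}
print(overall_score(example))
</code></pre>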

<p>I think it would be interesting to analyze the student newspaper. However, it might be hard converting that data into rankings. But I agree that it is very informative and useful. </p>

<p>How would you guys suggest alcohol abuse come into play with rankings? I guess it could come in student satisfaction...</p>

<p>"All rankings are all going to be flawed and should be discontinued altogether"</p>

<p>This is why I was curious how others would carry out rankings. What's important to me may not be important to you, and vice versa. While I believe some rankings have a lot of flaws, I still think the existence of rankings is important; they're just meaningless if they don't represent what we value, which is why I started the thread. I wanted to see what others care about, as opposed to me, and how they would want USNWR to form its rankings. Pointless? Perhaps. But I still have 11 days till my UChicago deadline...plenty of time to procrastinate :)</p>

<p>I've just had a change of heart. Rankings are stupid. I think they'll mean much more for grad school. </p>

<p>Yeah.</p>

<p>In light of the discussion we are having here, I think that this article will surely bring a new perspective into the conversation; after all, rankings would never be so contentious if we didn't all have preconceived notions about higher education institutions. I find that this article deals with the subject masterfully:</p>

<p><a href="http://www.newyorker.com/critics/atlarge/articles/051010crat_atlarge%5B/url%5D"&gt;http://www.newyorker.com/critics/atlarge/articles/051010crat_atlarge&lt;/a&gt;&lt;/p>

<p>Also, do note the references to the University of Chicago, which, I am glad to say, I think are quite accurate.</p>

<p>Frankly, use of graduation rate in college rankings is worthless. How can that possibly be a reliable indicator of a college's academic strength? Note that this makes up 25% of US News's ranking. Colleges can creampuff their curriculums and raise their 6-year graduation rates to the sky, but that does not make them any stronger or more rigorous. Sure, graduation rates have some value in screening out truly awful institutions, but when we're evaluating schools in the top tier, they're counterproductive.</p>