2018 US News Best Colleges rankings have been released

Here is a UC Irvine research paper on which universities rose in the US News Rankings between 1999 and 2007.

USC
Wash U

http://files.eric.ed.gov/fulltext/ED493830.pdf

UPenn’s rise predated this window.
UChicago and Northeastern followed this window.

It would be interesting to see if UC Irvine rose after this study…

For those who don’t believe Slate, the logarithmic adjuster is mentioned in the paper.

If schools were going to game the ranking, the easiest way to make the biggest impact would be:

  • Pick more kids with perfect test scores (average test score metric is worth ~8% of the overall score)
  • Exclude any kid who isn’t in the top 10% of their class. (percent of enrollees in top 10% of hs class is worth ~3%…)

Admit rate is worth just 1.25% of the overall score. US News puts much more stock in the quality of the kids who enroll than in the number of apps a school receives.

Anyway, test scores and top 10% are worth more than 10% of the overall score and could be manipulated fairly easily, assuming you have plenty of applicants (and presumably enrollees) with top-notch stats.

So becoming less holistic would be one fairly inexpensive way to inch your way up the ranking, assuming you didn’t simultaneously get worse in other areas…
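
To put rough numbers on that, here is a toy sketch of a weighted composite (not USNWR’s actual formula; the weights are just the approximate figures quoted above, with everything else lumped into an “other” bucket, and the 0-100 factor scores are invented):

```python
# A toy weighted composite, not USNWR's actual formula. Each factor is
# assumed to be pre-normalized to a 0-100 scale; the weights are the rough
# figures quoted above, with everything else lumped into "other".
WEIGHTS = {
    "test_scores": 0.08,    # average test scores of enrollees (~8%)
    "top_10_pct":  0.03,    # share of enrollees in top 10% of HS class (~3%)
    "admit_rate":  0.0125,  # admit rate (~1.25%)
    "other":       0.8775,  # graduation rates, peer assessment, resources, ...
}

def composite(scores):
    """Weighted sum of normalized factor scores (each 0-100)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

baseline = {"test_scores": 80, "top_10_pct": 80, "admit_rate": 80, "other": 80}
# "Gaming" scenario: max out the two easily manipulated factors, hold the rest flat.
gamed = dict(baseline, test_scores=100, top_10_pct=100)

print(round(composite(baseline), 2))  # -> 80.0
print(round(composite(gamed), 2))     # -> 82.2  (a ~2.2 point bump on a 100-point scale)
```

A couple of points on a 100-point composite isn’t much at the very top, but it could easily be enough to shuffle positions in the crowded middle of the list.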

lols… not only is USNWR gaming results… it’s doing so for political reasons.

from the paper cited by @Mastadon

In the 2000 rankings U.S. News standardized all variables in its
ranking model, a procedure that catapulted CalTech into #1, displacing first-ranked Harvard,
Princeton and Yale. The following year, after a hefty dose of criticism from baffled readers,
including some from the displaced Ivies themselves, U.S. News adjusted each school’s research
spending according to the ratio of its undergraduate to graduate students and applied a
logarithmic adjustor to deal with “so-called statistical outliers.” CalTech (the statistical outlier)
was pushed back into fourth, and Harvard, Princeton and Yale were back on top.

if you don’t like the results because of “outliers,” you can always make up new results with the USNWR logarithmic adjuster.
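
To see what that adjustor actually does to an outlier, here is a toy sketch with invented spending figures (not the paper’s data):

```python
import math
from statistics import mean, stdev

# Toy illustration of standardizing a spending-per-student factor with and
# without a log transform. The figures are invented (thousands of $): a pack
# of schools in a 40-100 band plus one Caltech-style outlier at 300.
pack = [40 + i * 0.6 for i in range(100)]   # 40.0, 40.6, ..., 99.4
field = pack + [300.0]                      # the outlier joins the field

def z(values, x):
    """Standardized (z) score of x relative to all values."""
    return (x - mean(values)) / stdev(values)

outlier, best_of_pack = 300.0, max(pack)

# Plain standardization: the outlier towers over the best of the pack.
print(round(z(field, outlier), 2), round(z(field, best_of_pack), 2))

# "Logarithmic adjustor": log-transform first, then standardize. The same
# school's standardized lead over the best of the pack is roughly halved,
# which is how a #1 factor score gets quietly deflated.
logs = [math.log(v) for v in field]
print(round(z(logs, math.log(outlier)), 2), round(z(logs, math.log(best_of_pack)), 2))
```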

it’s more an exercise in politics and curve fitting for your desired results than anything else.

you can’t make this stuff up:)

I wonder why USNews feels the need to separate universities and colleges when they purport to be comparing undergrad experiences.

I don’t know about you, but my kids had both types of schools on their lists. Why the artificial separation?

(This came to mind when I looked at Tufts ranked among the national Us when it’s really much more like a LAC…then thought why bother making that distinction anyway? Just make one list).

@sbballer never worry about Caltech. For what they are, what they offer, and how they operate, they are absolutely unique and field-leading. Hence their dominance of the world university rankings. There is a strong argument that they are too sui generis in numerous material ways to be compared to the other schools.

@Mastadon UC Irvine has been climbing, especially the law school since Chemerinsky became its dean.

@OHMomof2 I would contend that there should be even more separation into different types of schools, rather than the two large categories that exist now. Different types of schools should be evaluated on different attributes. This is one of the ideas in Malcolm Gladwell’s critique of this largely useless ranking endeavor at

https://www.newyorker.com/magazine/2011/02/14/the-order-of-things

where he makes an analogy with trying to come up with a single ranking for automobiles. It just makes absolutely no sense to have a single ranking of cars, when some people need a sedan, some need an SUV, some need a truck, and some want a sports car. Why should a truck be evaluated primarily on its gas mileage, or a sports car primarily on its load capacity? There are useful evaluations of cars, such as those produced by Consumer Reports, but even they don’t try to produce a single ranking that fits all situations or needs.

A single list would imply that all colleges and universities are aiming (or even worse, should be aiming) for the same ideal. Even two lists downplay the importance of having a diversity of educational models, which is one of the strengths of intellectual and academic discourse that exists in this country. It does a great disservice to students, as well, who should be looking for a college that fits them. One or two lists cannot possibly meet the need for students to find the right fit for their own individual circumstances.

Ahhh, so:

Top STEM schools
Top Humanities schools
Top Social Science schools
Top Schools for Undergrad Focus (might want to just rank the universities, since it’s basically a foregone conclusion for LACs)
Top Schools for Future PhDs
Top Schools for Future Executives
Top Schools for Future Award Winners
Top Schools for Middle Management
Nerdiest/Preppiest/Crunchiest/Most Jock-Laden Campuses
Top Schools for Gainful Employment
Most Intellectual Atmospheres
Most Pre-Professional Atmospheres
Schools with the Smallest Classes, by Major
Schools with the Most Undergrad Research
Schools with the Most Undergrad Internships
Schools with the Best Party Scene
Schools with the Best Outdoor Activities
Schools that Spend the Most on Undergrads
Schools that Make it Easy to Change Majors
Schools that Offer the Best Financial Aid
Schools Ranked by Test Scores

This is starting to remind me of the various Princeton Review rankings. So yeah, if a kid had a specific fit variable that was of the utmost importance, niche rankings could help.

@vonlost “I suggest not taking the CC Top lists too seriously.”

I suggest not taking any list that purports to rank a diverse group of colleges and universities very seriously!

But since the whole discussion is about ranking, I was curious, though I promise not to take whatever the reason turns out to be very seriously. Perhaps back when CC began, someone decided to use the heading “CC Top Liberal Arts Colleges” for the LACs that generated the most discussion or interest among CC users. So Trinity and Whitman were among those and Bucknell, Franklin & Marshall and Scripps were not, at least back then.

Maybe something like this:

  • rank just the Ivies, so Harvard, Yale, etc.
  • rank STEM-focused schools, e.g., MIT, Caltech
  • rank public schools by themselves
  • rank LACs as they are now
  • rank non-Ivy privates

Then break the list into groups of 10 and list the schools alphabetically within each group so you don’t have to deal with ties. It would also be tougher to game, since movement within a band would not show up.
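
One way to implement that banding (a sketch; the school names and their starting order below are just placeholders):

```python
# Minimal sketch of the banding idea: take an ordered list of schools,
# chop it into bands of 10, and present each band alphabetically so there
# is no visible #7-vs-#8 distinction left to game.
def band_and_alphabetize(ranked_schools, band_size=10):
    bands = []
    for start in range(0, len(ranked_schools), band_size):
        bands.append(sorted(ranked_schools[start:start + band_size]))
    return bands

ordered = ["Harvard", "Princeton", "Yale", "Columbia", "MIT",
           "Stanford", "Chicago", "Penn", "Duke", "Caltech",
           "Dartmouth", "Brown", "Cornell", "Northwestern", "Rice"]

for i, band in enumerate(band_and_alphabetize(ordered), start=1):
    print(f"Band {i}: {', '.join(band)}")
# A school that climbs from, say, 9th to 6th stays in the same band,
# so small year-to-year moves no longer show up in the published list.
```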

"undergrad at a top 10 school, I would think given the preference a lot of the MBA programs give to their undergrad students, the ranking just based on that and their undergrad ranking would be as follows

  1. Harvard, Stanford
  2. MIT, Penn, UChicago
  3. Columbia, Yale
  4. Duke, Northwestern, Princeton"

Princeton doesn’t have a business school, so it shouldn’t even be on the list; Dartmouth has an excellent business school; and Yale is not a top-10 business school. Swap Princeton out for Dartmouth, swap Yale for Berkeley or Michigan, and move Kellogg into tier 2. This is how I would rank them for undergrad:

Harvard, Stanford, Penn (Wharton)
Kellogg, Chicago
MIT, Columbia, Duke
Dartmouth, Michigan

And btw, a lot of the top business schools prefer applicants who didn’t do their undergrad there.

Blah…blah…Yale is filled with snowflakes***
Blah…blah…Chicago is on the rise…
Blah…blah…Harvard will always be the standard
Blah…blah…Stanford really is the be all…
ARE WE DONE?

NO. WE ARE NOT DONE!!

He just moved to Berkeley.

“Political reasons”–how piquant! how pernicious! how–how–how bafflingly heteronomous! One almost forgets–in the wake of such perfidy–the QS’s and THE’s y/w/rankers whoreson shilling for (the whoremasters) Oxford and Cambridge, and the recent *inward motions* leading all these great divers(e) Rankings Corporations (© ) to sustain the Harvard-Stanford-MIT cabal at the top. Presumed the “world’s best universities,” they are found to be the best in all things–from public service to innovation to social mobility to research to teaching to perfectibility to social ig(n)(m)obility to imperfectibility. Equally odd are the equestrians mounting the Stanford/Chicago hobbyhorses through and throughout these venerable fora.

I always find it weird that UW Seattle is ranked so low in the US News rankings. UW’s research output is among the highest in the US, but USNWR doesn’t seem to care :stuck_out_tongue:

@tonymom: “ARE WE DONE?”

Not until we’ve had a thorough airing of the Stevens vs. NJIT debate, including a rigorous analysis of which school has proven its superiority in statistical science by better gaming the USNWR ranking system.

UW has gone from 41 in 2011 to 56 this year.

http://publicuniversityhonors.com/2016/09/18/average-u-s-news-rankings-for-126-universities-2010-1017/

It was 41 or 42 from 2008 to 2012. It was 42 in 1997 apparently.

http://publicuniversityhonors.com/2015/06/13/u-s-news-national-university-rankings-2008-present/

UW’s enrolled student profile is similar to UCSD, SB, D and I (e.g., 26-32 ACT, 3.79 unweighted GPA), but those schools have held steady in the upper 30s to mid-40s range during that same period.

Research output and departmental strength are not part of the US News College Ranking methodology.

(http://publicuniversityhonors.com/rankings-academic-departments-private-elites-vs-publics/)

You will see higher rankings for UW in the THE (25), ARWU (13), NTU (6), CWUR (27), US News Global (11), etc. rankings, which include research output and impact as factors. I certainly enjoyed my graduate studies there many years ago: great city and surrounding area, beautiful campus, state of the art facilities, brilliant faculty and outstanding students at all levels and in all departments. Perhaps the upper left corner gets overlooked by those 70 guidance counselors who fill out the surveys. Maybe its grad rate or financial resources have not kept up with some of its peers. My understanding is that state appropriations cover less of UW’s budget each year; at the same time, the school is among the top recipients of federal research grants and private donations.

I have lived and worked near UCSD for almost 30 years and am always pleased to see it and the other UCs perform well whatever yardstick is applied. I consider the UW to be in the mix with UCSD, SB, D and I, and that’s how the kids in this area seem to think of it. Perhaps someone with access to the US News data could identify the ranking factors the UW would need to improve if it wanted to rejoin its cousins to the south in the upper 30s to mid 40s range. I should add, the UW administration does not seem too concerned about it.

http://www.seattletimes.com/seattle-news/education/in-u-s-news-rankings-washington-colleges-stay-steady-but-critics-raise-more-questions-about-lists-criteria/

The average graduation rate is one of the most heavily weighted factors in the USNWR national university ranking. About 80 universities on the Kiplinger “best value” lists (~60 private, ~20 public) have higher four-year graduation rates than UW.

Research production is an important factor in several graduate program/department rankings.
The US News and Forbes undergraduate rankings don’t consider it directly (although it may have some influence on the heavily weighted US News peer assessment scores). Washington Monthly does consider “research”; it ranks UW 14th overall among national universities. However, the WM ranking doesn’t seem to get as much respect (or attention) among CC posters as the USNWR or Forbes rankings do.

@tk21769 US News uses the six-year graduation rate, not the four-year rate.

"Graduation rate performance (7.5 percent): This indicator of added value shows the effect of the college’s programs and policies on the graduation rate after controlling for spending and student characteristics, such as standardized test scores, high school class standing and the proportion receiving Pell Grants. U.S. News measures the difference between a school’s ** six-year graduation rate ** for the class that entered in 2010 and the rate U.S. News had predicted for the class. New this year, the proportion of science, technology, engineering and math – STEM – degrees out of the total degrees granted are included for the National Universities ranking category only.

If the school’s actual graduation rate for the 2010 entering class is higher than the rate U.S. News predicted for that same class, then the college is enhancing achievement or overperforming. If a school’s actual graduation rate is lower than the rate that U.S. News predicted, then it is underperforming."

Edited to add: even this metric is fuzzy. “vs the rate U.S. News had predicted for the class.” **How do they predict?**

“New this year, the proportion of science, technology, engineering and math – STEM – degrees out of the total degrees granted are included for the National Universities ranking category only.” **What does this even mean relative to the graduation rate metric?**

Washington Monthly also lists the 6 year graduation rate (83%, in the case of UW) and a “predicted” graduation rate (72%) for each school.

http://washingtonmonthly.com/2017college-guide?ranking=2017-rankings-national-universities

I do not know how US News predicts graduation rates, but this is how Washington Monthly does it:

“A college’s graduation rate (from the IPEDS) counted for 20 percent of the social mobility score, with half of that being determined by the reported graduation rate and the other half coming from comparing the reported graduation rate to a predicted graduation rate based on the percentage of Pell recipients and first-generation students, the percentage of students receiving student loans, the admit rate, the racial/ethnic and gender makeup of the student body, the number of students (overall and full-time), and whether a college is primarily residential. We estimated this predicted graduation rate measure in a regression model separately for each classification using average data from the last three years, imputing for missing data when necessary. Colleges with graduation rates that are higher than the ‘average’ college with similar stats score better than colleges that match or, worse, undershoot the mark.”
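
For anyone who wants to see the mechanics, here is a bare-bones sketch of that general approach with invented numbers and only two of the predictors WM lists (Pell share and admit rate). It is not either publication’s actual model, just an illustration of “regress, predict, then score the residual”:

```python
import numpy as np

# Bare-bones version of the "predicted graduation rate" idea: fit a linear
# model of graduation rate on a couple of institutional characteristics,
# then score each school by its residual (actual minus predicted). All
# numbers are invented; the real models use far more schools and predictors
# (first-gen share, loan uptake, demographics, enrollment, residential mix).
#                 pell_share  admit_rate  grad_rate
data = np.array([[0.15,       0.10,       0.95],
                 [0.20,       0.25,       0.90],
                 [0.30,       0.50,       0.80],
                 [0.35,       0.60,       0.72],
                 [0.40,       0.70,       0.65],
                 [0.25,       0.45,       0.83]])

X = np.column_stack([np.ones(len(data)), data[:, 0], data[:, 1]])  # intercept + predictors
y = data[:, 2]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
predicted = X @ coef
performance = y - predicted                   # + overperforming, - underperforming

for actual, pred, perf in zip(y, predicted, performance):
    print(f"actual {actual:.2f}  predicted {pred:.2f}  performance {perf:+.3f}")
```

Schools whose actual rate lands above the fitted line score as “overperforming,” which is essentially what both the US News and WM blurbs above describe.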