US News Rankings: A few surprises

“ I really think people need to pay attention to their own/their child’s mental health instead of chasing prestige and ranking. Happiness is priceless.”

I couldn’t agree more

@milee30
That’s not an accurate conclusion from the data. Maybe at U of C grades are too low for students to transfer to a school that is just as reputable, so they stay; or maybe prestige is the most important criterion to U of C students.

" I really think people need to pay attention to their own/their child’s mental health instead of chasing prestige and ranking. Happiness is priceless."

@nrtlax33 …while no school is for everyone and some people will find issues with any of them, I totally agree that looking out for your child’s mental health and happiness matters more than blindly chasing prestige and rankings.

@emorynavy Huh?

That is not how I was taught to apply Ockham’s razor, where the simplest explanation is usually the right one. I think it is much simpler to read a high freshman retention rate and a high graduation rate as signs that students actually like it there. That is much more plausible than a conspiracy theory that they care more about prestige.

It cannot simply be rejected out of hand like you did. That is how rating services interpret the number when they use it.

“Maybe at U of C grades are too low for students to transfer to a school that is just as reputable, so they stay as maybe prestige is the most important criteria to u of C students.”

Sure, we can assume that a group of students whose general qualifications are at the same level or greater than Stanford, MIT, Harvard, etc. are just too dumb as a whole to select a college that fits them and then are too dumb to be able to get acceptable grades at college and also too dumb to figure out any acceptable alternatives to the Hell Hole they’re stuck in.

Or… there are 1700 or so students out of the millions who graduate HS each year that find they are a reasonable fit and that UChicago offers them a great education. This is not so difficult to picture. No college is a fit for every person so 100% satisfaction and blind love is just never going to happen anywhere. But it’s not hard to picture that even the more extreme colleges - such as UChicago - can attract and retain a couple of thousand of top students who seek and enjoy the challenge.

In a world with so many varied tastes and preferences is it really so hard for people to understand that something that doesn’t appeal to them may have great appeal to someone else?

Deleted; sent in a PM.

US News sent me this email:
Thank you for writing U.S. News.

We are trying to provide the most accurate information we can. Often this is dependent on the schools we survey.

The following is a link to articles in which we address some recent concerns, as well as change in the way we do our rankings.

https://www.usnews.com/education/blogs/college-rankings-blog

The Best Colleges rankings were never intended as the sole source of information about colleges. Rather, a beginning point for parents and students planning to go to college.

Regards,
Richard Hare
Customer Relations

@tk21769 I think we really need to take the IPEDS Instruction spending with a grain of salt (and ranking services should as well). If you look at some other schools that should probably be comparable, you will see they vary widely. Harvard and Yale have similar tuition levels to Chicago, larger endowments, and higher research spending. Harvard lists instruction at $49K per student in IPEDS, Chicago at $88K, and Yale at $111K. If you look at all IPEDS categories combined (instruction, research, academic support, etc.), Yale is $232K per FTE, Harvard is $162K, and Chicago is $144K. I would say that differences in the way they report are likely the primary driver of those wide disparities.
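To make that concrete, here is a minimal sketch (Python, using only the per-FTE figures quoted above) of how one might look at instruction’s share of total reported spending; the dictionary layout and the share calculation are my own illustration, not how IPEDS or any ranking actually structures the data.

```python
# Per-FTE figures (in $K) quoted in the post above; the share calculation below
# just highlights how differently "instruction" sits within each school's total.
spending_per_fte = {
    "Harvard": {"instruction": 49, "all_categories": 162},
    "UChicago": {"instruction": 88, "all_categories": 144},
    "Yale": {"instruction": 111, "all_categories": 232},
}

for school, cats in spending_per_fte.items():
    share = cats["instruction"] / cats["all_categories"]
    print(f"{school}: instruction = {share:.0%} of total reported spending per FTE")
# Harvard ~30%, UChicago ~61%, Yale ~48% -- a spread that is hard to explain by
# anything other than differences in how the categories are reported.
```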

@IzzoOne It’s impossible to spend more than $40K per student. Do you think that if a university builds a new building, every student will use it?

Anyone want to speculate why Fordham dropped so much in the rankings?

<Yes, SuzyQ is trying to change the subject from UC and UofC!>

@suzyQ7

I’ll give you a hint. “It’s Fake News”. Sorry I couldn’t resist.

@SuzyQ7-

US News made several changes to their formula this year.

1. Pell student graduation rate was added with a weight of 2.5%
2. Pell student relative graduation rate was added with a weight of 2.5%

To accommodate the additions, the following weights were dropped:
1. High school guidance counselor survey weight dropped by 2.5%
2. SAT scores weight dropped by 0.375%
3. Top 10% of class weight dropped by 0.875%
4. Acceptance rate weight dropped by 1.25% (it is now 0%)
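
As a quick sanity check on the numbers above, the added Pell weights exactly offset the weight removed elsewhere; a minimal sketch (weights as listed, in percentage points):

```python
# Weights (in percentage points) as listed above; this just confirms the
# reallocation nets to zero.
added = {"pell_grad_rate": 2.5, "pell_relative_grad_rate": 2.5}
removed = {"hs_counselor_survey": 2.5, "sat_scores": 0.375,
           "top_10_percent_of_class": 0.875, "acceptance_rate": 1.25}

assert sum(added.values()) == sum(removed.values()) == 5.0
print(f"weight added: {sum(added.values())}%, weight removed: {sum(removed.values())}%")
```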

So, Fordham’s normalized score dropped due to the combined effect of four possible reasons:
1. It did not do as well as the schools around it in the new Pell-related metrics
2. It did better than the schools around it in the metrics whose weights were reduced
3. Statistical noise
4. The value of a metric changed by a statistically significant amount (the least likely explanation)

If one had a paid subscription to USNews, one could look at the scores for each metric and figure it out.

Note that a small change in normalized score can result in a large change in rank if the scores of the schools around it are equal or very close.
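
To illustrate that last point, here is a minimal sketch with invented normalized scores: the same half-point drop costs several ranks when the surrounding scores are tightly clustered, and none when they are spread out.

```python
# Invented normalized scores, purely to illustrate rank sensitivity.
def rank_among(score, other_scores):
    """1-based rank of `score` against the other schools (higher score = better)."""
    return 1 + sum(s > score for s in other_scores)

clustered = [68.0, 67.6, 67.4, 67.2, 67.0]      # peers packed within one point
print(rank_among(67.8, clustered), "->", rank_among(67.3, clustered))    # 2 -> 4

spread_out = [80.0, 70.0, 65.0, 60.0, 55.0]     # peers spaced five points apart
print(rank_among(75.0, spread_out), "->", rank_among(74.5, spread_out))  # 2 -> 2
```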

https://www.usnews.com/education/blogs/college-rankings-blog/articles/2018-09-10/whats-new-in-the-2019-us-news-best-colleges-rankings

@IzzoOne

AFAIK, none of the major rankings use those IPEDS numbers directly. USNWR uses related (but more specific) indicators such as class size and faculty compensation, which presumably drive up the IPEDS instructional spending number (ceteris paribus). UChicago does seem to have small class sizes and high faculty salaries even compared to some peer institutions.
https://www.usnews.com/education/best-colleges/articles/ranking-criteria-and-weights

I don’t disagree with your point, though. It isn’t entirely clear (to me at least) what the spending figures include; I’m not 100% confident that every college is calculating their numbers the same way. When I referred to instructional spending at the bottom of post #465, it was to reinforce my claim that UChicago seems to have a rather strong focus on undergraduate teaching (which was my personal experience as a student there, albeit many years ago). If others don’t trust the numbers I’m citing (or the ones USNWR uses), fine. See for yourself by visiting classes and talking to current undergrads there. Or look elsewhere (in particular at LACs, if you want consistently strong undergraduate focus).

WSJ/THE uses IPEDS numbers directly.
The “finance per student” metric in the Wall Street Journal/Times Higher Education College Rankings 2019 methodology (https://www.timeshighereducation.com/wall-street-journal-times-higher-education-college-rankings-2019-methodology?mod=article_inline) uses “instructional expenses” and “student services” from IPEDS. The variables come from the IPEDS files F1516F1A and F1516F2; the variables are F1C011/F1C061 and F2E011/F2E051.

Total finance variable (numerator) = instructional expenses + student services

FTE (denominator) = FTE graduate + FTE undergraduate students
FTE variables are from the IPEDS file EFIA2016; the variables are FTEUG and FTEGD.

All files are the provisional files, i.e., the latest available release as of April 2018.
They also apply an RPP (regional price parity, from the Bureau of Economic Analysis) adjustment to this variable before normalizing it across all ranked institutions.
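
Putting the pieces above together, here is a minimal sketch of that calculation. The IPEDS variable names are the ones cited above, the input numbers are invented, and the z-score normalization at the end is my assumption about what “normalizing across all ranked institutions” means.

```python
import statistics

def finance_per_student(instructional_expenses, student_services,
                        fte_undergrad, fte_grad, rpp):
    """RPP-adjusted (instructional expenses + student services) per FTE student.

    Spending inputs correspond to F1C011/F1C061 and F2E011/F2E051; the FTE
    inputs correspond to FTEUG and FTEGD. All values passed in below are invented.
    """
    total_finance = instructional_expenses + student_services   # numerator
    fte = fte_undergrad + fte_grad                               # denominator
    return (total_finance / fte) / rpp                           # deflate by regional price parity

# Three hypothetical schools (invented figures):
raw = [
    finance_per_student(9.0e8, 1.5e8, 6000, 9000, rpp=1.12),
    finance_per_student(4.0e8, 0.8e8, 5000, 4000, rpp=1.05),
    finance_per_student(2.5e8, 0.6e8, 7000, 2000, rpp=0.98),
]

# Assumed normalization step: standardize across the ranked institutions.
mean, sd = statistics.mean(raw), statistics.stdev(raw)
print([(x - mean) / sd for x in raw])
```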

I hope this clarifies the transparency of WSJ/THE’s methodology as opposed to USNWR’s “methodology”.

You won’t get much out of U.S. News College Compass. See https://www.usnews.com/education/best-colleges/articles/college-compass-faq
Each school submits its own interpretation of the data, and none of the critical data is made available for inspection. Just remember, about 2/3 of their “surveys” have been thrown out by their “experts”.

Each person only has one experience from one college. All other information is secondhand. If someone likes their own college, it only means the college is a good fit for that person. Nobody knows what else is out there. My brother-in-law used to fly United Airlines. Once he used code-sharing to book one leg on an ANA (All Nippon Airways) flight, and since then he has refused to book United flights despite the fact that ANA flights usually cost a couple hundred dollars more. Totally different experience. The difference is おもてなし (omotenashi) (http://www.romajidesu.com/dictionary/meaning-of-omotenashi.html).
Read https://guide.michelin.com/sg/features/omotenashi/news

@Mastadon pointed out that “4.Acceptance rate weight dropped by 1.25% (it is now 0%).”

Is yield still part of the USNWR rubric and what weight is it given?

@nrtlax33
Are there any methodology differences between 2018 and 2019 versions of WSJ/THE rankings? If so, can one see any trends as to which colleges are moving up or down significantly?

Yield is not part of the USNWR rubric. It was indirectly linked to acceptance rate.

@ccdad99 : I think WSJ/THE learned from one of the biggest criticisms of USNWR’s methodology: the lack of data smoothing. They are using data from multiple years, or cumulatively, in many places. For example, student surveys are from two years, and value-added salary and value-added loan default rate data are from three consecutive years.
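
As a rough sketch of what that kind of smoothing amounts to (invented numbers, and a plain moving average standing in for whatever weighting WSJ/THE actually uses):

```python
# Invented yearly values; a simple mean over the most recent N years stands in
# for the multi-year pooling described above (2 years of surveys, 3 years of
# salary / loan-default data).
def smoothed(values_by_year, window):
    recent = values_by_year[-window:]
    return sum(recent) / len(recent)

survey_scores = [7.9, 8.4]                 # two survey cycles
loan_default_rates = [0.031, 0.027, 0.025] # three consecutive years

print(smoothed(survey_scores, window=2))       # 8.15
print(smoothed(loan_default_rates, window=3))  # ~0.0277
```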

Remember, this is only the third year they are doing this. It takes some time to smooth the data out. You might see a brand-new product on Amazon with a hundred fake 5-star ratings, but six months later you might see hundreds of 1-star complaints from real buyers. Read https://www.wsj.com/articles/how-sellers-trick-amazon-to-boost-sales-1532750493 In fact, this “news” is nothing new outside the US; it is news only in America.

@nrtlax33
I get your point about fluctuations in a ranking system only 3 years old. However, using data from multiple years is also a feature of USNWR rankings. Obviously, it makes sense for both.

@nrtlax33
I stand corrected. So WSJ/THE does use some of the IPEDS spending numbers.

Although, just from comparing these two pages …
https://www.usnews.com/education/best-colleges/articles/ranking-criteria-and-weights
https://www.timeshighereducation.com/wall-street-journal-times-higher-education-college-rankings-2019-methodology?mod=article_inline
… I don’t conclude that WSJ/THE is far more transparent than USNWR. Maybe others can see something I’m missing. My experience on CC has been that for just about any college performance metric one cites (whether it’s from the CDS, IPEDS, NSF, Payscale, wherever) one is likely to prompt debate about confounding factors, “gaming”, sample sizes, definitions, reporting instructions, etc.

Doesn’t each school furnish its own IPEDS data, too? As we’ve already seen up-thread, the IPEDS data doesn’t seem to be any more above reproach and scrutiny than the USNWR/CDS data. It’s easy to cry “fake news” about any report one doesn’t happen to like. Where are the better, truer numbers? Who should be doing the inspection?