Tufts not reporting top 10% of high school class/US News decline

@brianboiler -

One could challenge this line of reasoning…

The US News ranking started out (back in the ’80s) as a pure academic peer ranking (i.e., as a measure of prestige within academia). Data was introduced after a couple of years, in order to make it appear more objective. The problem is that the check on the accuracy of the formulas was whether or not HYP ended up at the top. So one could argue that US News is actually perpetuating the use of prestige as a ranking methodology, not eliminating it, and doing so in a non-transparent fashion.

The interesting paradox of the US News ranking is that confirmation bias is arguably its biggest technical weakness, but also its primary reason for success. For research universities, it is really a measure of how closely another school matches the model established by HYP, irrespective of the absolute goodness of that model for undergraduate education.

Food for thought:

If US News accurately represents quality of undergraduate education, then why do research universities need to be separated from LACs?

@Mastadon I agree it isn’t perfect, but I do think it is the best vehicle we have, and maybe not even for the reason it was intended.

Like it or not, a large subset of students, parents, and other stakeholders do use it as a data point in their decision-making process. Students and parents use it for obvious reasons, but so do other stakeholders who make decisions on research and donations. Prospective faculty will use this information to help justify leaving one university for another. Granted, each subset uses that data to decide whether to support or, more often, not support a university.

The universities themselves (with the exception of the instances mentioned in this thread) have validated the ranking system by modifying their processes to move up in those rankings. Every Director/VP/Dean of Admissions understands the flaws you mentioned in the calibration method, and they have collectively decided they have little problem with it. As long as the three schools continue to calibrate each other (which, you’d have to admit, is better than a single standard), your model is directionally correct. Now, we may begin to see other schools creep in there as well, maybe this year if things continue on the trajectory they have been on for a few years.

I think we as consumers need to realize there is essentially no difference in the level of education you can get at any institution ranked 1-15, or even 1-20, in the USNWR ranking. Similarly, there’s probably not much difference between 21-100. Before I get all sorts of hate from lovers of that second group: I am making a broad statement, not a precise one, and as you can tell from my handle, I’m a firm supporter of a member of that group. The point is that these are all great schools, and a difference of 1 or 10 places isn’t that meaningful in the eyes of the student.

Without it, things like student:faculty ratio, graduation rates, earnings at graduation, and whatever other factors weigh in would have little or no check. Schools in the State of Illinois system would begin to cut faculty and grow enrollment just to pay the bills.

Few could argue effectively against the claim that the US higher-education system dominates the world market. I suggest that this ranking helps make that happen.

Now to your question of why research universities and LACs are scored separately: can you really expect to compare Williams with the University of Michigan? Bowdoin with UChicago? (Although I often wonder why the College at UChicago isn’t stacked up against the rest of the LACs.)

My biggest problem with the rankings is the ranking numbers themselves. The ranking should be done in bands. The difference between a rank of 23 and a rank of 26 on the US News list could be just one point, because of ties: if three schools tie at 23, the next school is ranked 26.
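To make the tie behavior concrete, here is a minimal sketch of “standard competition” ranking, which appears to be the convention US News follows: tied schools share a rank, and the next rank skips ahead by the size of the tie. The school names and scores below are invented purely for illustration.

```python
# A minimal sketch of "standard competition" ranking: tied schools share a rank,
# and the next school's rank skips ahead. All names and scores are invented.
scores = {"School A": 71, "School B": 68, "School C": 68, "School D": 68, "School E": 67}

# A school's rank is 1 plus the number of schools with a strictly higher score.
ranks = {
    school: 1 + sum(1 for other in scores.values() if other > score)
    for school, score in scores.items()
}

print(ranks)
# {'School A': 1, 'School B': 2, 'School C': 2, 'School D': 2, 'School E': 5}
```

Here a single point (68 vs. 67) separates rank 2 from rank 5, just as a single point could separate rank 23 from rank 26 in the published list.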

The other problem I have is the self-fulfilling prophecy of the rankings. The guidance counselor rankings play a part, and the more competitive a school is to get into, the better the school is perceived by counselors and students. Thus it gets a better score from counselors the next year, which in turn raises its ranking.

All that said, I think you folks care more about the rankings than Tufts does. I don’t think they’ve ever been very concerned about it, to be honest. They turn away 4.0/36-ACT kids in favor of lower-“ranked” students all the time. If they cared about the ranking game, they wouldn’t do that, and it hasn’t really changed the perceived quality of the school. “Tufts syndrome” wasn’t invented in a vacuum. Boston is hot hot hot. I’m guessing they don’t care about a ranking three spots below what it could be. Good for them.

@aegis400 -

https://www.usnews.com/education/best-colleges/articles/how-us-news-calculated-the-rankings

We are dealing with a multivariable calculation that includes a “weighting” step, as well as “normalization” and “transformation” steps that are not fully specified. To create the “transformation” to a 100-point scale, I believe each school’s score is expressed as a percentage of the highest score. Since the highest score can change from year to year, an individual school’s score can also change from year to year, even if none of its underlying indicators have changed.

This means we need to be careful when drawing conclusions from year-to-year comparisons of any particular school’s score attributed to the change of a single variable, because the other variables and the scale itself could also have changed.
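As a concrete illustration, here is a small sketch of that rescaling step under the assumption stated above (each school’s weighted score divided by the year’s highest weighted score, times 100). All numbers are invented; the point is only that a school’s displayed score can move when the top of the scale moves, even if its own indicators do not.

```python
# A hedged sketch of the assumed rescaling: each school's weighted score is
# divided by the year's highest weighted score and shown out of 100.
# All numbers are invented for illustration; this is not actual US News data.
def displayed_scores(raw):
    """Rescale raw weighted scores so that the top school displays 100."""
    top = max(raw.values())
    return {school: round(100 * score / top, 1) for school, score in raw.items()}

year_1 = {"School X": 9.0, "School Y": 7.2}  # School X sets the top of the scale
year_2 = {"School X": 9.5, "School Y": 7.2}  # School Y's raw indicators unchanged

print(displayed_scores(year_1))  # {'School X': 100.0, 'School Y': 80.0}
print(displayed_scores(year_2))  # {'School X': 100.0, 'School Y': 75.8}
# School Y's displayed score drops from 80.0 to 75.8 even though nothing about
# School Y changed; only the top of the scale moved.
```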

Any updates on this? The rankings are clearly bs but they matter nonetheless :confused:

@Earwicker I decided to not contact them for a while since they made it very clear that they got the memo. That said, Tufts is historically apathetic towards rankings, so I’m not sure if this is all going to work.

@ucfdefere “Tufts is historically apathetic towards rankings?” Where did you hear that? Tufts Syndrome is called Tufts Syndrome because that’s the school that started it. Messing around with yield protection to help their odds has been so detrimental to the stress levels of high school seniors that schools doing this clearly care more about boosting their rankings than about the future college students themselves. It’s one of the worst aspects of the modern-day college admissions process.

@JenJenJenJen Tufts hasn’t yield-protected since the ’90s. I have a friend who got into all of HYPSM as well as Northwestern and Tufts.

I also don’t see how a school protecting itself against not having enough yield to fill a class is necessarily a bad thing.

Yup, I have a son from the high school class of 2011 who got into Brown, UChicago, WashU, UPenn, and Tufts. I think Tufts has been using yield protection only in the way that the top 20-30 schools have been using it: to make sure they have the optimal class size. It and its peers are now too selective to care about “Tufts syndrome.”

Could this explain why Tufts is now rated on US News as “MORE Selective” when it used to be rated as “MOST Selective”? At #29 on US News, Tufts is reported as accepting 16% of applicants. Yet NYU, ranked at #30, accepts twice as large a share of applicants (32%) and is listed as MOST Selective.

What is going on at US News that they would rank selectivity like this? It makes no sense.

@AnonymousOutWest
Yes. Even though some would say it is impossible to pinpoint exactly which factor led to Tufts’ drop in the rankings this year, the fact that Tufts did not report class rank this year and is listed as “MORE selective,” while schools with twice its acceptance rate such as BC and NYU are listed as “MOST selective,” points to selectivity almost certainly playing a role in the decline. From this, it is very likely that not reporting class rank is the ultimate reason Tufts fell in the overall rankings.

@Dawala282
Thanks, and I agree that this is the likeliest explanation.

How frustrating that Tufts doesn’t seem to care about correcting this unnecessary and damaging error caused by simple lack of reporting. As an alum who makes annual contributions, I’m extremely disappointed.

@AnonymousOutWest Tufts has a 98/99 selectivity rating on the Princeton Review, though

Has anyone heard back from Tufts?
It seems so nonsensical that a world-class university like Tufts hasn’t fixed this simple mistake in two years.

@aegis400 I’ll call again tomorrow.

Update (the following is from Tufts’ 2017-18 Common Data Set: http://provost.tufts.edu/institutionalresearch/files/CDS_2017-2018.pdf)

With the publication of this year’s Common Data set, Tufts University resumes the practice of posting information related to the proportion of enrolled first-year students who graduated in the top 10 percent of their high school classes (“Top 10%”).

Good that they’re reporting it, but strange that the top 10% figure is relatively low compared to peer schools. Then again, most of my friends were in the top 10% in high school but didn’t actually end up reporting a class rank.

@aegis400 amazing news! We did it!!!

@Tufts2021 It doesn’t really matter if it’s relatively low, in my eyes. Only 26% of incoming freshmen reported class rank, and even if only 80% of that 26% were in the top 10% of their class, it still gives Tufts a higher score on US News & World Report.
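For what it’s worth, the back-of-envelope arithmetic behind that hypothetical looks like this (the 26% figure comes from the post above; the 80% figure is purely the poster’s assumption, not reported data):

```python
# Back-of-envelope arithmetic for the hypothetical above. The 26% figure is
# taken from the post; the 80% figure is an assumption, not actual data.
reported_rank_share = 0.26    # share of incoming freshmen who reported class rank
top10_among_reporters = 0.80  # hypothetical share of those reporters in the top 10%

# Share of the entire freshman class that is confirmed top-10%:
confirmed_top10 = reported_rank_share * top10_among_reporters
print(f"{confirmed_top10:.0%}")  # 21%
```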