The culture of the older New England colleges can be traced back to their religious origins.
HYP were founded by Calvinist sects of the Protestant religion. Calvinists believed in predestination and the notion of “the elect”: the elect would go to heaven and all others would go to hell. The earthly indicators of being among the elect were social standing and monetary success (i.e., US News ranking).
Tufts was the only New England college founded by the Universalist sect. Universalists did not believe in “the elect” or hell. They believed that everyone was equal and everyone would go to heaven (independent of US News ranking).
This means that the “HYP-level” colleges are actually the ones that “can’t do that” without dire consequences, because the resulting drop in rank could land them in hell.
@gallentjill For one year, no, but the trend will continue and that will be a problem on multiple fronts. As they say, it is what it is: whether you like it or not, rankings have a real effect.
@jsparrow17
Schools in the vicinity of Tufts in the rankings (NYU, UNC, U Rochester, W&M, etc.) are all clustered within just 1-3 points of Tufts. Considering that admissions selectivity, professor:student ratios, and graduation rates at all of the top 30-35 schools have been improving at unprecedented rates, Tufts consistently losing 3.25 points definitely jeopardizes its position in the top 30 this year or soon after. Once the ranking starts slipping, especially with its selectivity categorized as merely “more selective,” there will be a ripple effect on Tufts’ national & global reputation (not to mention the peer assessment score), turning the current slip into a vicious cycle.
After all, Tufts shouldn’t be ranked at #29 (which is still really good); it should be ranked around #18-#25 if it paid attention to the ranking metrics. (Its endowment per student exceeds that of Georgetown & CMU; its admissions selectivity is on par with Notre Dame, Georgetown, Cornell, and Wash U; its student resources are on par with leading small liberal arts colleges; graduation rate? one of the highest…) I guess the Forbes ranking more accurately captures Tufts’ peer institutions, ranging from Notre Dame and Wash U to Middlebury.
I honestly want all rankings like the U.S. News to go away forever. For now, though, I guess I will just stop paying attention to them, for the reasons Mastadon brought up. After all, I was initially drawn to the Tufts community for its unpretentiousness and quirky vibe, which made me feel comfortable and engaged in a way other elite institutions didn’t.
It is my hope, though, that the Tufts admissions office & administration will take the necessary actions to correct the misleading U.S. News ranking. Maybe the Tufts Daily can write a piece about it? Maybe the current admissions dean can address the issue (I believe the admissions dean directs the completion of the common data set)? Maybe President Monaco or the board of trustees can fix this?
I don’t know the answer, but hopefully, it gets resolved.
I called one more time and was able to get confirmation (from one of the admissions officers I had contacted previously) that the entire admissions office, including the dean Karen Richardson, received my message and is working towards resolving the issue alongside the office of institutional research. I’ll let them be for now; hopefully they can submit the updated common data set to US News & World Report before the 2019 rankings are established.
Based on some forensics work, it appears that Karen may have inherited a “ticking time bomb” from Lee Coffin.
He appeared to be phasing out the top 10% metric because it was supplied by so few applicants (see quotes below).
The 2015-16 CDS (the first where the 10% data was omitted) was based on data from the class admitted in May of 2015.
Lee Coffin announced in Feb 2016 that he was leaving his position as dean of admissions at Tufts for a promotion to provost of enrollment at Dartmouth (Tufts had just surpassed Dartmouth in early admissions applicants, so Dartmouth had to do something).
Karen became Tufts’ new dean of admissions in June 2016.
The impact of the omission did not show up until Fall of 2017 when the US News Ranking based on the 2015-16 CDS came out.
Wow… Lee Coffin seems like a very unprofessional guy… He must have known the consequences of not reporting the top 10% metric – I found many articles in the Tufts Daily in which he talks about Tufts’ selectivity ranking (it was ranked 15th) in US News – but chose to phase it out anyway and suddenly left Tufts. Meanwhile at Dartmouth, he submits all the data to US News… If he truly believes in what he said, why is he continuing to report the average class rank & top 10% metrics at Dartmouth…??? When I looked at the 2014-2015 CDS, the last year Tufts reported the metrics, almost 40% of incoming freshmen did submit high school class rank.
Lee may not have known the implications, or the decision could have been made at a lower level (the name of the person filling out the CDS changes from the year before the metric was dropped to the year it was dropped; neither person is still at Tufts). At Dartmouth he may not be involved, and they may just spend more time trying to optimize the rankings. Until recently, a group of posters from Dartmouth used to post negative messages on this board before the ED application date and after decisions were released. That is an example of how competitive they are…
“high school rank counts for 3.25 points of the entire ranking score”
So, how does it work? Even if a college has only, say, 10% of applicants who report rank, does the rank data that is reported factor 100% into the class rank component of US News? If so, that would be a deceptive number.
Also, you are assuming in your posts that the missing ranking component would be favorable to Tufts, which may not be the case. The high schools that report rank may be super competitive, so some of the kids admitted to Tufts from there may not be in the top 10%.
@suzyQ7 According to the 2014-2015 common data set, the last year Tufts reported high school rank, 90% of students reported being in the top 10% and almost 100% within the top 25% (40% of enrolled freshmen reported the data).
Tufts was ranked 15th for admissions selectivity in 2016 and, by not providing this metric, it plummeted to 57th.
And, yes, this metric factors 100% into the class rank component, unfortunately, and it is evident that Tufts admissions & administration did not know the consequence of not submitting the data.
@suzyQ7 not reporting the class rank is equivalent to not having any incoming students be in the top 10% of their class (which most are, in the case of Tufts).
Even for high schools that do not rank, when GCs fill out the LOR they have to check off a box indicating whether the student is top 1%, 5%, 10%… in academics, rigor of courses, etc. You can see it on the form. Not sure why adcoms couldn’t use that to report.
If high school rank counts for a possible 3.25 points and Tufts didn’t report that statistic, then they received 0 out of 3.25 possible points. Reporting a “poor” statistic would only help their score, just not as much as it would a school with a higher top-10% share. But I don’t think anyone believes that Tufts’ statistic is any less stellar than that of other colleges in its class.
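To put rough numbers on that logic, here is a back-of-the-envelope sketch in Python. The 3.25-point weight is from US News’s published methodology, but the scoring function itself is my own assumption (US News doesn’t say how the component is actually computed), so treat it as an illustration of the “0 out of 3.25” point rather than the real formula.

```python
# Back-of-the-envelope sketch of the class-rank component of the US News score.
# Assumption (not confirmed by US News): the component scales the school's
# top-10% share linearly into the 3.25-point weight, and a missing value
# simply scores zero.

CLASS_RANK_WEIGHT = 3.25  # points out of 100 in the overall score

def class_rank_points(top_10_share):
    """Points earned for this component, or 0.0 if the statistic wasn't reported."""
    if top_10_share is None:            # school omitted the CDS item
        return 0.0
    return CLASS_RANK_WEIGHT * top_10_share

print(class_rank_points(0.90))   # ~2.93 points if 90% of the class was top 10%
print(class_rank_points(0.50))   # ~1.63 points even for a "poor" statistic
print(class_rank_points(None))   # 0.00 points when nothing is reported
```

Under that assumption, even a weak reported number beats an omission, which is exactly the point being made above.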
That being said, this infatuation with these rankings is ridiculous. If anyone really cares whether the school they’re associated with drops two or three spots, they need to find a better use of their time and brain power.
I’m curious how other universities around #30 are able to come up with that number, as I suspect the populations are similar and just as many applicants at those schools come from high schools that don’t report class rank. There has to be some method. To just say no to the system doesn’t seem to be the right approach, although making up a number based on estimates also seems like a system that could be highly biased in favor of the institution.
Our school doesn’t publish rank, but if a student asks the guidance counselor what their rank was/is for college or scholarship applications, it is provided. I don’t know if it is ever officially supplied to the university by the school, but the students put it on their applications.
I’m not one to get rid of rankings. Without some way of reviewing the performance of the population, things like prestige end up being how people rank schools. That often leads to complacency for the schools that have the prestige, and forces those that aren’t as prestigious to find other ways to get students. Now, maybe the ranking system should change, but without a way to keep score, a certain level of transparency about which schools are doing better is lost. Additionally, as imperfect as USNWR is, any method we’d come up with ourselves to do the ranking would be much more flawed.
@gallentjill @Dawala282 Wouldn’t the same be true of GPA (as for top 10%)? A student earning a 3.8 at an underfunded, inner-city school is not the same as a student earning a 3.8 at a highly competitive college-prep private school. All of this is comparing apples to oranges. I hate the rankings generally, but if they are going to persist, I’m not sure where the line can be drawn (which is frankly why they are problematic in the first place).
@suzyq7 - US News is not very transparent. What they do reveal is that high school rank counts as 3.25% of the total ranking and that the output of the formulas is translated to a 100-point scale - hence the 3.25 points mentioned in some earlier posts.
The straight class rank number is used, which can be deceptive. I assume this is why Lee was phasing out its use in the Tufts literature. Colleges that enroll students from weak high schools will do better than those that enroll students from strong high schools. What is interesting is that it tends to be the more competitive high schools that don’t report class rank (for fear of it hurting their students’ admissions chances at colleges trying to optimize rankings), so selective colleges may have lower percentages of applicants that report class rank.
For context, Tufts reported 90-91% in the top 10% in the years leading up to the year that the metric was dropped which is reasonably high. Most recently, Wake Forest reported 77-78%, Georgetown reported 89-91% and Harvard reported 95%. It is hard to attach meaning to the statistic though, because there is no way to know the competitiveness of the respective high schools that the applicants attended.
What US News does not reveal is how the score for each category is normalized before the categories are combined (i.e., how does a 100-point difference in SAT score compare to a 1-point difference in class rank?). Standard statistical practices are not always used. Back in the ’90s there was a big uproar when it was discovered that they used a logarithmic multiplier for student spending in order to keep HYP at the top of the rankings.
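Just to illustrate what a normalization step could look like (purely hypothetical; US News does not disclose its method), standard statistical practice would be to convert each category to z-scores before applying the weights; that is how a 100-point SAT gap and a 1-point class-rank gap can end up on a comparable scale. A minimal sketch with made-up numbers:

```python
# Hypothetical normalization sketch: convert each category to z-scores across
# all ranked schools before applying the category weights. This is NOT US
# News's published method; it only shows how unlike units become comparable.
import statistics

def z_scores(values):
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [round((v - mean) / stdev, 2) for v in values]

# Toy data for five made-up schools.
sat_medians  = [1510, 1480, 1450, 1400, 1350]   # a 160-point spread
top10_shares = [0.95, 0.91, 0.90, 0.78, 0.60]   # class-rank (top 10%) spread

print(z_scores(sat_medians))    # SAT gaps expressed in standard deviations
print(z_scores(top10_shares))   # class-rank gaps on the same unitless scale
```

Under a scheme like this, what matters is how far a school sits from the pack in each category, so the damage from a missing value depends entirely on where US News places the imputed number in that distribution.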
@aegis400 It should also be noted that US News does not allow a school to drop out of the rankings. If data is not supplied, then they make up a value (that tends to be punitive) and rank the school based on the made up value (this is what happened to Reed when they stopped supplying data).
Technically, since we don’t know what data value US News supplied on behalf of Tufts, and we don’t know the procedure they use to normalize it, we cannot know for sure what the exact impact on the rankings is.
@Mastadon Tufts scored 75 on the 2017 edition of the U.S. News with the top 10% metric reported, and 72 on the 2018 edition without it. Therefore, it seems clear that Tufts not reporting the top 10% metric (3.25 points) corresponds precisely to the slip in the overall ranking & selectivity ranking. Given that U.S. News rounds the raw score, if Tufts had a raw score of 72.3+ and the 3.25 points were added back, it would have been ranked at #25 for the 2018 edition.
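For what it’s worth, here is the rounding arithmetic behind that last sentence, spelled out (the 72.3 raw score and the score-to-rank mapping are assumptions from this post, not published US News figures):

```python
# Sketch of the rounding argument: if the published 72 was a rounded raw score
# of at least 72.3, adding back the full 3.25 class-rank points would round to
# 76, above the 75 from the 2017 edition. (The 72.3 figure and the mapping of
# scores to ranks are assumptions, not US News data.)
assumed_raw_2018 = 72.3
restored = assumed_raw_2018 + 3.25
print(restored)         # roughly 75.55
print(round(restored))  # 76
```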
Also, it’s very unfortunate that a flawed source such as the U.S. News dominates today’s college admissions landscape.