They each have their own flaws as well (e.g., based on data from students receiving financial aid only, or skewed toward STEM- and medical-focused universities whose graduates have higher earnings).
@TomSrOfBoston I am more inclined to rely on expert opinions. Here is one example of how the data can be misleading or plain wrong: how can Colby have a 94% six-year graduation rate when its first-year retention is 93%? This has a pretty significant impact on the overall ranking, and it is obviously an error.
@suzyQ7 all of the UCs are top research universities, with the exceptions of Santa Cruz, which is unique and culturally more of a Reed than a UCLA, and Riverside. UCSD is one of the best schools in the country, and it's frequently near the top of grad school rankings (which are much more faculty-focused). If anything, the UCs are pretty underrated here.
For some historical perspective: US News' Best Colleges became an annual feature in 1988, which was also the first time the list was expanded to a top 25. It looked as follows:
National Universities
(1) Stanford
(2) Harvard
(3) Yale
(4) Princeton
(5) UC Berkeley
(6) Dartmouth
(7) Duke
(8) U Chicago
(8) U Michigan
(10) Brown
(11) Cornell
(11) MIT
(11) UNC Chapel Hill
(14) Rice
(15) UVA
(16) Johns Hopkins
(17) Northwestern
(18) Columbia
(19) U Penn
(20) U Illinois
(21) Caltech
(22) William and Mary
(23) U Wisconsin
(23) Washington U St Louis
(25) Emory
(25) U Texas
National Liberal Arts Colleges
(1) Williams
(2) Swarthmore
(3) Carleton
(4) Amherst
(5) Oberlin
(6) Pomona
(6) Wesleyan
(8) Wellesley
(9) Haverford
(10) Grinnell
(11) Bryn Mawr
(12) Bowdoin
(12) Reed
(14) Smith
(15) Davidson
(16) Earlham
(17) Middlebury
(17) Mount Holyoke
(19) St John’s (MD)
(20) Colorado Coll
(20) St Olaf
(22) Centre
(23) Claremont McKenna
(24) Vassar
(25) Hamilton
(25) Washington and Lee
@phoenix1616 That Boston Magazine article has been cited here dozens of times. It is actually a very positive article about Northeastern despite the clickbait title.
@merc81 It is definitely an error. There's no way the six-year graduation rate is that high. The four-year rate is wrong too; it conflicts with Forbes' 2017 data and is well out of line with recent historical data. Interestingly, the Common Data Set is not available past 2014.
Exactly. It is absurd to me that universities can easily just hide a portion of their student body so their performance does not count. Not only is it gaming the system, but it sends the wrong message to their students. If you weren’t a first-time fall student, we don’t care about you or your progress. The other students are “real” students who matter and represent the university, you are not. It seems common sense that everyone should be counted.
Waitlisting students also games the system. I’m wondering how waitlisting compares to spring admission as a gaming strategy – especially because Cal does both yet should be ranked higher (in my opinion, of course).
@Sapper119 : The six-year rate is for students entering from fall 2006 through fall 2009. The freshman retention rate is for those entering from 2011-2014. However, I’m not saying an error has not been made in the example you cited.
I should point out that there are reasons other than "hiding" students for enrolling "at-risk" students in the summer, especially at public universities.
Usually it's part of a retention and transition program for students who have been identified as "at risk," such as URMs and low-SES students. The students enroll in the summer and go through a program that helps them transition to college.
Sometimes you enroll folks in the summer or spring due to the limited resources (housing, faculty, or facilities) available in the fall. Universities tend to be limited in the number of freshmen they can enroll and support, and that seems to be a key limiting factor in overall enrollment.
Is this being done to game the system? In some cases, I’m sure it is. However, it’s not always driven by ratings.
@insanedreamer Yes, The Economist, Money, and Forbes have all done rankings based on outcomes, value, etc.
How does one reconcile when the input rank for a school may be in the top 40 or 50, but their outcome rank is north of 200, 300 or 700 in the other three rankings?
Wouldn't it be cool if there were a tool that had all the rankings on one website, where you could manipulate the raw data by assigning weights to the various categories and build a ranking that reflected what was important for your student?
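For what it's worth, the core of that hypothetical tool is just a weighted sum. Here's a toy sketch in Python; the school names, metric scores, and weights below are all made up for illustration, not real rankings data.

```python
# Toy "build your own ranking" sketch. All schools, scores (0-100),
# and weights are invented for illustration only.

def rank_schools(schools, weights):
    """Score each school as a weighted average of its metrics,
    then sort best-first."""
    total = sum(weights.values())

    def score(name):
        return sum(schools[name][m] * w for m, w in weights.items()) / total

    return sorted(schools, key=score, reverse=True)

schools = {
    "School A": {"outcomes": 95, "selectivity": 60, "value": 70},
    "School B": {"outcomes": 70, "selectivity": 90, "value": 65},
    "School C": {"outcomes": 80, "selectivity": 75, "value": 90},
}

# A family that cares mostly about outcomes and value:
my_weights = {"outcomes": 0.5, "selectivity": 0.1, "value": 0.4}
print(rank_schools(schools, my_weights))  # ['School C', 'School A', 'School B']
```

Change the weights and the order changes, which is really the whole point: there is no single "correct" ranking, only a ranking relative to what you decide matters.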
@doschicos H’s Z-list offers are very small, only about 25-50 IIRC; to compare it to NUin (~15% of the incoming class?) is incorrect. As you say, lots of schools do this, but Harvard is not one of them, and very few do it to the extent that NEU does.
@RightCoaster "@Much2learn I do actually think Northeastern is a better school than all the schools you compared it to. Very good education, the benefits of co-op learning, and excellent location. Tons of job prospects coming out of school.
I never went there, and don’t have a kid that attends. I just think it is a very good value for what you get…, a graduate with an actual job."
Northeastern is an excellent school, but I don’t agree that Northeastern is a better school than the others as a broad statement. It depends on a lot of details.
We just finished D2's college search. She decided to apply to, and was admitted to, several of these schools: Lehigh, Wisconsin, Northeastern, Case Western, and Purdue. They are all high quality and attract many students in the 29-34 ACT range. Each of these schools will have no difficulty explaining why it is the most exceptional school in the group, and for a certain student, any one of them could be the best choice. I encouraged her to forget the rankings, focus on sorting out the strengths and weaknesses of each school, and explain which one was best for her.