http://time.com/money/best-colleges/rankings/best-small-colleges/
New ranking out by Money magazine. Based on their methodology, the rankings are quite different from other popular rankings.
They sell more magazines & get more eyeballs if they do that.
Money Magazine’s ratings are focused on what you would expect for a magazine with its name: student debt, graduation rates, post-graduation earnings, and other financial outcomes. They place much less weight than USNWR on measures of incoming freshmen.
Their weighting is as follows (a rough sketch of the scoring math appears after the link below):
1/3 quality of education. Only 10% of this measure (about 3% overall) has to do with student test scores or college yield; Money doesn’t use a peer rating. Another 10% (about 3% overall) is student/faculty ratio. A whopping 60% (about 20% of the overall score) is tied to graduation rates. They don’t use endowment at all, instead including a small “financial troubles” category (15% of the measure, about 5% overall).
1/3 affordability. Average price, debt, loan default, low-income affordability.
1/3 outcomes. Lots of measures of earnings, including major-adjusted earnings ratings and a social mobility index.
Money also adjusts more than USNWR for expected outcomes; schools that perform better than average when it comes to Pell or nontraditional students will score better on the Money ratings.
http://time.com/money/5362601/how-money-ranks-best-colleges-2018/
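To make the weighting above concrete, here is a minimal sketch of how a composite score like this could be computed. This is not Money’s actual formula: the category scores are hypothetical, and the 5% “other” bucket is an assumption I added so the quality sub-weights sum to 100%.

```python
# Rough sketch (not Money's actual formula) of a weighted composite ranking
# score built from the category weights described above. All inputs are
# hypothetical normalized 0-100 scores for a single school.

def composite_score(quality_parts, affordability, outcomes):
    # Quality-of-education sub-weights from the breakdown above:
    # 10% test scores/yield, 10% student/faculty ratio, 60% graduation rates,
    # 15% "financial troubles", and an assumed 5% "other" remainder.
    quality = (0.10 * quality_parts["test_scores_yield"]
               + 0.10 * quality_parts["student_faculty"]
               + 0.60 * quality_parts["graduation_rates"]
               + 0.15 * quality_parts["financial_troubles"]
               + 0.05 * quality_parts["other"])
    # The three top-level categories are weighted equally (1/3 each).
    return (quality + affordability + outcomes) / 3

example = composite_score(
    {"test_scores_yield": 80, "student_faculty": 70,
     "graduation_rates": 90, "financial_troubles": 95, "other": 75},
    affordability=65,
    outcomes=85,
)
print(round(example, 1))  # 79.0 for these made-up inputs
```

The point of the sketch is just that graduation rates dominate the quality category, so two schools with similar affordability and outcomes scores can end up far apart on the list based largely on graduation performance.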
The fact that this ranking varies quite a bit from USNWR illustrates a good point: rankings in general are rather arbitrary, and any one ranking does not necessarily use criteria that matter at all to your child or to mine.
Graduation rates, affordability, and outcomes (like getting a job after graduation) to me seem like pretty sensible criteria.
They’re just numeric opinions from people who can form an opinion. Anyone can do that. The only difference is that they have the money to publish it in the media.
I think it’s a reasonable list. Mass Maritime being so high doesn’t fit the overall mold; probably a result of graduates getting high-paying jobs at sea.
The rest is good. Some of the CC posters won’t be too pleased, but it’s just another list. No better, no worse.
As with any ranking, one must focus on the underlying criteria used.
Quality of education, affordability and outcomes. Those are the top qualities we looked for in colleges for our kid. I like this list.
Williams College is high on the list, once again. For some reason, ranking organizations seem to like it; it does well on so many lists.
Too much weight placed on graduation rates, I think; otherwise reasonable.
Williams does a lot of things right. They have high admit stats, tons of money, provide excellent FA, and admit a lot of minority and first-gen students.
It’s also interesting to see the schools that land a lot higher or lower on this list than in the USNWR rankings. Bentley, College of St. Benedict, SUNY Envt. Sci and Forestry make the top 50 list. Middlebury (USNWR #6) and Carleton (USNWR #8) do not. Schools like Holy Cross, Bates and Union place much higher on the Money ranking than USNWR.
@Sue22 - Yes, I love seeing Holy Cross come in at #7. They do a lot of things right for their students.
SUNY ESF is a gem. For a student interested in environmental science, it should be a must-look school.
Well, the top LAC, Washington and Lee, came in 24th by overall rank, with quite a few others in the top 100. That’s . . . OK. Nonetheless, some liberal arts colleges did overperform nicely when compared to other rankings, notably Holy Cross, Union and St John’s (MN).
Based on the linked page, half of the graduation rate input is raw 6-year graduation rate (which is mostly a correlate of admission selectivity and other student characteristics), while the other half is what they call value-added graduation rate, or how the graduation rate compares relative to that expected from student characteristics (similar to what USNWR calls graduation rate performance, which USNWR weights much less than raw graduation and retention rates).
It looks like they use payscale.com for most of the earnings inputs, but they do include outcomes adjusted for the mix of majors and student characteristics as well as raw earnings. They also use College Scorecard for some of the earnings inputs.
Greater weight to adjusted graduation rates and earnings outcomes makes sense if the goal is to find if the college may have a treatment effect here, rather than just a selection (of students and their majors) effect. However, using raw earnings outcomes still gives some boost to engineering-heavy schools, and using raw graduation rates and peer quality still gives some boost to more selective schools.
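For anyone curious what the “value-added graduation rate” described above looks like mechanically, here is a minimal sketch: predict each school’s grad rate from student characteristics, then score the school on how far its actual rate exceeds the prediction. The data and the simple least-squares model are made up for illustration; Money’s actual model and predictors are not published here.

```python
# Sketch of a value-added graduation rate: actual grad rate minus the rate
# expected from student characteristics. Data and model are hypothetical.

import numpy as np

# Hypothetical inputs per school: [median SAT / 10, % Pell recipients],
# plus each school's actual 6-year graduation rate.
X = np.array([[140.0, 15.0], [120.0, 40.0], [130.0, 30.0], [110.0, 55.0]])
actual = np.array([0.93, 0.74, 0.85, 0.62])

# Fit a plain least-squares line: expected grad rate given student characteristics.
A = np.column_stack([X, np.ones(len(X))])          # add intercept column
coef, *_ = np.linalg.lstsq(A, actual, rcond=None)
expected = A @ coef

# Value-added = actual minus expected; positive means the school graduates
# more students than its incoming class would predict.
value_added = actual - expected
for i, va in enumerate(value_added):
    print(f"school {i}: actual {actual[i]:.2f}, "
          f"expected {expected[i]:.2f}, value-added {va:+.3f}")
```

A school with a modest raw graduation rate can still post a positive value-added number if it enrolls many Pell or nontraditional students and graduates more of them than predicted, which is exactly why it can land higher on Money’s list than on USNWR’s.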
Simmons, with a 74% six-year grad rate, still makes this list. If grad rate is 60% of the quality measure, it’s hard to believe that a school with a 74% grad rate makes this ranking.
This list is pretty bad. I mean, rankings are all flawed, but they’re not all equally flawed. Publications be gettin desperate.
Yeah, even as much as I get that all rankings are flawed, that’s a pretty weird list.
Outcomes by college is a horrible metric and can lead to false conclusions. Outcomes by major at a college is a realistic metric.