US NEWS Ranking, A few surprises

^ Yes, as I stated above, “I do think family income, selectivity, rigor, and prestige all influence graduation rates.” Multiple factors must be influencing the graduation rates. St. Olaf isn’t more selective than Michigan-AA. I don’t think Beloit is more selective than UT-Austin. UChicago is not clearly more selective than Stanford, which seems to have a much higher median family income ($167,500) and also a much lower 4 year graduation rate (75%). Granted, we’d need to look at patterns across a lot more data than these few ad hoc examples.

FWIW, here is the above table with ACT scores added:
Median Income … 4Y GR … ACT … School

$30,000 … 49% … 24 … Berea College (LAC)
$58,000 … 25% … 25 … University of Houston
$58,700 … 69% … 22 … Spelman College (LAC)
$75,300 … 32% … 24 … Liberty University
$106,600 … 59% … 25 … Drew University (LAC)
$115,400 … 52% … 26 … Michigan State
$110,900 … 75% … 27 … Beloit College (LAC)
$123,900 … 58% … 29 … UT Austin
$140,400 … 85% … 29 … St. Olaf College (LAC)
$154,000 … 76% … 31 … Michigan AA
$172,400 … 89% … 31 … Carleton College (LAC)
$180,700 … 73% … 30 … Tulane
$134,500 … 88% … 34 … UChicago

In 5 out of 6 pairs, family income and test scores together fail to predict the higher graduation rate.

ACT score source: prepscholar
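The "5 out of 6" claim can be checked with a quick script. The numbers are copied straight from the table above, and the pairings are the adjacent LAC/university rows (UChicago left out as the odd row):

```python
# Each pair: (school, median income, ACT, 4y grad rate); the second
# school in each pair has the higher median income.
pairs = [
    (("Berea",     30000, 24, 49), ("Houston",         58000, 25, 25)),
    (("Spelman",   58700, 22, 69), ("Liberty",         75300, 24, 32)),
    (("Drew",     106600, 25, 59), ("Michigan State", 115400, 26, 52)),
    (("Beloit",   110900, 27, 75), ("UT Austin",      123900, 29, 58)),
    (("St. Olaf", 140400, 29, 85), ("Michigan AA",    154000, 31, 76)),
    (("Carleton", 172400, 31, 89), ("Tulane",         180700, 30, 73)),
]

failures = 0
for a, b in pairs:
    # Prediction "fails" when the school with higher income AND
    # higher-or-equal ACT nonetheless has the lower 4y grad rate.
    if b[2] >= a[2] and b[3] < a[3]:
        failures += 1

print(failures)  # prints 5
```

In the Carleton/Tulane pair the two predictors disagree (Tulane has the higher income, Carleton the higher ACT), which is why that pair doesn't count as a clean failure.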

USNews stopped publishing its print magazine in 2010. It now sells access to more detailed college data (Compass), an annual Best Colleges guidebook, and a few other books.

https://www.usnews.com/products/features/education-products-best-colleges

Yes, ad hoc examples can easily be cherry picked.

For example, Eastern Connecticut State (LAC) has a median family income of $102,000 and 22 ACT, but its 41% 4 year graduation rate compares unfavorably to that of Spelman.

I admit I haven’t read everything closely since I last posted as I’m on deadline for a work project, but hopefully this is informative to the conversation – or at least eye-opening.

I want to elaborate on why income in aggregate is a huge predictor of academic success – both in high school and college. Saying in aggregate is key – of course there are many examples of low-income students with great levels of academic success. I’m also using the term academic success here very carefully. Specifically, I’m not saying family income is correlated with intelligence.

Poverty is an endemic condition that leads IN AGGREGATE to many poorer outcomes for young people, with education and health at the top of the list. For education, the poor outcomes derive from access to lower quality early childhood and K12 education, lower non-cognitive skills development, food insecurity, and other adverse childhood experiences (policy speak for stress) that literally wire a kid’s brain differently, and so on.

When a college serves more students from low-income families, those students are less likely to graduate for many reasons – they are less prepared, they have to work 30 to 40 hours a week, they cannot afford textbooks, they cannot pay for a car repair to get to campus, they are hungry, and the list goes on and on. Any of these occurrences can lead a student to drop out.

So, going back to my initial post, counting overall graduation rates means colleges that serve low-income students will perform poorly even if their ‘value add’ could be greater than other colleges that serve more affluent students.

Not saying these lower ranked colleges always have greater ‘value add’ – just saying if they do it’s not being well captured, though with the jumps by Georgia State and UMBC I think there is a little movement in the right direction. But it does feel like USNWR will only tweak it so far – making sure the elites always stay on top! :slight_smile:

Oh, and in terms of those comparisons between income and grad rates, they seem rather random and chosen to present the story intended; I imagine you could easily pick other pairings and tell a different story.

But regardless, only one of those colleges – Berea – serves students from the bottom half of the US household income distribution (Spelman is right at the average), so none of them serve the students I was focusing on in my post above. Once you get above a certain income level, the relationship between graduation rate and family income certainly matters less – so I agree with you in that regard.

Also, in these data it’s going to be the median income that matters more than average…but now I’m getting too wonky!
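To illustrate the median-vs-average point with made-up numbers: a single high-income family pulls the average up sharply while barely moving the median.

```python
import statistics

# Hypothetical incomes for a small, skewed sample: one very high earner
incomes = [30_000, 35_000, 40_000, 45_000, 500_000]

print(statistics.mean(incomes))    # 130000 -- pulled way up by the outlier
print(statistics.median(incomes))  # 40000 -- closer to the typical family
```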

So how should it be captured?
If low graduation rates at some colleges are due largely to high concentrations of low income students (or some other factor unrelated to college quality), what evidence are we missing that some of these schools have academic programs as strong as the ones at much higher ranked colleges? I think the burden of proof is to show not only that USNWR (or other rankings) could be missing colleges with high added value, but that they actually are missing those schools. What is the good, missed evidence of undergraduate academic strength that does not telegraph income or admission selectivity? Can somebody make a case that, say, 5 or 6 colleges with sub-50% graduation rates are worth choosing (for a random liberal arts major) over schools with 80%+ rates, at equal net cost?

As I stated, we’d certainly need to look at many more examples. My hypothesis is that the relevant factors in predicting high graduation rates include qualities of the academic and social environment (e.g. mentoring, student faculty engagement, or school size). LACs seem to have a good reputation for strong undergraduate focus and student support. The pairs I picked focused on LACs that are not (other than Mudd) especially selective or prestigious. I didn’t have to work too hard to find examples of small undergraduate-focused schools that have higher graduation rates than schools with higher median income and scores.

The income figures I cited represent median family income, according to the NYT source.

@OHMomof2 The point is the same… USN&WR is in the business of selling content. Unfortunately, people consume that content, so they produce more of it. It in no way makes them “experts!” I could publish a site of “best party schools,” does that make me an expert in this subject? It’s unlikely that I have the physical ability to “party” at every school on my list (especially given my age). It’s also unlikely that I have the resources to pay a large enough staff of people to “party” at every school on my list. Plus, does everyone on my staff exactly concur as to what makes one school “better” than another in this metric? Sure, I would have no problem monetizing this content, because people are easily attracted to click-bait. Same idea. Different emphasis. See: “The Princeton Review’s List of the Best Party Schools” (formerly in printed book form). Experts? No. Junk science aggregators? Yes.

“UChicago is not clearly more selective than Stanford, which seems to have a much higher median family income ($167,500) and also a much lower 4 year graduation rate (75%).”

Once again, for the umpteenth time: universities that have a larger percentage of students enrolled in Engineering will also have a lower 4 year graduation rate. Chicago doesn’t offer Engineering. Stanford and Michigan do. In fact, Michigan has almost 1/4 of its undergraduates in Engineering. Engineering often requires at least an extra semester beyond four years to graduate. That is why 6 year graduation rates are a much better indicator of academic achievement.

@tk21769 – Ah, I just had a small epiphany. I think we’re looking at this through two different lenses and therefore ‘talking’ past each other. You are interested in overall quality while I’m interested in outcomes. And, granted, USNWR is focused on purported quality as well.

Given my lens of looking at which colleges are producing the best outcomes (by which I mean performing at or above their weight along defined measures), if I were queen for the day and designing my own ratings, I’d only count subgroup outcomes. I’d report on outcomes such as graduation, entrance to grad school, wages 5 years after completion, and maybe even throw in some civic engagement measures like voting and home ownership. But again – only making comparisons by subgroups (You’d also have to figure out how to control for size – if a college is only serving 10 students in a group versus 1000 that would make comparisons problematic).

This would provide insight into which colleges are producing the best outcomes for students while controlling for differences in whom they serve. But does that mean they have the highest overall quality, as in “my student is getting access to the very best education possible”? Probably not. But it would help students make college choices by giving them valuable outcomes information: “If I can only afford an in-state public, which one is serving students like me the best?”
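A toy sketch of the “compare only within subgroups” idea, using entirely hypothetical colleges and numbers, including a crude size floor so tiny subgroups don’t distort comparisons:

```python
from collections import defaultdict

# Hypothetical per-college outcomes, broken out by income bracket
records = [
    # (college, income_bracket, enrolled, graduated)
    ("College A", "low",  400, 220),
    ("College A", "high", 600, 540),
    ("College B", "low",  100, 70),   # too few students to compare fairly
    ("College B", "high", 900, 810),
]

MIN_SUBGROUP = 200  # arbitrary size floor for this sketch

by_bracket = defaultdict(dict)
for college, bracket, n, grads in records:
    if n >= MIN_SUBGROUP:
        by_bracket[bracket][college] = grads / n

print(dict(by_bracket))
```

College B’s low-income subgroup (100 students) gets dropped by the size floor, which is exactly the 10-vs-1000 comparison problem mentioned above.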

The best work I’ve seen along these lines is by Raj Chetty at Harvard, who has built a social mobility index of colleges, lifting up schools such as the CUNYs and SUNY Stony Brook as strong colleges that propel low-income students into at least the middle class. Here’s a link for anyone interested:

https://www.nytimes.com/interactive/projects/college-mobility/

^ Fair enough.

As I see it, the primary mission of colleges and universities is to discover and share knowledge.
That is the major “good” they do, in my opinion. I don’t care too much about outcomes like earnings, home ownership, or voting as indicators of college academic quality. However, I understand there is a long tradition of expecting colleges to encourage service and social mobility. If that is the perspective you want, then you’ll probably prefer to look at different indicators than I do.

@rjkofnovi I get your point about 6 year graduation rate and engineering, but 6 years is too damn long for most majors and is really too long even for engineering students.

I’m not convinced the relationship is so clear-cut.

4y Grad Rate … % Engineering Majors … College
89% … 26.6% … Princeton

88% … 16.2%? … Columbia*
88% … 21.4%? … JHU*
86% … 38.5% … Harvey Mudd
85% … 36% … MIT
85% … 16.9% … Cornell
83% … 19% … Rice
79% … 36% … Caltech
79% … 14% … Union College
77% … 17.1% … Michigan
77% … 10% … University of Southern California
75% … 20% … Stanford

So, at least a few colleges have many engineering majors yet also manage to get high 4 year graduation rates. Moreover, some colleges that don’t offer engineering (but also have high 4y graduation rates) may be at least as rigorous (on average, across majors) as some colleges that do offer engineering (but don’t have very high 4y graduation rates.) However, there may be cases where the percentage of engineering majors … perhaps combined with other factors related to graduation requirements, student support services, etc. … do drive down the 4 year rates.

39% … 65% … Georgia Tech ($130K median family income, 32 average ACT)

Graduation rate source: USNWR
Eng. majors source: CDS files, section J (* except for Columbia & JHU, which are derived from IPEDS data)
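A quick Pearson correlation over just the rows above (Georgia Tech excluded as an outlier, and hand-rolled so it needs nothing beyond plain Python) suggests there is no strong linear relationship between engineering share and 4 year graduation rate for these particular schools:

```python
# (4y grad rate, % engineering majors) for the twelve schools listed above
data = [
    (89, 26.6), (88, 16.2), (88, 21.4), (86, 38.5), (85, 36.0), (85, 16.9),
    (83, 19.0), (79, 36.0), (79, 14.0), (77, 17.1), (77, 10.0), (75, 20.0),
]

n = len(data)
mean_gr = sum(gr for gr, _ in data) / n
mean_eng = sum(eng for _, eng in data) / n

cov = sum((gr - mean_gr) * (eng - mean_eng) for gr, eng in data)
var_gr = sum((gr - mean_gr) ** 2 for gr, _ in data)
var_eng = sum((eng - mean_eng) ** 2 for _, eng in data)

r = cov / (var_gr * var_eng) ** 0.5
print(round(r, 2))  # nowhere near the strong negative value the claim implies
```

Of course, these are cherry-picked selective schools, so this says nothing about the relationship across all colleges.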

There is a substantial difference between an engineering program at Princeton/Columbia and one at a large public research university. Princeton’s program is designed as a 4 year program, with students usually taking only summer internships. Michigan’s program, while also nominally designed for “4.0” years, is much harder to complete on that schedule (students dropping classes, a lack of critical classes, class time conflicts, etc.). There is an advantage to attending an elite (well funded) private program. Students at public universities are also much more likely to take co-ops or even spring/fall internships.

With that being said, USNews uses 6 year graduation rates, and co-ops have little impact on those rates. This can be seen by comparing 6 year rates to 8 year rates: 8 year rates are usually only 1% or 2% higher. (College Navigator is useful for quickly comparing 6 year rates to 8 year rates.)

In other words, engineering has little impact on 6 year rates. It “could” impact 4 year rates, especially at public universities (and some privates).

At Georgia Tech, for example, the average time to complete a BS is 4.5 years (before considering co-ops). Tech may say its program is designed for “4” years, but most students have to take longer (the struggle is real).

http://profiles.asee.org/profiles/7821/screen/23?school_name=Georgia+Institute+of+Technology

GT’s 6 year graduation rate is 85%, while its 8 year rate is 87%.

https://nces.ed.gov/collegenavigator/?q=Georgia+tech&s=all&id=139755#retgrad

A few more things on the UM-Princeton comparison. Computer science is part of Princeton’s engineering school, but it’s a separate department at UM, accounting for 8.6% of majors, so to be fair you’d have to add that 8.6% to UM’s 17.7%, giving UM 26.3% – pretty much the same as Princeton.

Next, you have to dig into what Princeton considers engineering:

Operations research
Entrepreneurship
Teacher preparation
Science and Technology
Sustainable energy

Most of these are actually certificates, not degrees, to be fair to Princeton. Perusing UM’s engineering majors, most, if not all were actually, you know, engineering :slight_smile:

“@rjkofnovi I get your point about 6 year graduation rate and engineering, but 6 years is too damn long for most majors and is really too long even for engineering students.”

I don’t disagree with your statement Izzo, it’s just that USNWR doesn’t offer 5 year graduation rate stats.

N-year graduation rates are misleading when it comes to colleges where students take terms off school for co-ops or other reasons. 8-semester or 12-quarter graduation rates may be more useful in this case (and perhaps more relevant, in that semesters or quarters in school are those where tuition bills need to be paid).

UCLA notes that 86% of those who entered as frosh and graduated in 2016-2017 registered for 12 or fewer quarters ( https://www.apb.ucla.edu/campus-statistics/graduation-ttd ). For comparison, the recent four year graduation rates at UCLA have been around 71%, suggesting that there is a significant population of students who graduate “late” but without needing more quarters of school than the nominal number (12).

As of the fall of 2017, the latest info I could find, Michigan had 6,442 students enrolled in Engineering out of a total enrollment of 29,821 undergraduates on the Ann Arbor campus. That represented a figure of 21.6% of the student body enrolled in Engineering.

I had overlooked the fact that USN uses 6y rates.
I used the 4 year rate because that is what USNWR (and also Kiplinger) displays.
Example:
https://www.usnews.com/best-colleges/princeton-university-2627
(The USNWR 6 year rates may be available behind a paywall or somewhere else I missed.)
We may want to consider whether the 6y rates are likely to be less susceptible, or more so, to some of the issues raised here (e.g. telegraphing admission selectivity, family income, or program differences).

The 17.1% figure I used for UMichigan’s engineering majors (post #611) comes from the 2017-18 Common Data Set. AFAIK, that’s the most authoritative source we have for that number.

https://nces.ed.gov/collegenavigator/?s=all&id=170976#programs has more detailed information, which matches the 17.1% (1,209 out of 7,059).

There is more engineering-specific information on the ASEE web site: http://profiles.asee.org/profiles/7862/screen/23

The majors at Princeton are known as Concentrations. Engineering Concentrations awarding a B.S.E. are from six departments (https://engineering.princeton.edu/undergraduate-studies/concentrations):
Civil and Environmental Engineering
Chemical & Biological Engineering
Computer Science
Electrical Engineering
Mechanical and Aerospace Engineering
Operations Research and Financial Engineering
With the possible exception of Financial Engineering, these are pretty standard.

The “Certificates” you reference are equivalent to minors (https://admission.princeton.edu/academics/certificate-programs).