Lots of interesting and insightful comments.
Engineer80, graduation rate and selectivity almost certainly do not “cause” higher quality but instead are part of a causal network of relationships in which the factors all interact indirectly.
There are mutual causal effects among the following:
student ability, preparation, and motivation
realization of student potential
quality of academic instruction
scholarly culture created by immersion with fellow students
level of academic expectations
role models and mentoring provided by faculty
confidence and self-esteem enhancement instilled by the academic reputation of the college
overall level of intellectual sophistication at the college
professional achievements and maturity of the faculty
amount of individual attention given to students
quality of curriculum design and delivery
Perhaps others can add to this list.
For example, over time, schools with better faculty attract better students and, conversely, better student bodies attract better faculty. There is a sort of academic ecosystem which can evolve over time to create stronger and stronger academic quality (however you want to measure it).
I’ve heard that statement “correlation does not imply causation” many times. It is incorrect and has led to widespread misconceptions. First, statistics by themselves have nothing to do with causation; it is the research design that determines causation. A correlation CAN prove causation if it is used in a true experimental design. On the other hand, there is no statistic on earth that can prove causation in a non-experimental research design. Sometimes even college statistics faculty conflate the correlation statistic itself with the quasi-experimental and non-experimental research designs it often appears in. Furthermore, even correlation statistics in non-experimental research designs can “imply” (i.e., suggest) causation. They just can’t “prove” causation.
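To make the design point concrete, here’s a toy simulation (all variable names and numbers are invented, not from any real dataset): the same correlation statistic is computed twice, once in an observational setup where a hidden confounder drives both variables, and once where the “treatment” is randomly assigned, as in a true experiment.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Observational design: a hidden confounder Z (say, motivation) drives both
# X (e.g., tutoring hours) and Y (e.g., GPA). X has NO causal effect on Y here.
z = rng.normal(size=n)
x_obs = z + rng.normal(scale=0.5, size=n)
y_obs = z + rng.normal(scale=0.5, size=n)
print(f"observational r(X, Y): {np.corrcoef(x_obs, y_obs)[0, 1]:+.2f}")  # ~ +0.80

# True experimental design: X is assigned at random, severing its link to Z.
x_rand = rng.normal(size=n)
y_rand = z + rng.normal(scale=0.5, size=n)  # Y still depends only on Z
print(f"experimental  r(X, Y): {np.corrcoef(x_rand, y_rand)[0, 1]:+.2f}")  # ~ 0.00
```

Same statistic both times; only the design changed. Under randomization, a nonzero correlation would reflect a real effect of X, which is the sense in which correlation can support causation in a true experiment.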
78% of Northeastern students do at least two 6-month co-ops; a minority do more than two.
Among the 78% who do 2 co-ops, time to graduation is expected to increase from 4 years to 5 years. Most do graduate in 5 years, as can be seen from the stats in the initial post with 80%+ 6-year graduation rates. However, it’s also quite common for students without co-ops to take 5 years to graduate, particularly students in more demanding majors like engineering. The students who took 5 years without co-ops are now taking 6 years with co-ops. Adding that extra year distorts the comparison between schools, depressing the measured graduation rate at schools where the vast majority do co-ops. You are looking at the 6-year graduation rate for most schools, but the effective 5-year graduation rate for Northeastern among the students who spend a year on co-ops. The 5-year and 6-year graduation rates are expected to differ.
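To see the arithmetic, here’s a minimal sketch with a hypothetical time-to-degree distribution (the fractions are invented for illustration, not Northeastern’s actual numbers): shifting every finisher by one co-op year pushes the 6-year finishers out of the 6-year window entirely.

```python
# Hypothetical fractions of a cohort finishing in each number of years
# (invented; the remaining 0.10 never finishes).
base = {4: 0.70, 5: 0.15, 6: 0.05}

def rate_within(dist, years):
    """Fraction of the cohort graduating within the given window."""
    return sum(p for t, p in dist.items() if t <= years)

# A one-year co-op shifts every finisher's time to degree by +1 year.
coop = {t + 1: p for t, p in base.items()}

print(f"6-year rate, no co-ops:  {rate_within(base, 6):.2f}")  # 0.90
print(f"6-year rate, with co-op: {rate_within(coop, 6):.2f}")  # 0.85
print(f"7-year rate, with co-op: {rate_within(coop, 7):.2f}")  # 0.90
```

The with-co-op 7-year rate matches the no-co-op 6-year rate, which is why comparing every school at the same 6-year cutoff tilts the comparison.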
You need to compare the same cohort class. For example, continuing to use median SAT score as a proxy for selectivity, the median SAT scores for different classes at NE are below. Are all 3 entering classes expected to have the same graduation rate?
2012 Freshmen – Median SAT was 1370
2010 Freshmen – Median SAT was 1310
2008 Freshmen – Median SAT was 1270
As expected, the 6-year graduation rate at NE increases over time. The current 6-year graduation rate at NE is 82%, but 6 years ago it was 70%. That’s quite a difference. When you compare the same cohort class, there is a ~5% difference between the 6-year and 8-year graduation rates at NE and GT. ~5% obviously does not account for everyone who did not graduate in 6 years, but it pushes the grad rate notably closer to expectations based on selectivity.
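A quick sketch of the cohort alignment (the entering years and median SATs are from the list above; the pairing rule is just the 6-year reporting window):

```python
# The 6-year rate reported in a given year describes the class that entered
# six years earlier, so rising selectivity shows up in the rate with a lag.
cohorts = {2008: 1270, 2010: 1310, 2012: 1370}  # entering year -> median SAT

for entered, sat in sorted(cohorts.items()):
    print(f"Entered {entered} (median SAT {sat}) -> 6-year rate reported in {entered + 6}")
```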
By the way, the correlation between grad rate and SAT scores is about +.8 (very high). Grad rate is highly correlated with faculty salaries (about +.65). There are low-to-moderate correlations between grad rate and student-faculty ratio, between grad rate and the percentage of faculty who are highly experienced (i.e., full or associate professors), and between grad rate and admit rate. I can’t recall all of the sources, but most of this is in the US Dept. of Education data. I think there used to be a database called something like “Baccalaureate Origins of Doctoral Recipients”. As I recall, schools with higher bachelor’s grad rates produce more students who go on to earn doctorates (at any college).
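For anyone who wants to reproduce that kind of figure, here’s a minimal sketch of the computation (the per-school numbers below are invented placeholders, not the Dept. of Education data the +.8 came from):

```python
import numpy as np

# Invented (median SAT, 6-year grad rate) pairs for a handful of schools.
sat       = np.array([1200, 1270, 1310, 1340, 1370, 1450, 1500])
grad_rate = np.array([0.62, 0.70, 0.78, 0.74, 0.82, 0.88, 0.94])

# Pearson correlation between the two columns.
r = np.corrcoef(sat, grad_rate)[0, 1]
print(f"Pearson r between median SAT and grad rate: {r:+.2f}")
```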
Franklin and Marshall College, another school not generally known for grade inflation, was omitted from the original list. The most recent cohort reported had an 87% 6-year grad rate.
@collegehelp - I have a Ph.D. in engineering physics and electrical engineering, and I am quite familiar with the difference between experimental and non-experimental design. The vast majority of claims (in any field) that try to “prove” some conclusion are not based upon purposely conducted, properly controlled, and statistically valid samples in a true experimental design. In those cases, “correlation does not imply causation” is valid. The entire college “ranking” game is essentially worthless for judging the value of a college. Graduation rates, opinions of the school from other college administrators, the size of a library, the size of endowments, etc., tell very little, seriously. The popular college rankings do an extreme disservice to prospective college students and their parents, IMO.
It would be interesting to have ‘exit interview’ data for students who drop out of a particular university to find out whether they left because of finances, poor fit, etc. My daughter had several friends who started at her small LAC with her who left over the years… one for mental health issues, one for finances, and a couple because they changed their minds on career path and couldn’t study nursing or something like that at the small LAC.
Engineer80, I agree with you that the vast majority of studies claiming to have proven something are unfounded because of their inability to rule out alternative explanations for their observations. However, the statement that “correlation does not imply causation” is misleading and wrong because it implies that other statistics might better imply causation. Hence, the correlation statistic gets a bad rap. The statement “correlation does not imply causation” is a way of saying that events don’t necessarily cause each other directly just because they occur together (which is true). The problem is that all statistics, on their own, are irrelevant to establishing causation. Furthermore, correlation can be used to prove causation if it is used in a true experimental design. Your point that true experimental designs are almost unheard of, especially in the social sciences, does not change the fact that correlations CAN be used to prove causation in a true experimental design. So, that hackneyed saying about correlation and causation is simply not true, and it is misleading.
That said, I believe that rankings based on such things as SAT scores and graduation rates are indicative of the overall quality of the educational experience that students receive at a college. I say this based on personal experience, judgment, logic, reasoning, and intuition as much as on correlations with indicators of quality.
Transfers are not reflected in the rates, and many who transfer out will graduate elsewhere. Reasons for transferring are many and probably more common today than 30 years ago. People expect instant happiness.
In IPEDS, a student can be a first-time, full-time student just once. Students who transfer prior to completing a degree are not counted as a success, at 150% time or otherwise, at any school. The only option would be to reverse-transfer credits to the original institution to receive the degree, but that would likely happen beyond the 150% time. This is especially difficult for community colleges, as many students seek to transfer before earning a credential.
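Here is a rough sketch of that counting rule as I read it (the function and field names are my paraphrase for illustration, not official IPEDS code): a student counts as a completion only at the school where they started first-time, full-time, and only within 150% of normal program time.

```python
# Normal program length in years; 150% time = 6 years for a bachelor's
# degree, 3 years for an associate degree.
NORMAL_TIME = {"bachelor": 4, "associate": 2}

def counts_as_completion(start_school, degree_school, program, years_to_degree):
    """Does this student count as a graduation-rate success for start_school?"""
    if degree_school != start_school:
        return False  # transfer-out: a success for no school in this metric
    return years_to_degree <= 1.5 * NORMAL_TIME[program]

print(counts_as_completion("A", "A", "bachelor", 6))   # True: 6 <= 6
print(counts_as_completion("A", "B", "bachelor", 4))   # False: transferred out
print(counts_as_completion("A", "A", "associate", 4))  # False: 4 > 3
```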
Good question, merc81. Bowdoin had a 94% grad rate. Holy Cross (Massachusetts) had a 92% grad rate. They should have been on the list. I excluded them because they don’t report SAT scores, so I couldn’t do the calculations I wanted. Their high grad rates suggest that their SAT scores are also pretty high.