Retention and graduation rates are largely byproducts of admission selectivity and student financial resources. Academically stronger students are less likely to flunk out, and those from wealthy families (as is true of about half the students at the USNWR top private universities and LACs) are less likely to drop out from running out of money.
A given student’s actual likelihood of graduation depends more on the student’s own characteristics (i.e., their academic strength and whether they have enough money, including scholarships and financial aid grants, to pay for the college) than on the college’s overall graduation numbers.
Agree, but a significant part of a 4 year college or university educational experience in the US is in making connections with one’s peers & progressing together through school & during one’s early career years.
US News helps prospective college students identify where other strong, academically motivated students attend school through retention rates & graduation rates.
As I wrote above, US News ratings & rankings can be nit-picked, criticized, & challenged from many angles, but there is no better comprehensive rating & ranking system of US four year degree granting institutions of higher education. For more tailored preferences, there are a number of alternative resources for one to use.
Lockstep progression of one’s entering class through graduation is probably the experience of only a minority of college students. Most colleges do not have undergraduate cohorts with high minimum academic strength and low financial stress that would be the prerequisite for keeping entering classes together through (on-time) graduation.
Progressing together is quite common at LACs & at National Universities. But, of course, much depends upon one’s major, as many engineering students, for example, take 5 years rather than 4 years to graduate. Still, progressing together with one’s freshman class is very common at national LACs.
I’ve suggested a better method many times during this thread. Look at how colleges do on the individual subcomponents that are important to you, rather than how colleges do in an arbitrary weighting of various numbers that does not correspond to your values.
It is not necessary or practical to create a formula that outputs the “best” college overall, for everyone. There is no way to verify that such a formula is accurate aside from seeing that it outputs whatever particular colleges you previously believed were “best.” I don’t think USNWR is better than various other college ranking systems unless by “better” you mean better correlated with a particular set of colleges appearing at the top.
If various criteria are important to you, then you can and should look up each of them directly. Looking instead at the output of a ranking formula that gives small weights to a portion of the criteria that matter to you, combined with weights on other criteria that do not matter to you but are more correlated with HYP… appearing at the top, is not likely to produce a result that fits your personal values.
It’s also important to understand why a particular college does better/worse on those criteria, rather than just looking at the output of someone else’s arbitrary formula that includes a portion of the factors that interest you. For example, 6-year graduation rate was mentioned. If one looks up 6-year graduation rates on IPEDS, the highest-ranked colleges are as follows:
Highest 6 Year Graduation Rate
1. Princeton – 98%
2. Harvard / Yale – 97%
4. Brown / Olin / ND / Penn – 96%
8. Amherst / Columbia / Cornell / Dartmouth / Duke / Emory / Georgetown / MIT / Chicago / UVA / WUSTL / Williams – 95%
20. BC / Caltech / Northwestern / Stanford / Swarthmore / Tufts / W&L – 94%
…
58. Georgia Tech – 90%
There is an obvious correlation with selectivity and with being a private college. A college has a high graduation rate if it is selective enough to admit students who are academically capable of graduating, and if it admits students who are unlikely to need to leave college for financial or family reasons. If a particular student is deciding between colleges, looking at this graduation rate doesn’t tell him/her at which college he/she is more likely to graduate. For example, if an exceptional student from a wealthy family chooses to take a full merit scholarship to Alabama, that doesn’t mean his chance of graduating suddenly drops to 68%, like the average Alabama student’s. Instead, he will probably have approximately the same chance of graduating as he would have had at Princeton, with its 98% grad rate.
When a college does have a higher or lower graduation rate than expected based on selectivity, it is good to try to understand why, rather than just plugging it into a ranking formula. For example, I previously mentioned that Stanford has a relatively lower graduation rate (94% vs 97-98% at HYP) because ~30% of students are in a 5-year co-terminal master’s program. This is a great program and a good reason to choose Stanford over HYP. However, just plugging graduation rate into a formula misses this. Along the same lines, Georgia Tech’s 90% graduation rate is lower than expected based on selectivity because more than ~1/3 of students do a co-op with a 5+ year expected graduation time, which is again a great program and a good reason to choose GT over peers. I imagine that, being public, GT students are also more likely to need to leave or delay for financial/family reasons than students at certain colleges with much wealthier student bodies. Again, plugging graduation rate into a formula misses all of this. Instead, it’s best to look at and understand the particular criteria that are important to you.
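If you do want to look up a subfactor like this yourself, below is a minimal sketch, assuming you’ve downloaded a graduation-rate CSV export from the IPEDS data center (https://nces.ed.gov/ipeds/use-the-data). The file name and column names are placeholders, since IPEDS variable names vary by survey year; adjust them to match your export.

```python
# Minimal sketch: list colleges by 6-year graduation rate from an IPEDS export.
# "ipeds_grad_rates.csv", "institution", and "grad_rate_6yr" are placeholder
# names -- rename them to match the columns in your actual IPEDS download.
import pandas as pd

df = pd.read_csv("ipeds_grad_rates.csv")

top = (
    df[["institution", "grad_rate_6yr"]]
    .dropna()                                       # drop schools with no reported rate
    .sort_values("grad_rate_6yr", ascending=False)  # highest rate first
    .head(25)
)
print(top.to_string(index=False))
```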
No one uses USNWR for its college retention rates, or graduation rates, or other data. USNews isn’t a data provider, because it has little to offer in that regard. It takes mostly publicly available data, some of which many college applicants may not care about, and weighs it in a way that may be totally inconsistent with applicants’ priorities, to come up with a single simplistic score. No doubt simplicity sells. That’s great for USNews but, unfortunately, not so for college applicants, as college selection isn’t so simple. I, for example, care about neither retention rates nor graduation rates. Other people may care about them, but perhaps not about faculty compensation, and so on.
Good for you. But I suppose most people don’t use USNWR that way. They want to know who is number 1, or 2, or 3, or 10, and who’s up and who’s down. If USNWR were to provide only the data, no one would have any issue with it.
Regarding alumni giving as a measure of the school: yes, it may be driven by how well organized and cohesive the college alumni group is, but that is also an indicator of a strong network and should be a consideration for applicants as well. Alumni giving provides additional student resources, internship funding, and stipends beyond the endowment numbers. It also helps with the job search, mentoring for grad school, etc.
There’s a difference between finding any data used by USNWR useful and finding its rankings useful. Different people will find different sets of data useful to them. I even used its college map to help plan college visits with my son a few years ago, but its rankings had little relevance to his college selections.
Unfortunately, the USNWR ranking has noteworthy real-world consequences, even for those who ignore it. For example, suppose a magazine decided to publish a ranking of the best movies based on a USNWR-like weighted formula, similar to the one below.
20% Questionnaire to Movie Production Admins
18% Box Office Gross
15% Budget
12% Average Central Cast Member Salary
8% Number of Theaters Playing Movie
7% Number of Days Playing in Theaters
5% First Weekend Ranking
…
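Mechanically, a ranking like this is just a weighted sum of normalized subfactors. Here is a toy sketch (made-up weights and data, purely illustrative of the technique, not any publication’s actual formula) showing that the chosen weights, rather than the movies themselves, drive the ordering:

```python
# Toy weighted-sum ranking: subfactors are assumed normalized to 0-1.
# All weights and data below are invented for illustration.
WEIGHTS = {
    "admin_survey": 0.20,
    "box_office": 0.18,
    "budget": 0.15,
    "cast_salary": 0.12,
    "num_theaters": 0.08,
    "days_playing": 0.07,
    "opening_rank": 0.05,
}

def composite_score(scores: dict) -> float:
    # Weighted sum; any subfactor missing from `scores` counts as 0.
    return sum(w * scores.get(k, 0.0) for k, w in WEIGHTS.items())

movies = {
    "Big Franchise Sequel": {
        "admin_survey": 0.9, "box_office": 0.95, "budget": 0.9,
        "cast_salary": 0.85, "num_theaters": 0.9, "days_playing": 0.8,
        "opening_rank": 1.0,
    },
    "Acclaimed Indie": {
        "admin_survey": 0.7, "box_office": 0.2, "budget": 0.1,
        "cast_salary": 0.1, "num_theaters": 0.2, "days_playing": 0.6,
        "opening_rank": 0.3,
    },
}

# The blockbuster wins because the spending-correlated factors carry most of
# the weight; change the weights and the "best" movie changes with them.
for name in sorted(movies, key=lambda m: composite_score(movies[m]), reverse=True):
    print(f"{name}: {composite_score(movies[name]):.3f}")
```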
I doubt that this formula would produce an accurate ranking of which movies a particular reader will like best, and it would probably be biased toward the well-known, familiar blockbusters that many magazine readers expect to see on top. But I also doubt that publishing it would harm anything. Instead, most people would ignore the ranking and it would have little consequence.
However, the USNWR ranking is different. The ranking has become well known and powerful. A change in a college’s USNWR ranking can lead to noteworthy changes in the number of applications, alumni donations, yield, and various other factors that are important to the college. For example, the study at https://core.ac.uk/download/pdf/144981976.pdf found that when a college slips 1 place in the USNWR ranking, there can be a statistically significant negative impact in future classes on admit rate, yield, average SAT, quality of FA, and net price after FA.
Some colleges make changes targeted at increasing their USNWR ranking, rather than at improving college quality. Some colleges fear making desired improvements because doing so may negatively impact their USNWR ranking, such as when Sarah Lawrence was removed from the USNWR ranking after going test blind. Sarah Lawrence eventually gave up and switched to test optional, so they could be ranked.
USNWR is not just a benign, meaningless ranking. It’s doing real damage, even when a particular reader chooses to ignore it. I think colleges as a whole would be better off if the “best” college rankings didn’t exist and the USNWR website instead included only CDS-like facts and/or rankings of well-defined subfactors. For example, listing which colleges have the highest graduation rate or highest SAT score, rather than implying that the highest graduation rate or SAT score makes a college “best.”
Rankings play into our insecurities as parents. They provide prospective buyers of a service that is hard to quantify with a sense of security, a “safe choice”, a mitigation of risk. From the time kids are babies, parents look for “signs”—early talkers, excellent gross motor skills, early readers, good social skills, etc.—that their child will be a success in school and in life. If rankings were all about data, USNWR could do what Fiske or Princeton Review does and just provide information in an unranked format. The USNWR ranking is tapping an emotional need for an assured outcome when it comes to our children. We get lulled into thinking that if our kid is good enough for a top [5, 10, 20, 50…fill in the blank] school, that surely they’re on track for a great life. Whether they attend a school ranked #2 or #12 or #33 or #60 will likely make very little difference over the long haul.
Yes, US News rankings are important to many people & to many schools.
Simply because you do not appreciate or value the US News rankings does not mean that others need to agree with your position. And, based on your own words, they do not. It is somewhat akin to stating that, because I do not see any value in cryptocurrencies, nobody should see value in them.
US News rankings also help employers to use their recruiting time & capital in an efficient manner.
Whether or not rankings stir up insecurities in parents or students is a personal matter.
Attending & graduating from a highly ranked school often results in more job opportunities in the early stages of one’s career. Employers are well aware of the best places to most efficiently find the best, brightest, hardest working prospective employees.
My posts have never said others had to agree with my position. However, I have listed a variety of specific problems with the USNWR ranking.
I’m sure there is some particular person in hiring who is big on recruiting based on USNWR undergrad ranking, but I very much doubt that this is the norm. For example in the survey of hundreds of employers who recruit recent new grads at https://chronicle-assets.s3.amazonaws.com/5/items/biz/pdf/Employers%20Survey.pdf , the most desirable type of college for hiring new grads was state flagships, not “elite” colleges. State flagships are notorious for doing relatively poorly in USNWR ranking compared to “elite” privates. College reputation was marked as the least influential evaluated factor in evaluating resumes of new grads for hiring decisions. The employers that I am personally familiar with choose where to recruit based on things like location, past hiring successes, colleges attended by employees + network, and how well the college does in the field for which they are hiring, which is often completely different from USNWR national ranking.
I also have not seen evidence of job opportunities following USNWR ranking. Studies that have controls for individual student characteristics usually find little or no financial value in attending more selective colleges. The classic example is the Dale and Krueger study, which found that average earnings were similar among students who applied to and were accepted to a similar set of colleges, regardless of whether the student chose to attend the relatively more or less selective college. In short, the individual students and their backgrounds were the driving force for their future earnings, rather than USNWR ranking of college attended.
A small minority of industries are exceptions that are especially sensitive to the prestige of a college name, such as investment banking, but that is not the same thing as job opportunities following the USNWR undergrad ranking.