@CTDadof2 leave it on the list definitely. You visited, you were impressed - that’s all that matters. My S was shunning a school because it was the lowest ranked on his list - didn’t want to visit. In the end, he was torn between this college and another. He was so impressed with the facilities, the teachers, the students - he admitted that it was just so much better than he expected.
USNWR and other similar rankings are huge commercial successes because they made many people believe that the college evaluation process can be simplified down to a single number (a college's position in the rankings), so that they don't have to do the analysis themselves. Unfortunately for these people, the factors USNWR (and others) have picked for the rankings aren't necessarily the ones that matter to them. Even trickier is how the factors are weighted. These weights ought to differ from applicant to applicant: any given factor may be more important, less important, or unimportant to a particular applicant. A single number can't capture this complication, but without it, USNWR (and others) wouldn't be able to sell their rankings.
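To make that concrete, here's a minimal sketch in Python (every college name, factor, score, and weight below is invented purely for illustration) of how the same underlying data produces a different ordering depending on whose weights you apply:

```python
# Hypothetical factor scores (0-100) for three made-up colleges.
colleges = {
    "College A": {"grad_rate": 95, "class_size": 60, "research": 90},
    "College B": {"grad_rate": 85, "class_size": 95, "research": 50},
    "College C": {"grad_rate": 80, "class_size": 85, "research": 75},
}

# A magazine's one-size-fits-all weights vs. one applicant's priorities.
magazine_weights = {"grad_rate": 0.5, "class_size": 0.2, "research": 0.3}
applicant_weights = {"grad_rate": 0.2, "class_size": 0.6, "research": 0.2}

def rank(weights):
    # Weighted sum of factor scores, sorted best-first.
    scores = {name: sum(w * factors[f] for f, w in weights.items())
              for name, factors in colleges.items()}
    return sorted(scores, key=scores.get, reverse=True)

print("Magazine order: ", rank(magazine_weights))   # A, C, B
print("Applicant order:", rank(applicant_weights))  # B, C, A
```

Same data, different weights, different #1. That's the complication a single published number papers over.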
“8 of those schools rank in the top 15 slots in the category, while the 9th ranked in the 30s.”
So I quickly perused the rankings. I'm pretty familiar with one top-10 school, Marist at 8, and I picked another, Rider, at 35, to try to replicate your question. I don't think there's all that much difference between them, so don't discount the 30s college on that basis. Also, as data10 posted, try to dig into the criteria and why one was ranked higher than another, and whether those factors matter to you.
“Personally, I’d give greater weight to the graduation rate”
I've never understood why graduation rate matters. An easy school will have a high graduation rate.
The graduation rate of a college is strongly correlated with admission selectivity and with the share of students free of financial limits. So it is mostly a proxy for exclusivity and connections to wealth, without saying so explicitly.
Graduation rate performance relative to the graduation rate expected from the student cohort may be a more relevant factor, but few pay attention to it (USNWR gives it a lower weight than raw graduation rate).
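For anyone curious what "graduation rate performance" means mechanically, here's a rough sketch (all numbers below are invented; actual studies use far larger datasets and more predictors): fit a simple model predicting grad rate from selectivity and wealth proxies, then look at actual minus predicted.

```python
import numpy as np

# Hypothetical colleges: [avg SAT (selectivity proxy), % Pell (wealth proxy)]
X = np.array([
    [1500, 12],
    [1400, 20],
    [1250, 35],
    [1100, 45],
    [1050, 50],
], dtype=float)
actual_grad_rate = np.array([97, 92, 75, 62, 66], dtype=float)

# Least-squares fit: grad_rate ~ b0 + b1*SAT + b2*Pell
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, actual_grad_rate, rcond=None)
predicted = A @ coef

# "Performance" = actual minus what the student inputs alone predict.
for i, perf in enumerate(actual_grad_rate - predicted):
    print(f"College {i + 1}: actual {actual_grad_rate[i]:.0f}%, "
          f"predicted {predicted[i]:.1f}%, performance {perf:+.1f}")
```

A positive residual means the college graduates more students than its inputs alone would predict; raw graduation rate can't distinguish that from simply enrolling well-prepared, well-funded students.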
@Greymeer Actually, while logic would dictate that the easier the school, the higher the graduation rate, this really isn't true. It's the more competitive schools that top the charts in graduation rates. For instance, University of Arizona has a 60% graduation rate while Harvard has a 97% rate.
Easy school means no job because you were poorly educated and not competitive upon graduation. Those schools don’t last long.
What qualifies as an "easy school"? For example, Brown has no specific core curriculum courses required for graduation, aside from 2 writing classes. Students at Brown can take most classes pass/fail, and failing grades are not recorded. Among graded classes, the vast majority of grades given are A's. The trend on gradeinflation.com suggests the mean GPA at Brown should be between 3.7 and 3.8 today. Does this make Brown an "easy school"? If Brown is an "easy school", you wouldn't know it based on graduation rate or employment. Its graduation rate and employment stats are similar to those of most other comparably selective schools with a similar major distribution.
As others have stated, a college's average graduation rate is largely a function of selectivity and wealth, not of whether the college is "easy". Several studies have explained ~80% of the variation in different colleges' average 6-year graduation rates (a 0.9 correlation coefficient, since 0.9² ≈ 0.8) by looking at just a combination of selectivity and wealth. One example is at https://eric.ed.gov/?id=ED571993 . The study does find that colleges more focused on STEM fields tend to graduate fewer students than predicted by selectivity and wealth, but I wouldn't call that the same as being an "easy school."
Surveys of employers suggest the school name has relatively little influence when comparing applicants for jobs, aside from being known. On the whole, employers are certainly not focusing on whether it is an "easy school" and are instead focusing on things like relevant experience, desired skill set, college major, interview performance, a variety of soft skills, etc. For example, the survey of hundreds of employers at https://chronicle-assets.s3.amazonaws.com/5/items/biz/pdf/Employers%20Survey.pdf found college reputation was the least influential factor as a whole for evaluating resumes of new grads. Unknown or online colleges were an exception, as I'd expect non-accredited colleges would be.
OP - you are wrong. You started this thread because you do not know as much as you want to about colleges. Class size can be misleading: a small class led by a mediocre teacher/professor is not as good as a larger one with top-notch faculty. Why bother asking the question if you are already fixed in your opinions?
and IMO a large class led by a top notch professor is generally not as good as a small class led by a top notch professor. That is not to say there isn’t a place for large lectures. Not every class gets a lot of benefit from a low student/teacher ratio.
We’re all entitled to our opinions. @CTDadof2 thinks class size is important. @wis75 does not. That doesn’t make either authoritatively right. Class size is something that is part of the USNews ranking methodology.
Have you looked at Forbes? Forbes considers student outcomes in their ranking.
The Forbes rankings can be quite different from the suggested category rankings. For example, USNWR gives Scranton and Bentley similar rankings in the regional category, yet Scranton doesn't make the top 250 in Forbes, while Bentley is #124.
This doesn't mean USNWR got it wrong and students should avoid attending Scranton. Instead, I'd suggest not focusing on how 3rd parties rank colleges and instead focusing on which criteria are important to the individual student, which will likely be completely different from both the USNWR and Forbes methodologies. These methodologies usually intentionally keep the rankings reasonably close to reader expectations, in an effort to give the appearance of validity, when the actual methodology is arbitrary with no measure or test of validity. This type of 3rd party ranking list can be useful for discovering new colleges you hadn't considered, but I certainly wouldn't choose college X over college Y because of rankings.
What I found useful about the USNWR rankings is that they’re a rough proxy for admissions selectivity. A student can find match schools 10-30 places down from their dream school or likelies another 10-30 down from those (adjust the numbers as necessary for your particular kid). Often these schools, particularly the LACs, are ones families don’t know about because they’re not located in their region. From there you need to do your homework and not assume that a school ranked 27th is somehow better than one ranked 32nd in any meaningful way.
Having gone to both huge and tiny schools, in every case, I was happier with the smaller classes in the smaller schools than the larger classes in the larger schools. I had plenty of fantastic professors in the small classes, while I had too many boring drones in the large lectures.
There’s a personal connection to the teachers in the smaller classes that made a huge amount of difference to me. If I missed a class at the large school, the teacher wouldn’t notice nor care, and I started feeling that going to class wasn’t important. If I missed a class at the small school, the teacher would inquire as to where I’d been. I felt more invested in the smaller classes.
There are many different ranking systems, each with its own methodology. Some are more quant-based, using real data (income 6 months after graduation, income 5 years after graduation, graduation rates, etc.). Others rely heavily on surveys (very subjective). However, what we found interesting was that, in general, a certain cohort of schools surfaced near or at the top of every list. Although S didn't base his decision on that, he certainly did use that information to create a list of schools in the geographic areas that interested him (he had no interest in going out west, so UCB and Stanford were never considered even though they're highly ranked; he didn't want to be in NYC, so Columbia and NYU were never considered; etc.). Other than your state safeties, why wouldn't you consider the various rankings?
People get very defensive/protective about "their school", especially if it's not prominently ranked in most of the systems. That certainly doesn't mean the school isn't good, or as good. That said, it also doesn't mean you should discount the rankings and conclude that highly ranked schools aren't as good as school X.
Trust what you see and take the rankings with a grain of salt. Our older two went to the same university, and we've watched it fluctuate between 35 and 25 on the US News national LAC chart over the last 5 years. The school didn't change in any way measurable to our students.
If your son likes the school and it has the potential to be an affordable choice, let him apply. A lot can change between now and next May.
I think the same colleges surfacing near the top of every list is largely by design. If a ranking list did not have HYPSM-type schools near the top, then readers would be more likely to dismiss the ranking as garbage, resulting in less advertising money for the ranking maker. Washington Monthly used to suffer from this problem. Their rankings used to focus on things like the difference between actual and predicted grad rate, social mobility for low-income students, and percent giving back to the country. Back in 2015, the top 4 by these criteria were:
2015 Washington Monthly Rankings
- UCSD
- UCR
- Texas A&M
- UCB
The rankings didn't match expectations, which led many to dismiss them as garbage. So over the past few years, Washington Monthly has gradually changed its methodology, each year making the top rankings more HYPSM-like. The current rankings are the first in which the top 5 are HYPSM:
2018 Washington Monthly Rankings
- Harvard
- Stanford
- MIT
- Princeton
- Yale
You mentioned income soon after graduation. Forbes uses Payscale salary as a component of its rankings. Payscale lists the following top 10 for income soon after graduation. Obviously, looking at income soon after graduation is biased in favor of colleges with a large percentage of students in majors associated with higher salaries, and Payscale's self-reported data is non-ideal, but the point is that it's not the expected list of colleges at the top.
Early Career Salary with Bachelor’s (Payscale)
- Samuel Merritt (Nursing)
- Harvey Mudd
- MIT
- Caltech
- Charles Drew Medicine
- Olin
- USMMA
- Albany Pharmacy
- Webb
- USNA
…
- Yale
- Brown
- University of Chicago
Similarly, if you look at the colleges with the highest graduation rates as listed in IPEDS, which corresponds to a criterion used in both rankings, it's also not the usual list of colleges, as summarized below. You only get the usual list of colleges if you weight these components and others in a specific way…
**Graduation Rate (IPEDS)**
…
39. Stanford
49. University of Chicago
83. Caltech
You get a ranking far more similar to USNWR/Forbes/the new Washington Monthly/… if you instead look at measures more directly correlated with the selectivity or wealth of the college (see the sketch after these lists). For example:
Largest Endowment
- Harvard
- Yale
- Stanford
- Princeton
- MIT
Lowest Admit Rate (Among Academic Colleges)
- Stanford
- Harvard
- Princeton
- Columbia
- Yale
- MIT
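If you want to test that claim yourself, one rough way is a Spearman rank correlation between a published ranking and a single underlying measure. A minimal sketch (the numbers are hypothetical stand-ins; substitute real USNWR ranks, endowments, and admit rates):

```python
# Minimal Spearman rank correlation, no external libraries.
# All data below is invented purely to show the mechanics.
usnwr_rank = [1, 2, 3, 4, 5, 6]              # lower = better
endowment = [39, 30, 26, 26.5, 16, 11]       # $B, hypothetical
admit_rate = [4.3, 4.6, 5.5, 5.8, 6.3, 6.7]  # %, hypothetical

def ranks(values, reverse=False):
    # Convert raw values to rank positions (1 = best).
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=reverse)
    r = [0] * len(values)
    for position, i in enumerate(order, start=1):
        r[i] = position
    return r

def spearman(a, b):
    # Classic formula for untied ranks: 1 - 6*sum(d^2) / (n*(n^2 - 1))
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(a, b))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Biggest endowment ranks first; lowest admit rate ranks first.
print(spearman(ranks(usnwr_rank), ranks(endowment, reverse=True)))  # ~0.94
print(spearman(ranks(usnwr_rank), ranks(admit_rate)))               # 1.0
```

A coefficient near 1 means the ranking is largely restating the single measure; run the same check against raw graduation rate or any other factor and compare.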
If I considered these important to how good a school is, then I would use them. For me, they are social issues that have nothing to do with getting a quality education, so I wouldn't consider them, but that is JMHO. Mine do align somewhat with USNWR, but I don't like all of what they use.
- Graduation rate - 4 year
- Quality of the cohort (peer student scores/grades/other)
- Quality of the faculty (recognized as at/near the top of their field)
- Class size (percentage of classes under 20)
- Research/Intern opportunities (do they manage a National Lab?)
- Diversity (not racial but cultural)
These are things that determine fit, IMO.
Things I don’t care about:
- The weather
- Alumni giving
- Social issues (in regard to the institution)
@Sue22 - Agreed. There are parts of the US News rankings that I don't believe are very relevant and/or that are biased (such as alumni giving, which will always hamper public universities). However, I think the public gives more weight to the US News rankings because they do reflect selectivity fairly well and, let's face it, they generally "make sense" based on real life. Very few people take seriously a ranking that doesn't have Harvard/Yale/Princeton consistently at the top, so when they see the US News rankings place those schools at or near the top every year, while generally not having many (or any) "strange" schools in the top 25, it's actually comforting in a way. In essence, people look to the US News rankings more than any other because those rankings confirm the general public's preexisting beliefs at the top.