There is NO perfect ranking system out there. I think everyone can agree on that. These rankings should be consulted as rough guidelines, always keeping their methodology in mind. The problem with many folks who use rankings is that they don't bother to look at each ranking source's methodology. Depending on their differing methodologies, the results can vary a great deal. Is the methodology based on undergrad educational quality, or is it based primarily on faculty research output? The latter has little relevance if you're looking to assess undergrad institutions. For this reason, I give USNWR more credit than others. For one, they lay out their methodology very clearly. Second, their methodology is more comprehensive for gauging undergrad institutions than any other ranking I've come across. No ranking can truly gauge undergrad educational QUALITY, though, and this is where you'd have to do your own further research to supplement these rough ranking guidelines. Along with the rankings, I also like to look at the student-to-faculty ratio and the endowment per student, among other significant things, such as whether undergrad courses are taught directly by faculty or by grad students, what academic support resources there are, career counseling, etc.
@lookingforward “Per a few on CC, the area around USC has improved. It was even said recently. That really shows how 2nd hand sources can be incomplete or even unreliable. Or slanted.”
One of my close family members works in law enforcement in LA and knows that area VERY well (does ride-alongs with the LAPD, works with the FBI on gang enforcement, etc.). If you venture a few blocks off campus it can get very scary, especially at night.
My point is that the Fiske guide really plays up the positives of the college and downplays some of the more important cons of attending (for USC, location). This is not a knock on USC; I used it as just one example of the biases inherent in the guide. Again, I own the 2018 Fiske book but will use all the other resources available to help my kids make an informed decision for the best "fit" college.
Back when I was looking for a first pass at picking colleges, I went to the library and looked at a bunch of the big one-book college guides, Fiske and its ilk, and checked out the ones that appealed. Some of the guides do tell you what the popular majors are. If you have a kid with strong interests, it's easy to get suggestions. Or you might have someone like my younger kid, who junior year in high school had no idea what he'd do except "not science or math." He loved history but did not want to go into academia. As he started looking at colleges and visiting, he learned about a major called "International Relations" that seemed like a way to continue an interest in history, but with a less academic bent. He ended up as a Naval Officer, by the way, after working for a couple of NGOs. But if he hadn't decided on IR, we would have just looked for medium-sized research universities that were reasonably strong in the social sciences.
No one in this family looked at colleges because of rankings. The kiddos made a list of criteria that they wanted in a college. We then, as a family, helped find colleges that met those criteria.
Kid 1 was a music major, and his private instrument instructor was THE most important thing he wanted to home in on in his college search, followed by a good orchestra and director…and an urban environment.
Kid 2 wanted strong sciences, ability to play her instrument in the college orchestra, and pleasing weather year round.
Kid 1 graduated from a college that is just within the top 50… I believe it was 60-ish when he attended.
Kid 2 graduated from a masters university ranked second in its region.
Neither would have had a clue about these rankings if the school hadn’t published them in the school newsletters.
So…agree with @Blossom. Find out what your kiddo wants in a school. Look at the criteria. Look at the strength of your kid's application. Look at costs.
If it comes down to it, yes, look at the information that the rankings are based on…but not the exact number. That information is far more valuable.
Version 3 has more schools than version 1; it went from 30 to 63. With fewer schools than the Fiske guide, they are able to provide more in-depth commentary and do more research. Fiske basically updates test scores and admit rates when it updates editions, but not much else changes. Hidden Ivies has more substantive changes between editions.
We used information via Collegedata, Niche, and CC to find the right schools.
The area around USC is so bad that NASA decided to put one of its space shuttles permanently on display across the street from USC.
Sheesh, USC’s neighborhood is no worse than those around schools like Ohio State and Cal.
Every major ranking I have seen lists its methodology clearly. If a ranking does not explain how it arrived at its results, then it is almost certainly garbage. The USNWR methodology for the national rankings is summarized below (with a rough sketch of the weighting arithmetic after the list). It's dominated by measures of selectivity, reputation, and financial resources, so as one would expect with this methodology, the top-ranked colleges are highly selective privates with excellent reputations among academics; in short, highly prestigious colleges. This doesn't strike me as a good way to gauge undergraduate educational quality, and certainly not the quality of specific fields of study.
22.5% – Graduation Rate – Primarily a measure of selectivity, since it rewards admitting students who are highly likely to graduate
22.5% – Reputation Survey – How many surveyed academics and counselors indicate the college is “distinguished”
12.5% – College Selectivity
10% – Financial Resources
8% – Class Size
7.5% – Graduation Rate Performance
7% – Faculty Salary
5% – Alumni Giving
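To make the weighting above concrete, here's a minimal sketch of how a weighted composite like this could be computed. The per-category scores in it are made-up placeholders, and the snippet only illustrates the weighted-sum idea behind the percentages listed; it is not USNWR's actual calculation.

```python
# Illustrative sketch only: a weighted composite over the categories listed above.
# Weights come from the percentages in the post; the scores are hypothetical placeholders.

weights = {
    "Graduation Rate": 0.225,
    "Reputation Survey": 0.225,
    "College Selectivity": 0.125,
    "Financial Resources": 0.10,
    "Class Size": 0.08,
    "Graduation Rate Performance": 0.075,
    "Faculty Salary": 0.07,
    "Alumni Giving": 0.05,
}

# Hypothetical 0-100 scores for a single college in each category (made up for illustration).
scores = {
    "Graduation Rate": 92,
    "Reputation Survey": 80,
    "College Selectivity": 85,
    "Financial Resources": 70,
    "Class Size": 75,
    "Graduation Rate Performance": 60,
    "Faculty Salary": 65,
    "Alumni Giving": 40,
}

# Weighted sum of the category scores; note the listed weights sum to 0.95, not 1.00.
composite = sum(weights[k] * scores[k] for k in weights)
print(f"Weights sum to {sum(weights.values()):.3f}; composite score = {composite:.1f}")
```

Just from the listed weights, graduation rate plus the reputation survey account for 45% of the total, which is consistent with the point above that this methodology leans most heavily on selectivity and prestige.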
“This doesn’t strike me as a good way to gauge undergraduate educational quality and certainly not quality of specific fields of study.”
That's right, no ranking methodology can gauge undergrad educational "quality," as I noted in my earlier post. The best that a ranking methodology can do, including USNWR's, is to gauge the "institution." As for quality, one has to conduct supplemental research according to what one desires in an institution. Again, no ranking is perfect. Each ranking is merely a rough guide. Although I would revise USNWR's ranking methodology, since I don't agree with its metrics and their percentage weights, I still haven't seen anything better at gauging higher-education institutions. If anyone wants to devise a perfect set of metrics and weights for gauging institutions of higher education, do so by all means.
USNWR seems no less arbitrary, and no more useful, than various other rankings to me. It does appear to place greater weight on prestige-correlated metrics than most others, and less on outcomes beyond admitting students who are extremely likely to graduate, so it tends to be well accepted among groups that are big on prestige. However, many of the inputs used to create the rankings are not free to view, reducing its value for people who are interested in how colleges do on particular criteria that are important to them.
I'm going to put in a word for The Princeton Review's rankings, especially the more specific and niche lists. I think some of their rankings, like "Students Spend the Least Time Studying" or "Happiest Students," help place the information a student has gathered into a larger context. As a personal example, I saw that the university I transferred out of appears on several negative lists, including "Least Happy Students." That made me go, "oh, okay, that's not just me," and it probably would've made me dive a little deeper before I committed to the university in May 2015.
Of course, these are biased rankings, so I would not use them as the sole source for best schools/worst schools, but I like the purpose they serve.
Edit: It only uses data from 382 schools, for one, so it lacks the breadth of USNWR and others. But I still like it!
Does your school have Naviance? We were able to set search parameters to come up with a broad list and then look at important details about the school and visit the school websites. I found that more useful than any of the guides.
I have not read all the posts, but I agree with several comments. I have a lot of experience with looking at lists, guidebooks, etc. I personally think a single list, on its own, is generally pretty useless, because they all emphasize different things and use different methodologies. If you look at several lists, though, you can start to get a picture of how well regarded a college is, which is much more useful than just looking at one ranking list. If you want a general idea of "well-regardedness," sure, look at USNWR, Forbes, Princeton Review, and others. I personally like the ranking lists in Niche, because they are based on student satisfaction in addition to other factors.
At any rate, you should not choose a college based on a list. The Fiske Guide is great; I'm using my 2015 copy again for my rising senior. Princeton Review also provides good info, as well as fun ranking lists for topics such as top party schools and best alumni networks. Do not make the mistake of just looking at the top schools in USNWR and assuming that is the best way to create a list. There are a lot of great schools that don't rank as highly as they could, for various reasons. Some schools refuse to give USNWR the data it requires, so they get no ranking, or a low ranking. Some schools are test-optional, or have small endowments, or have an extremely high proportion of engineering grads, all of which can skew rankings.
Basically, if you really want to do a good job creating a list, there is no shortcut for doing your own research, short of hiring a private college advisor or trusting your high school guidance counselor.
But the methodology can feel arbitrary, there to make you feel good about using some ranking, as if it's oh-so-authoritative. Who measures "happiest students"? Or who spends the least time studying? It's a matter of who responds. Early on, my college got a top mark (maybe the highest) for "most wired." So what? Every school was wiring up.
I think if someone is interested in a real academic run for their money, a ranking familiarizes you with names. It doesn’t matter what peer profs think, unless they’re in your field. Same for research funding. You have to dig deeper than what some media outlet says.
And in the end you shouldn’t focus just on, say, top 10, because your kid is top in his hs. It’s like only looking at the tip of the iceberg. And when the top holistic colleges look at your match, they don’t stop at class rank. Nor do they want to hear, in a Why Us answer, that you “want a top school” or some of the other bullets.
My girls ignored rankings and relied on Colleges That Change Lives
I like Kiplinger’s…esp. for their “Best Value” schools.
I think also that looking at admitted students' SAT scores gives you a good idea of how that school rates and how your student would fit in.
You have to first sort of know what things about colleges are important to you. If your kid doesn't want LACs, you can eliminate them from any rankings. For our kid, proximity to home, nice weather, strong programs across the board, decent quality of student life and prestige, alumni support, internships, etc. were important. If we couldn't find one, we were going to seriously consider an OOS Honors College with merit money.
We looked at various rankings, such as USNWR, Niche, World Times, etc. In short, unless a college really stood out as a good fit, we were seriously going for money and a decent education at an OOS Honors College.
We didn't use rankings either. Only one of mine applied to a reach school that is oft mentioned, and he was deferred. By the time he was denied, he had already deposited elsewhere. Rankings were not the number-one factor for two of my three boys. With the third it was all about UofM, because of high school prestige more than any national rankings, and I doubt he knew whether the other engineering schools we looked at ranked higher or lower, to be honest. But there is so much good data out there to use, as mentioned in these threads. Or use USNWR or the World Rankings if that floats your boat and rankings are your number-one criterion. If they're not your number-one criterion, I wouldn't get too concerned about which ranking is better than another.
Well, here's how we use rankings, and why I asked. As with many of y'all, my D does well in school and gets mail from all over. She's not a 1600 SAT kid, but she's a good, well-rounded student and is #2 in her class of 191. She wants to go on to med school and therefore needs a quality undergrad school. I don't need USNWR to tell me that Dartmouth or Chicago is an outstanding school. But we've gotten mail from Rose-Hulman Institute in Terre Haute, Indiana. I'd never heard of them. Their engineering program is among the best in the country. Thank you, Niche or USNWR or whichever you like. If she were interested in engineering, I'd have learned of a school we needed to check out. The same goes for hundreds of fine schools around the country. With close to 4,000 colleges in this country, there's no way you can know about all of them, and there are some hidden gems out there that might fit your kid perfectly. That's what we use rankings for: as a starting point to see whether schools are worth looking into or not. If a school gets an overall C on Niche or is in the bottom half of USNWR, we're not going to spend a great deal of time looking at it. Or, if its biology department is weak because it's a strong music school or engineering school, chances are it's not a good fit for us.
I get the impression some of you think I asked because we're choosing between school X and school Y because someone ranked X at 25 and Y at 30. Not the case, I assure you.
Neither of my kids looked at rankings. They didn’t want to read any college guides. We parents, one of whom is a career academician, went with our knowledge of the kids’ talents and interests, and our knowledge of the college landscape. I consulted the standard guidebooks (US News, etc.) but the lists of schools weren’t based on published rankings.