<p>Finally, a ranking of schools based on outcomes rather than on things like “prestige.” I like the measurements they use and largely agree with their rankings. It sure would be nice to use something like this and get away from all the crazy games played to suck up to the US News rankings.</p>
<p>Go Whitman and Sewanee! Of course, I could probably devise ranking systems to move them higher. Heavily weight obscurity and geographical remoteness along with academics and outcomes and they would be top ten!</p>
<p>I don’t like rankings at all. However, I have to say that these ratings are worse than the USNWR rankings because they are based on some questionable criteria.</p>
<ol>
<li><p>RateMyProfessors.com. Anyone can post ratings on this site. There is no way to verify that the person rating the professor even attends the university in question, much less took the class. How can you possibly use this as a criterion for rating a university?</p></li>
<li><p>Listings in Who’s Who In America are silly. You don’t have to do much to be invited to be listed. Some people who are invited don’t fill out the form. The publication does not really tell you who is successful and who is not.</p></li>
<li><p>Pay data. When did college become vocational school? </p></li>
<li><p>Average federal debt load doesn’t really tell you anything about educational quality. It is impossible to know if students have low debt loads because schools give them a lot of aid or because they have rich parents. </p></li>
<li><p>Percent of students taking federal loans has nothing to do with educational quality. </p></li>
</ol>
<p>This is just my 2 cents. I don’t place a lot of emphasis on any of these rankings.</p>
<p>My older son is a Geology major at California State University, Sacramento, which was ranked #398 and which I thought had a good Geology department, with classes that teach the Earth is 4.5 billion years old. I see now that I could have sent him to Cedarville University, ranked #259 and therefore obviously a better school. Cedarville’s claim to fame is that it is the only Christian university offering a degree in Geology whose curriculum strictly adheres to a 6,000-year-old Earth as taught in the Bible.</p>
<p>I wonder if the editors at Forbes also believe the Earth is only 6,000 years old.</p>
<p>Lots of problems with the methodology, but here are just a few:</p>
<ol>
<li><p>Payscale.com data on salaries is a self-selected, unscientific sample, and the self-reports are unverified.</p></li>
<li><p>Payscale looks only at salary data for people who don’t go beyond a bachelor’s degree. This may distort results for schools where a high percentage of the class goes on to graduate or professional school.</p></li>
<li><p>Mid-career salaries (19-20 years out) are of dubious value in evaluating where a school is today; at best they reflect educational outcomes from two decades ago, and they tend to produce lower rankings for schools that have made big improvements in the last two decades (e.g., USC, WUSTL).</p></li>
<li><p>A Who’s Who listing is a non-objective, unscientific, and therefore dubious metric of career achievement, tending to promote celebrity over accomplishment and self-promoters over quiet achievers.</p></li>
<li><p>Who’s Who will be heavily weighted toward people who graduated two, three, four, five, and six decades ago, and therefore reflects, at best, a long-term retrospective measure of educational outcomes.</p></li>
<li><p>The percentage of graduates who earn Ph.D.s will be heavily biased toward LACs and other schools that don’t have undergraduate pre-professional programs (undergrad business, engineering, nursing, etc.).</p></li>
<li><p>Not to take anything away from anyone who earns a Ph.D., but the academic job market is so abysmal in many fields that measuring how many of a school’s graduates got Ph.D.s in the 2008-2010 period used by Forbes, in the depths of the recession when academic hiring came to a near standstill, may just reflect who got the worst career counseling.</p></li>
</ol>
<p>I wish there were a way to get the same results with different methodology. I love the idea of measuring student satisfaction and post-college success. DS’s school has always done well on the Forbes list, and with good reason. The students leave better than they came in: it takes B students and gets them into top grad programs, law schools, med schools, etc. Lots of future CEOs come out of a school with a very strong commitment to a traditional liberal-arts curriculum. It is nice to see it recognized in a way that looks beyond selectivity.</p>
<p>Why do you care where your son’s school falls on some magazine’s random list? If you and your son are satisfied with his school that is all that matters.</p>
<p>I only care because it is a great school that more men should consider. I am happy to talk up the lesser-known school, especially here on CC where the same old schools get the same old press.
If these lists help anyone explore a previously unknown school that is a good thing.</p>
<p>I agree with Mizzbee. I think that a school that takes B students and spits out future CEOs and top grad-school applicants deserves more respect than a school that takes only A+ students with sky-high test scores and does the same thing. The former is A LOT harder to do.</p>
<p>My son passed up #2 and #13 for #19 and doesn’t regret it a bit; for him #19 is a perfect fit. Any list like this is mainly marketing hype and filled with the biases/preferences of its designer. A more objective list might be the cross-admit tables of accepted students.</p>
<p>It’s one odd list, for sure. It’s like voting for the All-Star Game. On all other lists my daughter’s school is top 70; on this one it’s 203. And the top ten? Seriously? In that order?</p>
<p>It seems to fly in the face of other lists in a huge way, so as an outlier list it makes for fun conversation. Otherwise, eh.</p>
<p>I agree that these lists reflect the bias of the designer, thus my joke about how to move Whitman and Sewanee (two schools we like) even higher. However, I am not sure that showing cross-admits along with actual selections would add any objectivity. Preferences between very good choices can be very subjective, as shown by LoremIpsem’s post above. No one list means much of anything. Whether placement on several lists starts to have significance depends on whether the value of each list is zero or merely slight. For lesser-known schools, these lists can be a nice way of spreading the word about quality alternatives to the name-brand colleges.</p>
<p>We happen to think our daughter’s undergrad school is terrific, and deserves any kind of accolades it gets. It is another one of those schools that flies under the radar screen. Thank goodness for a CC poster who gave us the tip about this small Jesuit university.</p>
<p>Williams is a great LAC and the equal of any college or university in the country, depending somewhat, of course, on what one intends to major in. Moreover, while a numerical order is given, I doubt that anyone really believes there is a material difference between highly ranked schools separated by a few places. Where to break off tiers is quite subjective as well. Clicking through the list in screens of ten gives quite a different feel than scrolling down it 100 at a time. Each view suggests a valid grouping, yet neither really is an accurate grouping.</p>
<p>Seahorsesrock (yes they do!), Williams is ranked the #1 LAC by USNews. Harvard and Yale share the #1 rank for universities, and Stanford is at #5 (tied with 4 others). So even by US News standards, Williams is “better” than Stanford and ranked the same as HY.</p>
<p>This happens every year when the Forbes list comes out. People “think,” without any objective criteria, that one school is better than another based on name recognition. Although the quality of a Williams education is indeed high, this is just a ranking that, like all rankings, uses some dubious criteria. If you look at it another way, it’s not much different from USNWR; it’s just that LACs are mixed in with universities, whereas USNWR separates them into different categories. It certainly isn’t a stretch to think that the best undergraduate education may be at a LAC.</p>