Meta-ranking, Universities and LACs Combined

This list includes every school from the top 25 of the two U.S. News national categories (universities and liberal arts colleges) that also appears within the top 50 of each of three additional college rankings. Schools are ordered by the standardized test scores of their entering students (a toy sketch of the selection logic appears after the list):

  1. Caltech
  2. UChicago
  3. Harvard
  4. Yale
  5. Princeton
  6. MIT
  7. Columbia
  8. Stanford
  9. Northwestern
  10. Rice
  11. Vanderbilt
  12. Notre Dame
  13. Pomona
  14. Duke
  15. Williams
  16. UPenn
  17. Dartmouth
  18. Amherst
  19. Bowdoin*
  20. Brown
  21. Claremont McKenna
  22. Cornell
  23. Georgetown
  24. Hamilton
  25. Washington & Lee
  26. UC-Berkeley
  27. Colgate

Links and comments to follow.

*Footnoted in sources as test optional.
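
For anyone curious about the mechanics, here is a minimal sketch of the selection logic in Python. The school names, list contents, and scores below are made-up placeholders, not the real source data; the actual top-25 and top-50 lists come from the five sources linked below:

```python
# A toy sketch of the selection logic; all names and scores here are
# illustrative placeholders, not the real source data.

usnews_top25 = {"Caltech", "Harvard", "Pomona", "Williams"}      # both national categories combined
forbes_top50 = {"Caltech", "Harvard", "Pomona", "Swarthmore"}
bi_best_top50 = {"Caltech", "Harvard", "Pomona", "Williams"}
bi_smartest_top50 = {"Caltech", "Harvard", "Pomona", "Williams"}

# Hypothetical entering-class test scores, used only for ordering.
test_scores = {"Caltech": 1545, "Harvard": 1520, "Pomona": 1470, "Williams": 1465}

# A school survives only if it is in U.S. News' top 25 and in the
# top 50 of each of the three additional rankings.
meta = usnews_top25 & forbes_top50 & bi_best_top50 & bi_smartest_top50

# Order survivors by entering-student standardized scores, highest first.
for school in sorted(meta, key=test_scores.get, reverse=True):
    print(school)
```

In this toy example, Williams drops out because it is missing from one source, which also illustrates the weakest-source effect described in comment 2 below.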

http://www.forbes.com/sites/nataliesportelli/2016/07/06/the-full-list-of-americas-top-colleges-2016/#34e4c68d69a4

http://www.businessinsider.com/best-colleges-in-the-united-states-2016-8

http://www.businessinsider.com/smartest-colleges-in-america-2015-9

http://colleges.usnews.rankingsandreviews.com/best-colleges/rankings/national-liberal-arts-colleges

http://colleges.usnews.rankingsandreviews.com/best-colleges/rankings/national-universities

Comments

  1. The goal was to combine universities and LACs into a single list, as well as to consider aspects of these schools that would be relevant to an undergraduate.
  2. The meta-ranking may reflect the limitations of its weakest source. Excellent schools may therefore be excluded if they were improperly or poorly assessed by that source. For example, colleges whose graduates seek rewarding professions rather than maximal financial outcomes may not fare well in at least one of the constituent rankings.
  3. Nonetheless, schools that performed well are likely to be highly academically distinguished.
  4. As with other ranking methodologies, even excellent or superior schools whose priorities, structures or philosophies do not align well with ranking models are unlikely to perform particularly well.
  5. The cut-offs, generally based on the range of the underlying sources, will produce arbitrary effects.

I mean Chicago is a bit low but other than that great list.

“US News” “Standardized scoring”
Garbage in, garbage out.

@snarlatron : I tried to address the GIGO possibility in my comments. Consider the list an experiment for those who might find the process interesting. Standardized scoring simply served as an ordering method for the included schools.

It’s an interesting exercise, but ultimately it remains garbage in/garbage out. I do enjoy looking at lists and rankings - we all do or we wouldn’t be here clicking on these links. My alma mater did particularly well, so it must be right. :slight_smile:

nah it’s still gi/go

The point of the experiment was mostly to inform me about what rankings indicate about rankings. What can fairly be said, then, is that the above schools rank highly across multiple sources. The value of rankings, as they pertain to the intrinsic qualities of the colleges themselves, would appear to be a different topic.

However, it’s not beyond imagination that the list above may also comport with at least some of the substantive aspects of the included schools.

If you want to rank by USNWR criteria, why not use all of the criteria to re-rank and combine, instead of just using standardized test scores?

@ucbalumnus : The primary objective involved determining which schools appear consistently across multiple “top 50” rankings, for which USNWR provided only one fourth of the base sourcing (see post 1). The final ordering could then either have been arbitrary (e.g., alphabetical) or statistical (e.g., standardized scoring). I chose statistical in order to better conform with the statistical bases of the sources themselves. A school’s appearance on the list, by itself, indicates a high absolute ranking across several sources.


Using standardized tests as the only means of ranking schools exaggerates their importance to the exclusion of all other possible criteria.

@ucbalumnus : The standardized-scoring ordering, as stated previously, remains tangential to the intent of the list:

An Alphabetically Based Alternative

LACs

Amherst
Bowdoin
Claremont McKenna
Colgate
Hamilton
Pomona
Washington & Lee
Williams

Universities

UC-Berkeley
Brown
Caltech
UChicago
Columbia
Cornell
Dartmouth
Duke
Georgetown
Harvard
MIT
Northwestern
Notre Dame
UPenn
Princeton
Rice
Stanford
Vanderbilt
Yale

It appears that you are suggesting that the only statistical way of ranking is standardized test scores. Why? Surely, there are other ways of ranking using a multitude of factors, as used by the various ranking organizations that you pull the college names from.

@ucbalumnus : I agree that, if the ordering of the schools were of primary interest, a method using, for example, the ordinal positions from the four underlying sources could produce a more comprehensive ranking (a sketch of that approach follows this post). This would, in effect, incorporate the multitude of factors you referenced.

However, my primary interest pertained to identifying the colleges that appear consistently across relatively diverse sources. The conclusion would be that all 27 schools listed above do extremely well when considered on this basis.
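
For what it’s worth, here is a minimal sketch of such a rank aggregation, averaging each school’s ordinal position across the sources. All ranks below are made up for illustration, and the penalty value for schools absent from a source’s top 50 is an arbitrary assumption:

```python
# A sketch of rank aggregation by average ordinal position.
# All ranks here are hypothetical; PENALTY is an arbitrary stand-in
# rank for schools that fall outside a source's top 50.

PENALTY = 51  # just past the top-50 cutoff

# Hypothetical ordinal positions in each of the four sources
# (None means the school was absent from that source's top 50).
ranks = {
    "Caltech": [1, 3, 2, 1],
    "Harvard": [2, 1, 1, 4],
    "Pomona":  [7, 12, 9, None],
}

def average_rank(positions):
    """Mean ordinal position, substituting the penalty for absences."""
    return sum(PENALTY if p is None else p for p in positions) / len(positions)

# Lower average rank = better combined standing.
for school in sorted(ranks, key=lambda s: average_rank(ranks[s])):
    print(school, round(average_rank(ranks[school]), 2))
```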

Your list in post #12 is better, but off the top of my head I can see that your list of 27 schools is missing Swarthmore from the LACs and Johns Hopkins from the national universities. It’s not your fault, but this leaves me even more convinced of the garbage in/garbage out problem inherent in this kind of exercise.

Also, to highlight someone else’s point - standardized test rankings are deceptive because some schools emphasize test scores more than others. Some schools would rather take a student with a very high SAT who has accomplished little in school vis-à-vis his or her peers, because higher test scores make the school look better in the rankings. Other schools, UC Berkeley in particular, care much less about standardized test scores and much more about whether you have been a high achiever vis-à-vis your peers in high school, even if that high school was lousy.

98% of Berkeley’s entering students are in the top 10 percent of their graduating class - as far as I know, no other college comes close to that percentage, not even Harvard and Stanford. Meanwhile, there are colleges out there with average SAT scores equivalent to or higher than Berkeley’s where as few as 70% of the entering students were in the top tenth of their high school graduating class.

I’m not knocking your efforts, but attempting to quantify and precisely rank colleges ultimately is a fruitless effort. I really wish US News hadn’t started us all down this road all those years ago.

@ThankYouforHelp : If this thread were to further demonstrate to you, or others, the lack of value of this type of analysis, I’d be fine with that. Note, though, that this has been intended primarily as a form of socio-statistics, more so than an attempt at a meaningfully precise ranking.

Not a problem. I was just expressing my views about the value of these rankings, especially rankings from places with particularly unsatisfying metrics, like Forbes and Business Insider.

Even my favorite magazine, the Economist, falls flat on its face when it tries to apply hard quantitative measures to rank colleges. The Economist tried to correct for cost of living in assessing salary outcomes. Washington and Lee was the highest-rated college in America, in large part because a 75k starting salary goes a long way in rural Virginia. Of course, 90-plus percent of W&L grads move away to places like DC and NYC. Moreover, the data only tracked students who got financial aid, so schools that give very little need-based aid but a lot of merit aid naturally benefit, because the students at issue start out much better off financially. Ultimately, the ranking was very carefully calibrated and thought out, and was still garbage in/garbage out.