Duke’s biggest rise is in cost. The $1,750 listed in the October 1960 Life magazine is equivalent to $16,065.18 in August 2021, according to the BLS inflation calculator (https://www.bls.gov/data/inflation_calculator.htm). Duke’s 2021-2022 cost of attendance is $81,488, about five times as much.
Other costs:
| College | 1960 cost | 1960 cost, inflation adjusted | 2021 cost | Note |
|---|---|---|---|---|
| Yale | $2,550 | $23,409.26 | $81,575 | Most expensive in list |
| Rice | $1,055 | $9,685.01 | $71,745 | Least expensive in list; tuition-free but White-only in 1960; changed both in 1963 |
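For anyone who wants to check the arithmetic, here’s a quick Python sketch using only the figures quoted above (the roughly 9.2x inflation multiplier is implied by each pair of 1960 figures):

```python
# Quick check of the multiples above, using only the figures quoted in
# this thread (Oct 1960 dollars -> Aug 2021 dollars via the BLS calculator).
costs = {
    # college: (1960 cost, 1960 cost in Aug 2021 dollars, 2021-22 cost)
    "Duke": (1750, 16065.18, 81488),
    "Yale": (2550, 23409.26, 81575),
    "Rice": (1055, 9685.01, 71745),
}

for college, (c1960, c1960_adj, c2021) in costs.items():
    cpi_multiplier = c1960_adj / c1960   # general inflation, ~9.2x for all
    real_increase = c2021 / c1960_adj    # growth beyond inflation
    print(f"{college}: inflation multiplier {cpi_multiplier:.2f}x, "
          f"real cost increase {real_increase:.1f}x")
# Duke comes out to ~5.1x in real terms, Yale ~3.5x, Rice ~7.4x.
```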
Of course, without credible, publicly available sources, CC would be dependent on hearsay, anecdote and unsupported opinion. Moreover, there’s a CC subset who value historical context. The Life article, for example, offers a perspective that extends beyond information on colleges to — with interpretation — insight into the cultural and socioeconomic environment of its time.
The thing is that, with some exceptions, Fields Medalists tend to emerge from colleges all over the world, and most of these colleges are very rarely, if ever, mentioned here. Some were child prodigies and began college very early in life. Generally speaking, public schools tend to be more accommodating than privates for this population. Not a Fields Medalist, but I remember a kid entering UConn at 12 years of age because none of the many, many privates around him would take him.
Forgive me if I’m repeating someone else who brought up Malcolm Gladwell’s podcast on the rankings (Revisionist History: Lord of the Rankings on Apple Podcasts). I know @teleia and @Data10 touched on part of it, and the episode after it about Dillard. But it’s super interesting and points out the overweighted data point of “reputation,” which is compiled by the presidents and heads of admissions of all the colleges. What an arbitrary marker! Presidents and admins are ranking schools they’ve never been to. They’re comparing schools they might collaborate with because they’re geographically nearby against schools where they don’t know anyone and therefore have no opinion. What a handcuff on schools that are changing/evolving! I’d love to see the rankings before the “reputation” category is factored in!
That is probably one of the better inputs into the process. They don’t fine-rank them (1, 2, 3, …); they group-rank them.
Forbes uses the number of sports All Stars in their rankings. Last week’s performance by Aaron Rodgers may knock UC-Berkeley off the top spot on their list.
Am I the only one who just can’t picture a president or a top admin at a university taking the time to rank (what is it, 100 schools?) rather than passing it on to their assistants? Reputation is such a silly marker.
I don’t think it’s one of the better inputs in the process, but it’s not the worst input either. A copy of an earlier post I made on the subject is below:
In theory, there is nothing wrong with expert opinion, but it’s important that the “experts” be knowledgeable about the things they are ranking and have well-defined criteria for how they are supposed to rank them.
For example, the Siena presidential rankings “study” at US Presidents Study – Siena College Research Institute has experts rank each of the 45+ presidents in 20 well-defined categories. The study compiles an average ranking in those categories and a composite ranking. The “experts” are historians, political scientists, and others who are familiar with the accomplishments of each of the 45+ presidents and are well qualified to rank them against the well-defined criteria. The experts are knowledgeable about each thing they are ranking and are given well-defined criteria for how they are supposed to rank them.
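As a toy illustration of that kind of aggregation (experts, categories, and ranks are all invented here, just to show the mechanics):

```python
# Toy version of a Siena-style aggregation: average each president's rank
# per category across experts, then average the category means into a
# composite. All names, categories, and ranks here are invented.
from statistics import mean

# rankings[expert][category][president] = rank (1 = best)
rankings = {
    "expert_a": {"integrity": {"Lincoln": 1, "Washington": 2},
                 "economy":   {"Lincoln": 2, "Washington": 1}},
    "expert_b": {"integrity": {"Lincoln": 1, "Washington": 2},
                 "economy":   {"Lincoln": 1, "Washington": 2}},
}
presidents = ["Lincoln", "Washington"]
categories = ["integrity", "economy"]

category_avg = {p: {c: mean(r[c][p] for r in rankings.values())
                    for c in categories} for p in presidents}
composite = {p: mean(category_avg[p].values()) for p in presidents}

print(category_avg)                           # per-category average ranks
print(sorted(presidents, key=composite.get))  # composite order, best first
```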
In contrast to that approach, USNWR sends out a survey asking college administrators to rank hundreds of colleges on a scale of 1 = “marginal” to 5 = “distinguished.” The college administrators asked to fill out the survey are by no means experts on all the hundreds of colleges they rank; they are probably well qualified to rank only a small handful of them. It’s my understanding that USNWR also does not provide a good definition of what “marginal” and “distinguished” mean, so even the administrators who are knowledgeable may not know how they are supposed to evaluate how “distinguished” or “marginal” particular colleges are.
I expect the end result is largely circular feedback. The college administrators who choose to submit their USNWR survey largely base which colleges are “marginal” and “distinguished” on the colleges’ USNWR rank, particularly for colleges with which they are not familiar. If a college is ranked higher in USNWR, it is likely to be identified as more “distinguished.” The high weighting given to the survey in the USNWR rankings then cements both high- and low-ranked colleges in their respective places.
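To make the feedback loop concrete, here’s a toy simulation I put together (my own construction, not USNWR’s actual model): each year’s survey score blends a college’s “true” quality with its previously published rank, and the new ranking just sorts by that score.

```python
# Toy model of circular feedback (my construction, not USNWR's method).
import random

random.seed(1)
N = 20
# True quality, sorted so the "correct" ranking is college 0, 1, 2, ...
quality = sorted((random.random() for _ in range(N)), reverse=True)
start = list(range(N))
random.shuffle(start)  # begin from a scrambled published ranking

def run(alpha, years=10):
    """alpha = weight respondents put on last year's published rank."""
    pub = start[:]
    for _ in range(years):
        score = {c: (1 - alpha) * quality[c] + alpha * (1 - pub.index(c) / N)
                 for c in range(N)}
        pub = sorted(range(N), key=lambda c: -score[c])
    return pub

print(run(0.0))  # respondents ignore the old rank: true order is recovered
print(run(0.7))  # respondents lean on the old rank: the scrambling persists
```

With a heavy weight on last year’s rank, only large quality gaps ever correct themselves; colleges separated by small gaps stay frozen in whatever order they started in.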
Original research is definitely a requirement; the website says “the student’s original contribution, such as a manuscript or reprint of a research publication or senior thesis, ”
“Reputation is such a silly marker.”
The first USN ranking (and maybe the second) was based entirely on reputation. Four public colleges were in the top 13: UCB, UM, Illinois, and Wisc.
However, reputation is the basis of many situations where selection between various colleges or students / alumni of various colleges occurs. For example:
- Students choosing which colleges to apply and matriculate to.
- Employers that have a preference for recruiting and hiring those from some colleges over others.
But reputational rankings do vary based on the situation (e.g. college major, region, etc.) and who is using reputation for such selection (e.g. high school seniors, employers for various kinds of jobs, and the people surveyed by USNWR).
If you pay USNWR $40, you can see lists of how colleges rank in specific subcategories, such as the ranking order for the faculty resources or financial resources categories. I don’t pay this fee, so I don’t have access to the current numbers.
Based on a thread someone posted in 2019 listing the “marginal”/“distinguished” peer assessment survey results, I’d expect the following colleges to have significant moves with the peer score removed (a sketch of the arithmetic follows the list). The overall rankings are from the year for which peer assessment is available, not the current year.
Wake Forest moves up – Not among top 50 in peer assessment, but ranked #27 overall
Berkeley moves down – 4.7 (tied for #6) in peer assessment, but ranked #22 overall
Michigan moves down – 4.5 (tied for #13) in peer assessment, but ranked #25 overall
Georgia Tech moves down – 4.3 (tied for #18) in peer assessment, but ranked #29 overall
USC moves up – 3.9 (tied for #32) in peer assessment, but ranked #22 overall
Cornell moves down – 4.6 (tied for #10) in peer assessment, but ranked #17 overall
CMU moves down – 4.3 (tied for #18) in peer assessment, but ranked #25 overall
WUSTL moves up – 4.1 (tied for #26) in peer assessment, but ranked #19 overall
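Mechanically, “removing the score” amounts to dropping that component and renormalizing the remaining weights. Here’s a sketch under assumed numbers; the roughly 20% peer-assessment weight is from memory, and the per-category scores are invented:

```python
# Sketch of dropping one weighted component and renormalizing the rest.
# The 20% peer weight and all per-category scores are assumptions for
# illustration, not actual USNWR data.
def score(components, weights, drop=None):
    """Weighted average of category scores; optionally drop one category
    and renormalize the remaining weights to sum to 1."""
    kept = {k: w for k, w in weights.items() if k != drop}
    total = sum(kept.values())
    return sum(components[k] * w / total for k, w in kept.items())

weights = {"peer": 0.20, "outcomes": 0.40, "faculty": 0.20,
           "financial": 0.10, "other": 0.10}
college_a = {"peer": 96, "outcomes": 80, "faculty": 75,
             "financial": 70, "other": 72}   # strong reputation
college_b = {"peer": 78, "outcomes": 84, "faculty": 80,
             "financial": 74, "other": 75}   # weaker reputation

for name, comp in [("A", college_a), ("B", college_b)]:
    print(name, round(score(comp, weights), 1),
          "->", round(score(comp, weights, drop="peer"), 1))
# College A edges out B with the peer score included (80.4 vs 80.1) but
# falls well behind once it is dropped (76.5 vs 80.6).
```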
I’ve posted this before regarding the peer assessment “reputation” variable in USNWR… but there are many presidents and other high-level deans/VPs who state they give those surveys to underlings to complete, including entry-level admission officers.
Any idea where to access raw reputation data? It’s not in any of the ranking views, not a filter option, and not in the additional details provided when doing a multi-school compare. I haven’t found a “download all raw data” page, though I can search more effectively when I get home than on my iPad.
I could probably slog through and pick it off school-by-school for the top 20, but certainly not 100.