All good points. Again, I'm not suggesting any particular list be used as a selection criterion, but when viewing many and seeing the same schools at the top of list after list, one starts to get a picture of the place.
When we went through this several years ago, we started by making a fit list (geography, size, major, etc.) and then looked at many ranking lists. It was amazing to see the same schools ranked highly under different criteria. To me that suggested either there was something (whatever that means) to the results, or the schools were paying hefty marketing fees to be listed as such (kidding of course - maybe).
So we distilled the various lists (about 10) and mapped the results to the fit criteria. This eliminated some great schools: S didn't want to be in urban Philly, for example (so no Penn), or out west (so no USC).
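For anyone curious, the distilling step was basically a tally plus a filter. Here's a toy sketch; the lists, school names, and fit rules below are invented for illustration, not our actual data:

```python
from collections import Counter

# Toy example: every list, school, and fit rule here is invented for illustration.
ranking_lists = [
    ["Penn", "Duke", "Vanderbilt", "USC"],
    ["Duke", "Penn", "Rice", "Vanderbilt"],
    ["Vanderbilt", "USC", "Duke", "Wake Forest"],
]

# Fit filter: e.g., no urban Philly, nothing out west.
excluded_for_fit = {"Penn", "USC"}

# Count appearances across lists, then drop schools that fail the fit criteria.
appearances = Counter(school for lst in ranking_lists for school in lst)
shortlist = {s: n for s, n in appearances.items() if s not in excluded_for_fit}

for school, count in sorted(shortlist.items(), key=lambda kv: -kv[1]):
    print(f"{school}: on {count} of {len(ranking_lists)} lists")
```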
The point is the lists gave us some basic direction. Of course S applied to a safety which appeared on virtually no lists but is still thought to be a "good school".
We're getting a bit off topic, but I offer my opinion because I think OP's list is somewhat relevant as a single data point. Add several other lists and you get an idea of the school's aura and community. Then you go from there. If a lucrative career is part of a kid's objective, I wouldn't go to one of these schools simply because they're in the top 20. However, I'd be a little suspicious if, after doing more digging, a school weren't sufficiently represented. That would make me wonder about the career focus of the student body, career recruiting, etc.
The stats don't make it happen, but why would you favor a school (outside of fit) that doesn't show up on the radar?
I'd suggest paying attention to the methodology of a ranking list and what it is measuring, rather than just which names tend to come up on a lot of lists. As I touched on earlier, the same colleges appearing on a lot of lists can simply reflect being selective enough to admit strong students. This doesn't mean a particular student is likely to have better or worse outcomes at that college. It may only mean the college admits students who are likely to have good outcomes wherever they attend.
There can also be issues with ranking-list makers adjusting weightings to make the usual names come out on top, so the list looks more accurate and they sell/earn more. For example, Washington Monthly rankings used to suffer from this problem. Their rankings used to focus on things like the difference between actual and predicted grad rate, social mobility for low-income students, and the percent giving back to the country: things that HYPSM aren't best at. Back in 2015, the top 4 under these criteria looked nothing like the usual names.
The rankings didn't match expectations, which led many to dismiss the Washington Monthly rankings as garbage. So Washington Monthly adjusted its weightings and methodology to make HYPSM come out on top. By 2018, HYPSM led the list. They still had colleges like Utah State and UCSD in the top 10, but putting HYPSM on top made the ratings look more accurate to some people.
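To see how much weighting alone can do, here's a toy example with invented scores (these are not Washington Monthly's actual inputs or weights). The same per-criterion scores produce a different #1 purely by shifting the weights:

```python
# Invented scores on three criteria: (grad-rate performance, social mobility, service).
scores = {
    "College X": (0.60, 0.95, 0.90),   # strong on mobility and service
    "HYPSM-ish": (0.95, 0.50, 0.50),   # strong on the conventional input
}

def rank(weights):
    """Order colleges by the weighted sum of their criterion scores."""
    total = lambda name: sum(w * x for w, x in zip(weights, scores[name]))
    return sorted(scores, key=total, reverse=True)

print(rank((0.2, 0.5, 0.3)))   # mobility-heavy weights -> ['College X', 'HYPSM-ish']
print(rank((0.8, 0.1, 0.1)))   # prestige-heavy weights -> ['HYPSM-ish', 'College X']
```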
Far more useful is to focus on the criteria that are important to you in a college. If you focus on specific criteria rather than general prestige or general selectivity, the best colleges by your own metric are unlikely to resemble any third-party college ranking list.
My main point is really that this list is useless for people who are deciding where to attend college based on how much they expect to make when they graduate. The reason most of these colleges have so many very wealthy alumni is that so many very wealthy people attend them in the first place.
For low-income students, the best lists are those based on the Chetty data - the ones with the highest number of low-income students who ended up middle class or wealthier. The fact that most poor kids who attend Harvard end up much wealthier is not relevant, considering that maybe 0.1% of all kids from the bottom 40% will attend an "elite" college.
Those lists are really nothing more than additions to the "bragging rights" of the very wealthy who attended those colleges, as well as for the colleges themselves. Good for them, but it isn't really relevant to anybody outside of the top 0.1% by income, and even among those, only a minority really care about this.
Deb, I'm late to this thread, so this post will likely be out of order.
Just for the heck of it, I looked up Caltech's salary for graduates. In 2017, the average was $105,500. One third of the grads went straight to graduate school, which I suspect lowers the average income.
Not surprising, given that Caltech emphasizes STEM fields.
Not certain, but my impression is that grads who go straight from undergrad to graduate school would not affect the school's average salary figure.
A school with large numbers of grads who enter low-paying fields will, however, affect the school's average. For example, Northwestern University has a substantial number of graduates who enter the low-paying fields of journalism and theater-related professions. Northwestern's average salary is affected substantially by these industries/occupations and, accordingly, only averages $100,812, which places it at #21 on the list of 25 private schools in @deb922's post #16 or #17 above.
Additionally, as a Midwestern university, Northwestern's grads who do not relocate to one of the coasts earn lower salaries due to a much lower cost of living, which is reflected in pay; yet no adjustment is made by the group that produced this study of "25 Private Colleges Whose Graduates Go on to Earn the Most Money".
If the theater and journalism graduates of Northwestern University were not included in the average salary calculation, then Northwestern would probably be among the top 10 on this list, ahead of #9 Notre Dame.
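A rough back-of-envelope shows the size of that effect. Only the $100,812 overall average comes from the article; the share and salary assumed for the journalism/theater group are my guesses for illustration:

```python
# Only the $100,812 overall average comes from the article; the 15% share and
# $55,000 average for journalism/theater grads are assumptions for illustration.
overall_avg = 100_812
low_share, low_avg = 0.15, 55_000

# Recover the average for everyone else from the weighted mean:
#   overall = low_share * low_avg + (1 - low_share) * rest_avg
rest_avg = (overall_avg - low_share * low_avg) / (1 - low_share)
print(f"Average excluding those fields: ${rest_avg:,.0f}")   # ~$108,896
```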
Caltech may not be on the list due to its small enrollment. If included, Caltech would be among the top 10.
It does, at a school like Caltech. Many of its best students, even in some of the most employable and highest-paying fields, go on to grad school. Those students would likely earn higher-than-average incomes had they gone straight into industry instead.
Note that they aren't measuring average income directly. They are instead assuming average earnings for particular job titles. For example, if Caltech has x% of grads that fall into the OEWS job grouping "physicists", then all of those x% of grads would be assigned the OEWS median salary for that grouping, which is $130k. The particular alum's tax or self-reported earnings would not be considered.
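A minimal sketch of that imputation scheme as I read it; the job-title shares and all medians except the $130k physicist figure are made up, and only the idea of assigning the national median per title grouping comes from their description:

```python
# Illustrative medians; only "physicists" at ~$130k is mentioned in the thread.
oews_median = {
    "physicists": 130_000,
    "software developers": 120_000,
    "journalists": 50_000,
}

# Hypothetical shares of one college's grads by OEWS job grouping.
title_shares = {"physicists": 0.10, "software developers": 0.30, "journalists": 0.05}
other_share = 1 - sum(title_shares.values())
other_median = 70_000   # assumed catch-all median for all remaining titles

# Each grad is imputed the national median for their title grouping, then averaged.
estimated_avg = sum(share * oews_median[t] for t, share in title_shares.items())
estimated_avg += other_share * other_median
print(f"Imputed average salary: ${estimated_avg:,.0f}")
# Note: location is ignored; a Midwest and a Silicon Valley "software developer"
# get the same national median under this scheme.
```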
It's not clear from the wording that they are limiting to new grads, or that persons attending grad school after receiving their bachelor's are excluded, so I wouldn't assume this is the case.
The analysis only included private colleges that have over 1,000 profiles in the database, so it is possible that Caltech was excluded because it did not reach the 1,000 threshold. It's also possible that Caltech was excluded because it was not among the top 25. They don't appear to publish the database online or have any sort of journal publication, just a news article, so the details are unclear.
As noted above, earnings are based on the national average for the job title. Location is not considered. A Northwestern alumnus working in the Midwest with the job title "software engineer" is assumed to earn the same salary as a Stanford alumnus working in Silicon Valley with the job title "software engineer".
Caltech has more graduate students than undergrads. If the analysis included PhDs (Caltech doesn't offer standalone master's degrees), Caltech would be more likely to have met the threshold.
It's clear from the wording that they are looking at alumni of the bachelor's-degree college, but it's not clear that they excluded the many Caltech kids who pursued graduate degrees after obtaining their bachelor's - for example, kids who did a BS at Caltech followed by an MS at MIT.