<p>“rjk, I believe this is the first year that the USNWR used incorrect data for Michigan. The correction could actually give Michigan a boost in the USNWR ranking, but to #27.”</p>
<p>I know it’s silly, but I would take the correction and have USNWR report it. Unfortunately, whether we like it or not, many students and their parents take these rankings very seriously. That Michigan is slowly dropping in the rankings while lesser schools are rising could eventually affect the quality of its applicant pool. So far that doesn’t appear to be happening, but that is not to say it won’t.</p>
<p>There are clearly basic flaws with the US News financial resources and faculty resources metrics. But even playing by its rules, there are still big questions about how it computes its faculty resources ranks.</p>
<p>Faculty resources are based on six factors:</p>
<ol>
<li>% of classes with fewer than 20 students (30%)</li>
<li>% of classes with more than 50 students (10%)</li>
<li>Student/faculty ratio (5%)</li>
<li>% of faculty with the highest degree in their field (15%)</li>
<li>% of faculty who are full time (5%)</li>
<li>Faculty salary, adjusted for region and cost-of-living indexes (35%)</li>
</ol>
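<p>For concreteness, the weighting above can be expressed as a small sketch. The sub-scores here are hypothetical 0-100 values; US News does not publish its normalization, so this only illustrates how the stated weights combine:</p>

```python
# Faculty resources weights as listed above (illustrative only).
WEIGHTS = {
    "pct_classes_under_20": 0.30,
    "pct_classes_over_50": 0.10,     # fewer large classes scores higher
    "student_faculty_ratio": 0.05,   # lower ratio scores higher
    "pct_faculty_highest_degree": 0.15,
    "pct_faculty_full_time": 0.05,
    "faculty_salary": 0.35,          # adjusted for region/cost of living
}

def faculty_resources_score(subscores):
    """Weighted sum of normalized (0-100) sub-scores."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)
```

<p>Note the weights total 100%, so a school with a perfect 100 on every sub-score would score 100 overall.</p>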
<p>Despite having a low student/faculty ratio (15:1), Michigan is ranked #68 in faculty resources, behind eight public universities: UC-Santa Barbara (#28), UCB (#32), UCLA (#32), UVa (#32), UC-Irvine (#38), UC-Davis (#55), UIUC (#55) and Purdue (#61).</p>
<p>Michigan matches UCLA closely in all six categories, yet it is 36 places behind UCLA:</p>
<ol>
<li>%Classes under 20: UCLA (50%) beats Michigan (47%)</li>
<li>%Classes over 50: Michigan (18%) beats UCLA (22%)</li>
<li>Student/faculty ratio: Michigan (15:1) beats UCLA (17:1)</li>
<li>%Faculty with highest degree: UCLA (98%) beats Michigan (90%)</li>
<li>%Faculty full time: Michigan (93%) beats UCLA (90%)</li>
<li>Faculty salary: UCLA 2-9% higher than Michigan (NOT adjusting for region and cost of living)</li>
</ol>
<p>What’s wrong with this picture?</p>
<p>p.s. UNC would have more to complain about. It’s also ranked #68, despite a 13:1 student/faculty ratio.</p>
<p>“Where do you think Michigan should be ranked on USNWR?”</p>
<p>Not rjk, but if the USNWR audited data for accuracy and relevance, and actually rated universities properly for each of the criteria of its methodology, I think Michigan would be ranked between #10 and #17. I think UVA would be ranked in the top 20 too. UNC and UCLA would both be ranked in the top 25. Michigan and UCLA have a lot in common, but Michigan’s endowment gives it the edge. </p>
<p>I agree rjk. With the exception of Cal, UDub, UVA and UNC, I think most coastal public universities are rated fairly. Cal should be rated higher (definitely among the top 10), and UNC and UVa should both be rated slightly higher. UDub should be ranked among the top 30, but it barely cracks the top 50.</p>
<p>UCLA seems fairly rated in or around #20. UCSD is right where it should be, as is Georgia Tech. William and Mary is harder to rate.</p>
<p>Non-coastal publics are not ranked as fairly. I think Michigan should be ranked among the top 20. Texas-Austin, UIUC and Wisconsin-Madison should all be ranked among the top 30, and they are not even ranked among the top 40. Penn State, Indiana and Minnesota should all be ranked higher too. Ranking the University of Miami and Northeastern University in the same league as Wisconsin or Texas is truly misleading.</p>
<p>“p.s. UNC would have more to complain about. It’s also ranked #68, despite a 13:1 student/faculty ratio.”</p>
<p>Malcolm Gladwell has an interesting take on class size in his book David and Goliath: his argument is that no subject has been studied so deeply with so little usable output. The automatic assumption is that fewer students is better, but he makes a convincing case for an “inverted-U” curve: the resources brought to the task in the teaching domain are somewhat offset by the interaction of the pupils. The further argument is that too few students can lead to a lot of silence and ego protection, while too many is chaos. The supposition is that 20-40 is the right number.</p>
<p>Given the leading comment that this is a widely studied phenomenon that yields little usable output, using a low ratio as a metric is akin to rattling gourds or casting dice to find the answer…yet further evidence that USN&WR does the consuming public few favors when it uses a metric which is just as arbitrary as its other metrics. In short, there is no reason to think there is a magic level for the ratio, and lower is not necessarily better.</p>
<p>One poster noted above that the slide in the UM ranking is troubling because of its likely influence on the credulous, and I agree…if people believe in fairy tales and/or magic, you have to deliver both. That is the bad news. The good news is that parents seem to be smart enough to “look through the numbers”: 1) applications have gone up even as the ranking has moved from 24 to 29 (or whatever); 2) the quality of each entering class is higher than the class before, as measured by GPA (somewhat stable to up) and average ACT (which seems to bump up about 1 point every four-year cycle).</p>
<p>So while I’d like to see changes in the USN&WR ranking for cosmetic reasons, there is no small irony that in the case of Michigan it has achieved the inverse of expectations: rather than acting as a bellwether it has provided an inverse signal…UM continues to improve and prosper even with the slippage in the ranking. Given this inverse effect, if for no other reason, one hopes that it continues to undermine the publication’s reputation.</p>
<p>The best strategy would be for a random selection of 25 of the top 50 schools each year to refuse to submit data. USN&WR would move all of those rankings into the cellar, as they did with, I believe, Reed. The next year, another randomly selected cohort would follow the same process. If the top 50 schools did this, the volatility induced in the rankings would reduce them to complete chaos/noise and terminally damage their validity. This would take 3-5 cycles, and the schools would have to be disciplined and cohesive, but it would be a good project for the AAU to break the USN&WR stranglehold on a job it does so poorly.</p>
<p>One of the most amusing aspects of the USN&WR process: they set up a calculated measure to predict things like how many students should graduate. Schools are then rewarded or punished if they exceed or fall short of the model prediction. In my world, that means they are, effectively, assigning their regression error to the schools. So, rather than saying, “our regression sucks, and schools can’t reasonably be predicted to achieve a certain result,” they say, “we have a perfect model, and since there is no noise, we will convert that noise into a ranking bump up or down.” They have turned their calculational incompetence into part of their product. Clearly, it is great for the schools which outperform (though the causation of that outperformance is, perforce, a mystery), but what about the schools which are penalized? Their reputation now hangs on the ability of USN&WR to produce a good model.</p>
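<p>To make the criticism concrete, here is a toy sketch of the “residual as performance” logic. The linear model and every coefficient in it are made up for illustration; US News does not publish its regression:</p>

```python
# Hypothetical graduation-rate model (invented coefficients, for illustration).
def predicted_grad_rate(avg_test_score, spending_per_student):
    return 0.05 * avg_test_score + 0.25 * (spending_per_student / 1_000)

actual = 90.0
predicted = predicted_grad_rate(avg_test_score=1400, spending_per_student=60_000)

# The residual could be model error, noise, or genuine quality --
# but the ranking treats it as pure "performance" either way.
residual = actual - predicted
print(f"predicted {predicted:.1f}%, actual {actual:.1f}%, 'performance' {residual:+.1f}")
```

<p>The point of the sketch: the "performance" bump is exactly the part of the outcome the model failed to explain.</p>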
<p>A second amusing part of the process: the use of 4 year graduation rates. If a student is poor, they may take longer to graduate. It is easy to imagine a kid with a 4.0 average taking 6 years or more to graduate if he can’t afford to compress the credits into a smaller time frame. Likewise, state institutions which have had funding cut may still offer a high quality product but not offer as many sections of required courses each year. Again, no matter how good the school, under such a regime kids will take longer to graduate, and that fact will say nothing about either the student or the school. In sum, this is almost purely a measure of wealth and good money management: wealthier schools will offer more scholarships and open more sections. Clearly, good schools produce successful graduates and are also well managed on the money side, but rewarding such schools with academic accolades is a bit intellectually perverse.</p>
<p>“A second amusing part of the process: the use of 4 year graduation rates.”</p>
<p>Also, universities with large engineering colleges/departments usually fare worse in four year graduation rates than those that either don’t have them or whose engineering programs are relatively small. Engineering degrees typically take a bit more than four years to complete.</p>
<p>If you look at your data, UCLA beats Michigan in the categories that are weighted significantly higher.</p>
<p>UCLA beats Michigan in the first (30%), fourth (15%), and sixth (35%) categories whereas Michigan beats UCLA in the second (10%) third (5%) and fifth (5%) categories. Given how the former categories are weighted significantly higher, that might explain the differences between the two universities’ placements.</p>
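<p>Tallying the weights using the head-to-head results posted earlier makes the gap plain (a sketch of the arithmetic only, not US News’s actual computation):</p>

```python
# Category weights 1-6, as listed earlier in the thread.
weights = [0.30, 0.10, 0.05, 0.15, 0.05, 0.35]
ucla_wins = [1, 4, 6]      # %classes<20, %highest degree, salary
michigan_wins = [2, 3, 5]  # %classes>50, s/f ratio, %full time

ucla_weight = sum(weights[i - 1] for i in ucla_wins)
michigan_weight = sum(weights[i - 1] for i in michigan_wins)
print(ucla_weight, michigan_weight)  # UCLA wins categories worth 80% of the total
```

<p>So even though each school wins three of the six categories, UCLA's wins carry four times the weight.</p>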
<p>How much of an edge is that really though? I thought I remember someone (perhaps Phantasmagoric) saying that although Harvard has an endowment of more than 30 billion, most of that money is restricted, so Harvard can’t use it anyway. Is that similarly the case with Michigan’s endowment?</p>
<p>As a side note, does anyone know how much money Michigan raises annually from OOS / international tuition? </p>
<p>I think UCLA raised an average of about $110m per year from OOS tuition, and I believe this number is increasing.</p>
<p>1) “How much of an edge is that really though? I thought I remember someone (perhaps Phantasmagoric) saying that although Harvard has an endowment of more than 30 billion, most of that money is restricted, so Harvard can’t use it anyway. Is that similarly the case with Michigan’s endowment?”</p>
<p>Almost all endowments are derived from donations that are restricted as to their possible domain of expenditure. This is just as true at UCLA and Michigan as it is at Harvard. All universities must deal with restrictions. However, there is a substitution principle: money becomes fungible in that spending endowment funds on activity A frees the money that would otherwise have been spent on activity A to be budgeted for activity B. So universities are constrained by donors, but as a practical matter there is a degree of fungibility.</p>
<p>2) “As a side note, does anyone know how much money Michigan raises annually from OOS / international tuition?” </p>
<p>In-state tuition is around $13,000 and out-of-state is around $43,000, a margin of $30,000 per student. With roughly 28,000 students, about 42% of them OOS/international, that gives $30,000 × 28,000 × 0.42 ≈ $353,000,000 harvested from OOS/Int students above what in-state students would have paid. But that is extremely misleading in that even OOS/Int students don’t pay the full freight: both groups are still getting the benefit of somewhat higher spending. And the margin earned from OOS/Int isn’t really margin. The proper view is that the students paying $13,000 are receiving a huge subsidy from UM…$30,000 per student…and the OOS/Int margin helps to buffer that loss.</p>
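<p>Spelled out, the back-of-the-envelope arithmetic looks like this (all inputs are the rough figures quoted in the post, not official numbers):</p>

```python
# Rough Michigan tuition-margin estimate, using the poster's figures.
oos_tuition = 43_000
in_state_tuition = 13_000
enrollment = 28_000   # approximate enrollment (poster's figure)
oos_fraction = 0.42   # approximate OOS/international share (poster's figure)

margin_per_student = oos_tuition - in_state_tuition  # $30,000
total_margin = margin_per_student * enrollment * oos_fraction
print(f"${total_margin:,.0f}")  # ≈ $353M
```
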
<p>3) “I think UCLA raised an average of about $110m per year from OOS tuition, and I believe this number is increasing.” </p>
<p>For the reasons stated above, this is probably not accurate. UCLA is probably harvesting some money, but probably less than that figure, and whatever it harvests is probably more than fully offset by the subsidy it gives to in-state students.
For all of this stuff, there is a certain money illusion: these entities are not-for-profit, so they are constantly shuttling money between pockets to fund their budget. The actual accounting is somewhat opaque once you have mixed in money from research (including cost recovery) and grants.</p>
<p>No, UCLA did not beat Michigan in faculty salaries. US News adjusts for region and cost-of-living differences, whereas the numbers I reported did not. Are you sure LA’s cost of living is not 2-9% higher than Ann Arbor’s? How much more does housing cost in LA compared to AA? How about transportation (e.g., parking, gas, insurance, etc.)? I’d say Michigan faculty salaries are even or slightly higher when adjusted.</p>
<p>Thus, in reality, UCLA beats Michigan by 3% in #1 (30% wt), and loses by 4% in #2 (10% wt) and by 15:1 to 17:1 in #3 (5% wt). And that translates to 36 places in the ranking?</p>
<p>USNWR needs to add another category to its rankings: private universities that accept state funding. Any public university that enrolls more than 30% of its students from OOS and international students should be included in this new category. Any public U that enrolls more than 50% of its students from OOS should simply be considered a private university.</p>
<p>UMich had an $8.4 billion endowment last year ($9.47 billion as of March this year), and I believe I read somewhere that they spend around 5% per year, which is pretty much covered by the investment income from the endowment. Also, if behphy’s number for UCLA OOS tuition income is correct, UMich is making three times that amount from the difference between in-state and OOS tuition. And yes, that number is for sure going up, as the rate of increase for OOS tuition is again higher than that for in-state.</p>
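<p>The rough arithmetic behind those claims (all figures are posters’ estimates from this thread, not audited numbers):</p>

```python
# Endowment payout vs. OOS tuition margin, using the thread's estimates.
endowment = 9.47e9        # as of March, per the post above
payout_rate = 0.05        # claimed annual spending rate
annual_payout = endowment * payout_rate   # ≈ $473.5M

ucla_oos_income = 110e6                   # figure quoted earlier in the thread
umich_oos_income = 3 * ucla_oos_income    # the "3 times" claim: ≈ $330M
print(f"payout ≈ ${annual_payout/1e6:.0f}M, OOS margin ≈ ${umich_oos_income/1e6:.0f}M")
```

<p>The ≈$330M figure is broadly consistent with the ≈$353M tuition-margin estimate posted earlier in the thread.</p>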
<p>“Alexandre, as I mentioned before, endowment is not a factor in the USNWR rankings…annual spending per student is.”</p>
<p>I get that UCB. That is why in post #104 of this thread, I said “if the USNWR audited data for accuracy and relevance, and actually rated universities properly for each of the criteria of its methodology, I think Michigan would be ranked between #10 and #17.” </p>
<p>The USNWR’s attempt at seeming scientific in its rankings is laughable. The financial section has seen universities leap 10, 20, even 30 spots in a single year simply by tweaking their accounting figures. The only figures the USNWR should look at to determine financial resources rankings are measures of institutional wealth:</p>
<ol>
<li>Revenues from Endowment</li>
<li>Revenues raised through tuition</li>
<li>Revenues raised through donations</li>
<li>Revenues from state appropriations </li>
</ol>
<p>Naturally, the size of the university (including graduate programs) and the cost of operations should be factored in. Tiny colleges with no Engineering or Medical programs obviously require far less revenue than large research universities that have them.</p>
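<p>One way to sketch that size adjustment is a simple revenue-per-student figure across the four streams listed above. The amounts and enrollment below are placeholders, not data for any real school:</p>

```python
# Per-student institutional wealth from the four revenue streams (placeholder data).
def revenue_per_student(endowment_income, tuition, donations,
                        state_appropriations, total_enrollment):
    total = endowment_income + tuition + donations + state_appropriations
    return total / total_enrollment

example = revenue_per_student(
    endowment_income=470e6, tuition=1.2e9, donations=300e6,
    state_appropriations=320e6, total_enrollment=44_000)
print(f"${example:,.0f} per student")
```

<p>Dividing by enrollment is the crudest possible normalization; weighting graduate, medical, and engineering enrollment differently would be a refinement.</p>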
<p>As with the measurement of each criterion, it is impossible to rank universities precisely. I think assigning a rating is far more pragmatic.</p>