Are public universities hurt or helped by USNWR methodology?

<p>barrons,
Certainly a very impressive number of awards for that professor (although it didn’t appear as if any of these were for her work in the classroom). How many classes did she teach last semester? How many undergraduate students are in each of her classes? How many letters of recommendation did she write last year to grad schools? Also, why should her work and awards matter to someone who majors in history? </p>

<p>jags861,
I’m not sure why you think that giving voice to those other groups (students, alumni, employers) would turn the Faculty Assessment into “bunk.” If anything, their added comments would either reconfirm the opinions of academics or open up the discussion in a new direction. Either way, it makes the faculty more accountable to students, and I see that as a VERY good thing. Again, I think schools exist for the students, not for the faculty. </p>

<p>I think your use of the Harvard/Hawaii example subverts your argument. Let’s make this a little more real by comparing U Maryland and U Miami. Both are ranked #54 by USNWR. U Maryland’s PA is 3.7 and U Miami’s is 3.2. On its face, this may not be a big deal, but a move up by U Miami or down by U Maryland in its PA score could very easily move its ranking a few slots. I don’t know, but there may be some cross-admit battles going on. Is there an incentive for either school to mark the other up or down in the PA survey? I would hope that the vast majority of academics would give a truthful (and accurate) response, but I’m not so naïve.</p>

<p>
[quote]
From an employer's perspective (and particularly for an employer outside of the frequently referenced investment banking/management consulting fields), the institution is FAR less important than the student.

[/quote]
</p>

<p>So where is the harm? If employers (rightly) believe that other schools are high-quality, and believe that those schools attract (and graduate) high-quality students, then employers are welcome to recruit there! They can and do. </p>

<p>
[quote]
As for the bones that I pick with USNWR and its methodology, a few items are mostly responsible for the perpetuation of the rankings order...Secondarily are the weights accorded items like alumni giving, 6-year graduation rates and financial resources (which some interpret to mean endowment size).

[/quote]
</p>

<p>It seems to me that in this thread you are holding institutions (or, at least, this "educational elite" you name) accountable for something that is the doing of USNews. You also appear to be holding them accountable for the fact that some people don't read the paragraphs on methodology. </p>

<p>I can't speak for the 'educational elite' (whoever they are) but I know that many people in higher education are in favor of the rankings being more transparent. Yes, that's right, they'd prefer more, rather than less, transparency. From all that I have seen, the relationship between higher ed and USNews borders on adversarial, rather than conspiratorial.</p>

<p>hoedown,
The major (and almost only) USNWR number that is not transparent is the Peer Assessment. The educational status quo and many state universities (including you and U Michigan) comparatively benefit from this number. Is this an accurate number and is it reflective of colleges today and would students, alumni, and employers agree with its conclusions? Possibly, but it is certainly in the interests of the status quo to perpetuate this number (even while making noises that they dislike it).</p>

<p>
[quote]
The educational status quo and many state universities (including you and U Michigan) comparatively benefit from this number.

[/quote]
</p>

<p>You've made this statement many times and in other threads, but I think it's the first time you've told me I personally benefit from it! If only I did.</p>

<p>
[quote]
Is this an accurate number and is it reflective of colleges today and would students, alumni, and employers agree with its conclusions? Possibly, but it is certainly in the interests of the status quo to perpetuate this number (even while making noises that they dislike it).

[/quote]
</p>

<p>You can continue to lay blame at the feet of this mysterious, amorphous guild of the educational elite, but I still don't understand the evidence or rationale that they are behind it, or at fault for its presence in USNews' ranking scheme. </p>

<p>Maybe the reason USNews uses a "reputation" measure is that it, like measures of selectivity, has so much face validity with the people who fork over $9.95 to own USNews' ranking. Whether or not you or I or the "educational elite" feel that it is a truly valid measure is a separate issue.</p>

<p>The argument I am hearing is: "Peer Assessment is a bad measure. Peer Assessment benefits the ranking of public flagships and elite colleges. Thus, public flagships and elite colleges must be responsible for its inclusion by US News." Do I have this wrong?</p>

<p>
[quote]
The argument I am hearing is: "Peer Assessment is a bad measure. Peer Assessment benefits the ranking of public flagships and elite colleges. Thus, public flagships and elite colleges must be responsible for its inclusion by US News." Do I have this wrong?

[/quote]

I too am questioning the validity (and the purpose) of the PA, but I would not even dare to draw the conclusion that "public flagships and elite colleges must be responsible for its inclusion".</p>

<p>Hawk--the Dreyfus award stresses teaching and working with undergrads.</p>

<p><a href="http://www.dreyfus.org/th.shtml%5B/url%5D"&gt;http://www.dreyfus.org/th.shtml&lt;/a&gt;&lt;/p>

<p>This spring she taught a UG seminar in Biochem and Chemical Biology, which is an advanced class for UG and grad students. I have no idea about letters, etc. Why would it matter to a history major? That has to be a joke comment. Why would a seminar in French Labor History matter to a biochem major? It's a big school with 175 majors. The history dept has its own stars. Several from my day have had their lectures made available for future students.</p>

<p><a href="http://history.wisc.edu/goldberg/goldberg.htm%5B/url%5D"&gt;http://history.wisc.edu/goldberg/goldberg.htm&lt;/a&gt;&lt;/p>

<p><a href="http://mosseprogram.wisc.edu/index.html%5B/url%5D"&gt;http://mosseprogram.wisc.edu/index.html&lt;/a&gt;&lt;/p>

<p><a href="http://www.creeca.wisc.edu/petrovich/index.html%5B/url%5D"&gt;http://www.creeca.wisc.edu/petrovich/index.html&lt;/a&gt;&lt;/p>

<p>The Biochem seminar had 15 UG students. Chem Bio had 24, of which 3 were UGs.</p>

<p>hoedown,
You ask how I conclude that state universities benefit from the PA scoring. </p>

<p>The following numbers reflect my thinking, as I see a disconnect between the PAs of state universities and other schools that are ranked close to them. Is the quality of the teaching at these institutions as different as the PA might suggest? Do the undergraduate students really experience such a difference in faculty quality as these numbers would suggest? I don’t think so, nor do I think there are large differences in the final product (the graduates) that comes out of these schools. </p>

<p>Consider the following groups of similarly ranked schools and their PA scores:</p>

<p>#20 Notre Dame (3.9)
#21 UC Berkeley (4.7)
#22 Carnegie Mellon (4.2)
#22 Georgetown (4.1)
#24 U Michigan (4.5)
#24 U Virginia (4.3)
#26 UCLA (4.3)
#27 USC (3.9)
#27 Tufts (3.9)
#27 U North Carolina (4.2)
#30 Wake Forest (3.5)</p>

<p>Or even more dramatic is the following:</p>

<p>#31 Brandeis (3.6)
#33 Lehigh (3.2)
#34 Boston College (3.6)
#34 NYU (3.8)
#34 U Rochester (3.4)
#34 U Wisconsin (4.2)</p>

<p>In every group, the state university is helped by its PA score (sometimes by a lot), which boosts its comparative rank. Ex-PA, all of the state universities (except for U Virginia and non-research-oriented W&M, which gets no respect and has a PA of only 3.8) would fall considerably in rank. The biggest loser of all might be your U Michigan, which would fall 10 ranking spots to #34 on an ex-PA list. I’m delighted that I am able to be the one to inform you that you benefit from this. :)</p>

<p>This PA effect continues on down the rankings and, based on this, it is pretty clear that State Universities get a lot of benefit from the high absolute level of their PA scores and the high 25% weight that USNWR allots to this measure. </p>

<p>Re your characterization of my thoughts on PA, let me try to state it more clearly and accurately: “PA is a bad measure. We don’t know what it measures, who is doing the measuring, how they are measuring, or what their individual conclusions are. It is a completely non-transparent number.” STOP. That’s my argument although I also conclude that state universities are large beneficiaries of the current PA system as performed by USNWR.</p>
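<p>To make the arithmetic behind an ex-PA list concrete, here is a minimal sketch in Python. The composite scores below are invented for illustration (USNWR publishes ranks and weights, not its full underlying data), and the sketch assumes PA enters the composite linearly at its stated 25% weight, which is a simplification of whatever USNWR actually does.</p>

[code]
# Minimal "ex-PA" re-ranking sketch. Hypothetical data: USNWR does not
# publish the composite scores used here, so the numbers are invented
# to mirror the published rank order of the #31-#34 group above.

PA_WEIGHT = 0.25  # USNWR's stated weight for Peer Assessment

# (school, invented composite score on a 0-100 scale, published PA on 1-5)
schools = [
    ("Brandeis",       66, 3.6),
    ("Lehigh",         65, 3.2),
    ("Boston College", 64, 3.6),
    ("NYU",            64, 3.8),
    ("U Rochester",    64, 3.4),
    ("U Wisconsin",    64, 4.2),
]

def ex_pa_score(composite, pa):
    """Strip out the PA contribution and renormalize the remaining 75%."""
    pa_points = PA_WEIGHT * (pa / 5.0) * 100  # PA's share of the composite
    return (composite - pa_points) / (1 - PA_WEIGHT)

ranked = sorted(schools, key=lambda s: ex_pa_score(s[1], s[2]), reverse=True)
for rank, (name, comp, pa) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: ex-PA score {ex_pa_score(comp, pa):.1f}")
[/code]

<p>On these made-up numbers, stripping out PA drops U Wisconsin from the top of its group to the bottom, which is the shape of the effect being claimed; the real magnitudes depend on data we don't have.</p>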

<p>
[quote]
hoedown,
You ask how I conclude that state universities benefit from the PA scoring.

[/quote]
</p>

<p>No, I didn't ask. You've covered this extensively in several threads. I don't have any questions about why you believe it helps public institutions. Nor is it news to me that it helps the University of Michigan. I was just grinning at your assertion that it not only helps U-M, but me, personally, as well (see post #103). I wish!</p>

<p>You could have saved a lot of typing by simply pointing out the difference between the 1987 ranking for Michigan and the 2007 ranking for Michigan. If you think I've argued that U-M doesn't have a good peer ranking, or that U-M doesn't benefit from it when it is included, you've confused me with another poster.</p>

<p>
[quote]
"PA is a bad measure. We don’t know what it measures, who is doing the measuring, how they are measuring, or what their individual conclusions are. It is a completely non-transparent number.” STOP.

[/quote]
</p>

<p>I think you've suggested much more than this.</p>

<p>Yes the other schools are lucky to be ranked anywhere near Wisconsin.</p>

<p>barrons,
Thanks for doing the research. She sounds like a great professor.</p>

<p>Re the history major comment, this relates back to jags861, who stated that he/she is a history major. Part of my frustration is that while professors like Kiessling are probably good to have around for students in the Chemistry/Biology program, a student majoring in history gets no benefit from this. Yet the entire school’s reputation is likely heavily impacted by research successes in fields that have nothing to do with history (although in truth we’ll never know, because we have no idea who gave U Wisconsin its strong PA score or what they were basing it on). So any student trying to decide between U Wisconsin (4.2 PA) and U Rochester (3.4 PA) who might be interested in history would get mistaken guidance from a measure like PA. </p>

<p>As for your links to the dead or long-retired history professors, one could ask why even bother going to Madison to hear/read these lectures. With technology what it is today, a student would have the same access in NYC, Des Moines, Lyon, Lima, Perth, Bangalore, etc.</p>

<p>Re your ranking comment directly above, I like your sense of humor. I suspect that students, graduates, and others affiliated with those other schools might interpret the result a little differently. :) :)</p>

<p>
[quote]
Yes the other schools are lucky to be ranked anywhere near Wisconsin.

[/quote]
</p>

<p>I consider us lucky to even share a portion of the country with you. Thank you. ;)</p>

<p>What I was showing was that History at UW has a LONG tradition of excellence going back to Frederick Jackson Turner--one of the first modern historians. Today's department is just as good but I don't know the names.</p>

<p><a href="http://history.wisc.edu/alumniandfriends/newsletters/history_newsletter2006.pdf%5B/url%5D"&gt;http://history.wisc.edu/alumniandfriends/newsletters/history_newsletter2006.pdf&lt;/a&gt;&lt;/p>

<p><a href="http://en.wikipedia.org/wiki/Frederick_Jackson_Turner%5B/url%5D"&gt;http://en.wikipedia.org/wiki/Frederick_Jackson_Turner&lt;/a&gt;&lt;/p>

<p>barrons,</p>

<p>I don't think that these types of ratings are bunk; I believe that compiling them somehow, and then sticking them into an already complex ranking, would be bunk.</p>

<p>If people like hawkette think that PA is not reliable because people can't be knowledgeable about every school, then the same can be said of every poll. Now, the reason I believe PA is the best is that university presidents are likely to know significantly more schools than employers do.</p>

<p>How can employers give an accurate representation of schools? Which corporations would you use? Top corporations only know of the top schools. I doubt very much that Blackstone is recruiting from The University of Tulsa.</p>

<p>How can you do a faculty assessment that doesn't involve the awards the faculty has received, or the faculty's credentials? But as hawkette has stated, it's not the awards a faculty has received (most are research oriented), it's the quality of the professor in the classroom. Unless a significant group of people personally knows a significant portion of a faculty, how can some outside source rate the quality of their in-class teaching ability? Aside from that, most students or alumni only experienced their specific school, and therefore have nothing to compare it to.</p>

<p>Even grad school admissions--take the WSJ feeder poll--have you actually looked at how that's computed? It's the number of students going to the top 5 programs... and it's all like 2%, 2.1%, 2.15%, etc. I don't know how you can judge a school based on what 2% or 3% of the class does. When you have to split the ranking of schools based on whether 2.1% or 2.15% of students went somewhere, that's a problem, imo.</p>

<p>Now if you were to come up with all these types of ratings, and then add them to the already complex formula that USNews uses, I think you would find that it would indeed be "bunk."</p>

<p>hawkette,</p>

<p>While most of what I said above applies to you as well, I believe that my Hawaii/Harvard example is fine. What do the faculty at Harvard know about the faculty at Hawaii? What do employers in New York know about Hawaii? What do employers anywhere really know about the University of Hawaii, except probably those in Hawaii and California?</p>

<p>That said, your argument about Maryland and Miami competing against each other... I think that's ridiculous. The University of Miami has a 3.2 because nearly all schools gave it a rating of 3 and a few gave it a 4. Why would Maryland be vindictive and say, "I'm going to agree with the majority of other schools and say Miami is a 3, not a 4"? Also, once again, while you bring up that PA has helped many public schools in the rankings, I could very easily say that including alumni giving or graduation/retention rates hurts public schools--two measures subjectively chosen for inclusion in the ranking. I've already given my opinion on why graduation and retention rankings are unimportant. Alumni giving rates also hurt public schools... by virtue of paying taxes you're giving... but that doesn't get included, does it?</p>

<p>jags861,
I'm with you on the Alumni Giving number and to a lesser extent on the graduation numbers. Personally, I think 4-year graduation rates would be a useful addition to the analysis, but I think that the weighting given to the Graduation/Retention Rank is too high at 20% (4% freshman retention and 16% 6-year grad rate). </p>

<p>Re employers, I think that they know the quality of the schools in their region and, to a lesser degree, the quality of SOME schools around the country. I agree that getting a quality response from them would be difficult, but probably no less so than trying to get academics from over 1300 schools to reply (USNWR now has about a 50% response rate). </p>

<p>IMO, the perspective of employers is actually where the rubber really does hit the road. They have to get it right in their hiring decisions or their businesses will suffer. They know where the smart kids are and which schools do a better job of preparing the students and they also know the reputations of schools for producing team players or prima donnas. </p>

<p>Thinking completely off the top of my head, why not ask employers in a variety of geographic locales how they view a variety of schools? I am pretty confident that their responses would show a local bias. The effect of this would be good, as it would finally make the point that neither Suzy nor Johnny has to go to the Ivy League or some other “elite” school in order to get a good job, and that there are plenty of good schools that can prepare students for postgraduate life. </p>

<p>So start with the big cities all over the country and ask the five or ten or twenty or fifty biggest employers how they would rank the schools at which they recruit. Try this in Boston, Hartford, NYC, Philly, Baltimore, Washington, Richmond, Raleigh Durham, Charlotte, Atlanta, Orlando, Tampa, Miami, Nashville, New Orleans, Memphis, Little Rock, Pittsburgh, Buffalo, Syracuse, Rochester, Cleveland, Columbus, Cincinnati, Louisville, Detroit, Indianapolis, Chicago, Milwaukee, Minneapolis, St. Louis, Kansas City, Tulsa, Omaha, Oklahoma City, Dallas, Houston, San Antonio, Denver, Phoenix, Las Vegas, Salt Lake City, Seattle, Portland, Sacramento, San Francisco, San Jose, Oakland, Los Angeles, San Diego and other cities I am sure I have missed. Heck, maybe even add in an international component if you like. </p>

<p>Do this kind of survey, report the results to the public (maybe broken out by city and/or region), and the madness that now surrounds this college admissions process will subside quickly. People will get the message that employers recognize that there are a LOT of great schools around the country and that you don’t need to attend the USNWR Top 20 in order to be considered talented and more than capable of performing in a job. </p>

<p>Of course, this will probably never happen, for a lot of reasons; for one, magazine sales could be negatively impacted by a result showing that the frenzy over college admissions and college rankings is mostly not worth the trouble. But I do think that the results of an employer survey would be a heckuva lot more accurate and relevant than the current PA, and would tell students a lot more about how the real world views the schools they are considering or attending.</p>
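<p>For what it's worth, tabulating such a survey would be trivial. Here is a sketch (all cities, schools, and ratings are invented) of how responses could be rolled up by city and nationally; a local bias would show up as cities rating nearby schools higher than the national average does.</p>

[code]
# Sketch of tabulating a hypothetical employer survey by city and
# nationally. Every city, school, and rating below is invented.

from collections import defaultdict
from statistics import mean

# (city, school, rating on a 1-5 scale)
responses = [
    ("Chicago", "U Wisconsin", 4), ("Chicago", "U Rochester", 3),
    ("Boston",  "U Rochester", 4), ("Boston",  "U Wisconsin", 3),
    ("Atlanta", "U Wisconsin", 3), ("Atlanta", "U Rochester", 3),
]

by_city = defaultdict(lambda: defaultdict(list))
national = defaultdict(list)
for city, school, rating in responses:
    by_city[city][school].append(rating)
    national[school].append(rating)

for city, schools in sorted(by_city.items()):
    for school, ratings in sorted(schools.items()):
        print(f"{city:8s} {school:12s} {mean(ratings):.2f}")
for school, ratings in sorted(national.items()):
    print(f"National {school:12s} {mean(ratings):.2f}")
[/code]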

<p>
[quote]
and the madness that now surrounds this college admissions process will subside quickly..... People will get the message that employers recognize that there are a LOT of great schools around the country

[/quote]
</p>

<p>But you know, although I do fret over the angst we see on College Confidential, I think we're getting a narrow view of college admissions. So many students--millions of students--don't really give a hoot about the Top 20. They don't apply to them or attend them. For whatever reason, they've already accepted that you don't need to go to a top school to succeed in life.</p>

<p>
[quote]
Very true, but comparing an ED school to an EA school is like comparing an apple to an orange. Take Princeton, for example. Since the people who applied ED (the people for whom Princeton is their first choice by far) are barred from applying anywhere else RD (unlike those at other [EA] schools), there is no way they can be cross-admits, so they are not included in the revealed preference ranking. Many of HYSM's EA admits, however, do apply RD to other schools (maybe for financial reasons, maybe just out of curiosity) but end up going with their original first choice in the end. By doing that they boost their school's standing in the "battle for cross admits." That is why Princeton, and Brown for that matter, are at a disadvantage on your chart.

[/quote]
</p>

<p>Wrong. Absolutely wrong. And this applies not just to you but to a number of people who have commented on the RP study.</p>

<p>The RP study is NOT a study about cross-admits. I repeat, it is NOT a study about cross-admits. Rather, it is a MODEL that assesses what people WOULD have preferred if given a hypothetical choice. We can quibble about just how valid the model is, but I would say that the paper uses modeling techniques that are widely accepted within academia and that similar techniques have been used to model a wide range of customer preferences throughout various industries. </p>

<p>In other words, the paper is NOT commenting on cross-admits. If it were, it would not be an interesting paper in the least, as all it would be doing is reporting on statistics. Anybody can do that, given the data, hence the paper would not be interesting. What makes the paper interesting and novel from an academic standpoint is that it attempts to model customer behavior PRECISELY where the data happens to be missing in order to INFER what the missing data PROBABLY is, using known and accepted social science techniques. THAT is the value-add of the paper. </p>

<p>Hence, the difficulties of comparing ED vs. EA vs. RD schools *do not detract from the paper*. In fact, those difficulties are *precisely* what makes the paper interesting, because the model serves to specifically address what happens with the missing data (of all kinds, not just ED/EA/RD data). No other study, to my knowledge, attempts to do this. That's why the paper is important. In other words, for various reasons, not everybody applies to schools X and Y. The paper models what people *would have preferred* if they had actually applied to and gotten into both X and Y.</p>
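<p>For readers who want a feel for how a model can infer preferences between schools that are never directly compared, here is a toy sketch in the spirit of (but much simpler than) the paper's approach: a Bradley-Terry model fit to a handful of invented matriculation outcomes. Schools A and C never meet head-to-head, yet the fitted model still yields a preference probability between them through the chain of comparisons running through B. The paper's actual model is considerably richer; this only illustrates the general technique.</p>

[code]
# Toy Bradley-Terry-style model: infer a latent "desirability" for each
# school from sparse head-to-head matriculation outcomes. All outcomes
# below are invented. Schools A and C are never compared directly, yet
# the fitted model still orders them -- the sense in which such a model
# "fills in" missing data. (The RP paper's model is richer than this.)

import math

# (chosen school, passed-up school) for students admitted to both
outcomes = [("A", "B"), ("A", "B"), ("B", "A"),
            ("B", "C"), ("B", "C"), ("C", "B")]

schools = sorted({s for pair in outcomes for s in pair})
theta = {s: 0.0 for s in schools}  # latent log-strengths

def p_win(a, b):
    """Probability a student admitted to both chooses a over b."""
    return 1.0 / (1.0 + math.exp(theta[b] - theta[a]))

# Gradient ascent on the Bradley-Terry log-likelihood
for _ in range(2000):
    grad = {s: 0.0 for s in schools}
    for won, lost in outcomes:
        p = p_win(won, lost)
        grad[won] += 1 - p
        grad[lost] -= 1 - p
    for s in schools:
        theta[s] += 0.1 * grad[s]
    center = sum(theta.values()) / len(theta)
    for s in schools:
        theta[s] -= center  # only differences are identified

print({s: round(t, 2) for s, t in theta.items()})
print("P(A preferred over C):", round(p_win("A", "C"), 2))  # ~0.80
[/code]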

<p>Look, here's the paper. I beseech all of you - *please read the paper before you comment on it*. I don't think the paper is that complicated to understand, and if you get caught up in the technicalities (especially the modeling parameters), I and others are here to help you. But please read the paper and understand what it is saying before you comment on it. The notion that missing cross-admit data would call the paper's findings into question is not only wrong, it is diametrically wrong - that missing data is *precisely* what makes the paper important. </p>

<p><a href="http://www.economics.harvard.edu/faculty/hoxby/papers/revealedprefranking.pdf%5B/url%5D"&gt;http://www.economics.harvard.edu/faculty/hoxby/papers/revealedprefranking.pdf&lt;/a&gt;&lt;/p>

<p>joshua007,
Please explain to me what that 4.7 means and please provide evidence supporting your response.</p>

<p>UCB--over 200 NAS members, about 10% of the faculty, and that does not include the Medical School members at UCSF (at many schools, a large number of NAS members are medical). 45-50 major faculty awards annually (typically top 3 in the US among all schools). The excellence is obvious to most anyone.</p>

<p>
[quote]
Please explain to me what that 4.7 means and please provide evidence supporting your response.

[/quote]
</p>

<p>I know you're just poking joshua to make a point, but the first part of your question can be answered. </p>

<p>The 4.7 means that presidents and provosts who elected to answer Synovate's 2007 survey indicated that, on average, they considered Berkeley to be as "distinguished" as CalTech and Chicago, which is a little less "distinguished" than Princeton, Harvard, Yale, Stanford, and MIT, and a little more "distinguished" than Cornell, Columbia, and Johns Hopkins. It's not precise, to be sure, and we don't know the state of mind of the respondents. But what we do know is pretty simple--we know how Berkeley was rated on this five-point, dual-anchored scale, and we know how Berkeley compares to the other national universities on this particular measure.</p>

<p>Mildly interesting that no institution got a 4.8 this year. UCB appears to be one of the institutions that "dropped" a tenth of a point, which, as you may recall, happened to many more schools than moved up. I'm not aware of how USNews weighed in on that.</p>
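<p>As a footnote on what a score like 4.7 mechanically is: assuming the published PA is simply the mean of the 1-5 responses rounded to one decimal (an assumption, since USNWR doesn't show its tabulation), many different response distributions collapse to the same published number, and a "drop" of a tenth of a point can come from a handful of respondents shifting their answers. A tiny sketch with an invented distribution:</p>

[code]
# Assuming the published PA is just the mean of 1-5 survey responses,
# rounded to one decimal. The response counts are invented; any mix
# averaging to 4.7 yields the same published number.
ratings = [5] * 70 + [4] * 30   # hypothetical: 70 fives, 30 fours
pa = round(sum(ratings) / len(ratings), 1)
print(pa)  # 4.7
[/code]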