USNWR Peer Assessment Scores look biased toward number of grad students

<p>Midatlmom,
I did not intend my earlier post to come across as a personal attack, and I am sorry if it did. I was only trying to clearly lay out our differences and the reasoning behind my positions.</p>

<p>Re your latest post, I think you make a lot of good points, especially your concluding statement, which I completely agree with. But I must still disagree on the various points about the nature and value of the PA. I would not expect a separate PA ranking to provide any additional insight into the quality of teaching, but I do think that a ranking based on objective, quantifiable factors has far greater substance than one heavily determined by the subjectivity of a single class of stakeholders. </p>

<p>As for the threads that I have created and participated in about PA, teaching quality, or great schools based on the entire undergraduate experience, I am glad to have had the opportunity to participate in these discussions. My objective is to widen the universe and recognition of top schools to something more in line with how I believe corporate America sees undergraduate education in the USA. CC is so dominated by the same 10-12 great schools (mostly in the NE) and IMO this is both a disservice to the many other fine schools out there and the many outstanding students and families looking for excellent undergraduate alternatives. </p>

<p>Bluebayou,
You claim that the objective data create a pecking order, but that would only be true for a single year. It would create a ranking list, but it is a list that actually CAN change from year to year. It is not something that is perpetually the status quo. If a school works hard and succeeds in greatly improving its scores (pick your category: SAT scores, graduation rates, etc.), then this is visible and understood. Things can and do change at a college, and its objective profile can improve significantly over relatively short periods of time, e.g., USC. It is that type of transparency that I want to see and which clearly communicates how various colleges compare on a variety of standard benchmarks.</p>

<p>ok, the two categories I pick are Endowment and Alumni Giving. And, no, you can't use your claim that 'Alumni Giving must go'; it's *objective*....</p>

<p>Please explain how those will change at anything beyond a glacial pace.....</p>

<p>Xiggi,</p>

<p>If you are proposing to calculate a correlation from two data points, you should really take a course in statistics.</p>

<p>The assertion was that PA favored universities with large graduate student populations. I don't think Hawkette was naive enough to suggest that it would be impossible to find counter examples. The claim was that this described the pattern of PA scores. Finding patterns is a classic statistics question. Collegehelp provided a statistical answer. </p>

<p>Finding a pair of LACs where the relationship between SAT scores and PA differs from what is expected (but does correlate) is about as far off the point as anything I can imagine.</p>
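<p>To make the two-data-point objection concrete, here is a quick illustrative sketch (the numbers are invented, not real school data): the Pearson correlation computed from any two (x, y) pairs is forced to exactly +1 or -1, because two points always lie on a straight line, so such a "correlation" carries no information at all.</p>

```python
# Illustrative sketch (made-up numbers, not real school data): the Pearson
# correlation of ANY two (x, y) pairs is forced to +1 or -1, because two
# points always lie exactly on a straight line.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two hypothetical schools: (mean SAT, PA score)
r_up = pearson([1200, 1400], [3.5, 4.2])    # always +1 (up to rounding)
r_down = pearson([1200, 1400], [4.2, 3.5])  # always -1 (up to rounding)
print(r_up, r_down)
```

<p>Any pattern claim therefore needs many schools' worth of data, which is what a proper statistical analysis provides.</p>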

<p>bluebayou,
Not sure what you are looking for, but accepting the idea that endowment (or do you mean the Financial Resources rank, which I think is something different?) and alumni giving are legitimate factors to measure, then why not? If, over a few years, a school takes its alumni giving from 20% to 30% (which would mean jumping from about 62 to about 25 in the USNWR Alumni Giving ranks), then this is seen and understood by anyone. You may want to argue about whether this is a valid metric or whether the way it is measured (participation rate vs. dollars given) is the best method, but it is at least visible, objective and quantifiable. </p>

<p>The main difference between PA and these objective measures is that if you set the bar on the objective data, everyone can see it and schools can shoot to reach or exceed it. For PA, there is no bar and no matter what happens, it is unclear how a college goes about achieving a higher score. It is all left to the whims of academic administrators (whose identities and individual votes are never disclosed).</p>

<p>
[quote]
For PA, there is no bar and no matter what happens, it is unclear how a college goes about achieving a higher score.

[/quote]
</p>

<p>This is one of many reasons it is pointless to worry about it. </p>

<p>If you ask a group of academics what they think of colleges, this is what you get. There is no "right" or "wrong"; it is a set of opinions. It's a free country; everyone is entitled to disagree. </p>

<p>But it seems silly to debate whether they should be asked, who responds, and particularly whether university X should be higher or lower in academics' estimation. It is what it is. </p>

<p>No rational college administrator will waste time trying to improve their PA score. There would be far more payoff to recruiting some better faculty, bringing in more talented students, or anything else that would raise the academic accomplishments of the faculty and students. That, after all, is what the university is there for.</p>

<p>afan,
I agree with all you have said above in #46 and I think all of that reinforces the argument for not including PA scores as part of a ranking system.</p>

<p>Hawkette:</p>

<p>So are you suggesting that a college should seek to increase its SAT scores (and, thus, its objective rank) even if it results in more low-income kids being excluded? Are you suggesting that a college should cater to wealthy families who have the wherewithal to give back ("Alumni Giving")? Are you suggesting that colleges exclude Pell Grantees because they take 6+ years to graduate?</p>

<p>In essence, are you suggesting that colleges revert back to catering to the rich and famous?</p>

<p>
[quote]
Xiggi,</p>

<p>If you are proposing to calculate a correlation from two data points, you should really take a course in statistics.</p>

<p>The assertion was that PA favored universities with large graduate student populations. I don't think Hawkette was naive enough to suggest that it would be impossible to find counter examples. The claim was that this described the pattern of PA scores. Finding patterns is a classic statistics question. Collegehelp provided a statistical answer. </p>

<p>Finding a pair of LACs where the relationship between SAT scores and PA differs from what is expected (but does correlate) is about as far off the point as anything I can imagine.

[/quote]
</p>

<p>Thank you. I needed to fill a small opening in my schedule. I am so relieved.</p>

<p>bluebayou,
Lots of loaded (and false) questions. No one is saying anything of the sort, and I am not into the whole class warfare thing. Each institution should make its own decisions based on what is right for it. If it employs practices that some believe unfairly punish certain groups or favor others, then students (and others) will make their own judgments about those practices, what they mean for the quality and reputation of the institution, and whether they want to be there or not. </p>

<p>As for the Alumni Giving charge of income discrimination, I don't see how asking an alum to contribute $1 is such a burden, even if the person is low income. In all likelihood, the institution dedicated enormous resources to the development of that student and asking the student to make an annual $1 or $5 or $25 donation does not seem out of line to me. Now whether that should be a factor in the USNWR rankings....that is another issue.</p>

<p>Looks like nobody else got your so-called point either.</p>

<p>OK, I think colleges should encourage students to explore academic options. I think that colleges that force students to apply to a particular school, or even worse, a particular major, do their students a real disservice. Not everyone agrees with me on this point. Some people don't care. If I think it is important, then I would find it hard to accept a "ranking" that failed to take this into account. Someone who does not care would find this completely unimportant, and would think it distorting to include it in a ranking. There is no way to quantify flexibility of major choice, but it is easy to describe.</p>

<p>Now how could one create a ranking that is correct for someone who thinks this counts and for someone who does not?</p>

<p>There are so many things going on for students at college that the notion of reducing the experience to a single number is just silly. So USNews fails before it starts. </p>

<p>The problem is not that the factors or the weights used by USNews are wrong, although people love to argue about them. The problem is that you cannot produce a single ranking formula that reflects the wide range of opinions of what counts.</p>

<p>Overrated by USNews. </p>

<p>This appears to reflect the wealth of the student body, the wealth of the graduates, and the effort the college puts into getting at least minimal contributions from as many people as possible. This really tells nothing about how much the students got out of their college experience.</p>

<p>It is easy to waste development funds on collecting tiny contributions. If any colleges really do try to get $1 donations from most of their alums so they can report a high giving rate to USNews, they should stop. They would be better off devoting those resources to getting substantial donations from people who can give a meaningful amount of money. My alma mater contacts me many times each year asking for money. This in spite of the fact that I give them money every year, so all the mail and phone calls would be better spent courting someone with a lot more money. In any case, my lifetime total is trivial in the context of this institution's resources and needs. I wonder whether it does this so that my insignificant contribution can be listed as one more alum donation, not because it is worth the effort for them to cash the check. I hope they have a better reason for this behavior than trying to impress USNews.</p>

<p>The vast bulk of fundraising dollars comes from a few big donors, not from the rest of us, who give more for the principle of the thing than because we think the money actually matters.</p>

<p>So the alumni giving rate means nothing to me. It is not evil, misleading, or elitist. It just does not matter.</p>

<p>
[quote]
On a more serious note, would you care to comment on the idea of providing rankings for PA separately from the objective data?

[/quote]
</p>

<p>The idea may have some merit--but what concerns me are two things. </p>

<p>One, the implication that a ranking on "objective" data is somehow objective. As we've discussed ad nauseam, the inclusion of, measurement criteria for, and weighting of these variables are also subjective, and that needs to be recognized. </p>

<p>Two, the absence of any measurement of that elusive "reputation" factor--A university or college may be greater (or lesser!) than the sum of its parts, and while that's hard to capture I think it's worth trying. It doesn't have to be a peer ranking, or a peer ranking in its current form--but some kind of measure of how a school is "thought of" ought to be available. This may be more important for schools fewer people have heard of, than for the ones at the top...but that's another issue and I'm digressing.</p>

<p>Rather than what you've proposed, what I would prefer to see is a system where students could see how schools rank on each of the criteria. In other words, what you're proposing for PA I would like to see for all the measures. I think Washington Monthly does this in their ranking.</p>

<p>USNews already provides the PA data for each school. By clicking on the PA column you can generate a PA ranking. So the only difference would be creating a ranking without PA. This requires copying the data and backing the PA out of the formula.</p>
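<p>Mechanically, "backing PA out" just means dropping the PA term from the weighted composite and renormalizing the remaining weights so they again sum to 1. A minimal sketch (the component names and weights below are invented placeholders, not USNews's actual formula):</p>

```python
# Hypothetical sketch of "backing PA out" of a weighted composite score.
# The component names, scores, and weights are placeholders for
# illustration only, NOT USNews's actual methodology.

def composite(scores, weights, drop=None):
    """Weighted composite; optionally drop one component and renormalize."""
    kept = {k: w for k, w in weights.items() if k != drop}
    total = sum(kept.values())  # renormalize so kept weights sum to 1
    return sum(scores[k] * w / total for k, w in kept.items())

scores = {"peer_assessment": 90, "selectivity": 80, "graduation": 70}
weights = {"peer_assessment": 0.25, "selectivity": 0.40, "graduation": 0.35}

with_pa = composite(scores, weights)
without_pa = composite(scores, weights, drop="peer_assessment")
print(round(with_pa, 2), round(without_pa, 2))
```

<p>Anyone with the published component data could rerun such a calculation, which is why stripping PA out of the ranking is a clerical exercise rather than a methodological one.</p>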

<p>Given all the problems with USNews methods, why bother?</p>

<p>hoedown,
Thanks for your comments. I am with you on the weighting aspects of various factors. There are some absolutes that could serve as guiding principles, but how much weight to give them and how universally to apply them are extremely difficult questions at best. As you know, USNWR does allow ranking the various colleges on the individual components, so one can see how a school performs in Graduation & Retention or Selectivity or some other factor. I think we both would prefer that USNWR provide a customizable model that individuals could weight for their own circumstances, but that would likely sell a lot fewer magazines. </p>

<p>On reputation, I agree that it is important, but I do think that a college's reputation should be considered among many stakeholder groups. I know that employers have insightful views into many colleges at which they hire regularly and, if I were an aspiring college student, this is one opinion that I would be most interested to hear. Same with students who could relay their sentiments on whether they felt that a school's undergraduate experience matched its reputation.</p>

<p>
[quote]
I know that employers have insightful views into many colleges at which they hire regularly and, if I were an aspiring college student, this is one opinion that I would be most interested to hear.

[/quote]
</p>

<p>I think their views might be pretty limited, actually. I'd like to hear more about what you're proposing--how you think this would differentiate between institutions. </p>

<p>What I think would happen is that you'd have little differentiation. Employers would each say that the grads they hire from School X seem pretty well-prepared (otherwise, why would they recruit there?). Your hopes about lesser-known schools "finally" getting the recognition they deserve wouldn't be realized. The people who hire at Harvard will give Harvard grads high marks, the companies which hire at Boise State will give Boise State high marks, and you'll be left with a measure that doesn't add to the ranking. </p>

<p>Sure, it might be comforting for students to know that employers like to hire from their prospective school and find its graduates well-prepared. But no one who is specifically looking for variables on which to slice and dice schools into tiers is going to want a measure that doesn't differentiate much.</p>

<p>Maybe I'm guessing wrong. How have you been figuring the survey would work?</p>

<p>hoedown,
Probably one of the greatest problems in college admissions today is the seemingly overwhelming preoccupation with a concentrated group of excellent colleges and the stress and the frenzy that surrounds their annual undergraduate admissions decisions. IMO this unfortunately creates an incorrect impression among applicants that they absolutely, positively must go to one of these “elites” or their futures will be bleak. While elite colleges can certainly deliver a wonderful undergraduate education, they are not alone in this and they certainly do not have a monopoly on the brains and the talents of the students that enter the work force every year. It is my belief that an employer survey would help students better understand that attending ABC elite university is not the key to life or success and that one can certainly have a fabulous experience and life coming from XYZ less heralded college. I would guess that you and I and many others (particularly parents) see this as pretty much common sense, but I think we also see many, many examples of students (and families) whose attitudes personify this preoccupation with acceptance to the elite colleges. </p>

<p>I have posted elsewhere something of an initial approach to an employer survey involving sampling of top employers from a wide variety of cities, regions and industries. Clearly, there are a number of large companies that recruit nationally and they could provide better perspectives on the quality of students coming from different regions. It is my strong expectation that such a survey would show that companies don’t differentiate nearly as much for UNDERGRADUATE hiring as the hype would have us believe. Furthermore, I suspect that such a survey would show that only a handful of colleges have true national recruiting appeal and then the regional private powers and state universities play a much stronger role. I would include Wall Street in my comments, despite much commentary here on CC to the contrary. Wall Street (and maybe consulting) is probably more brand conscious than nearly any other industry, but even there I believe that there is a broader hiring universe than typically believed and this is especially so for undergraduate hiring. </p>

<p>As for your comments about differentiation, I would envision the results being delivered both nationally and regionally. You may be right if the results of a survey are put into a national context as only a handful of schools would dominate and then the other colleges measure out somewhat more evenly. However, in a regional context, I would expect that the results would be much more telling and differentiated, even including those nationally recruiting companies. For example, Georgia Tech is a wonderful engineering school with an excellent reputation in several disciplines, eg, civil engineering. Likewise, colleges like UC Berkeley, U Texas, U Illinois and Carnegie Mellon have outstanding reputations and graduates. While all of their graduates would likely have some level of brand power to allow them to interview all over the country, the reality is that most of the Georgia Tech folks end up in the Southeast, the UCB folks end up in the West, the U Texas folks end up in the Southwest, the U Illinois folks end up in the Midwest and the Carnegie Mellon folks in the Northeast and/or Midwest. Employers, national and local, understand this and would likely rank their “regional” college ahead of the others because of their familiarity and good experience with these schools. The same pattern is repeated for company after company, region by region, for colleges all across the country. Unfortunately, this is not well appreciated by many aspiring college students and thus their lack of perspective inspires the furious efforts to gain admittance to a relative handful of colleges.</p>

<p>
[quote]
Employers, national and local, understand this and would likely rank their “regional” college ahead of the others because of their familiarity and good experience with these schools.

[/quote]
</p>

<p>This is problematic, to me. Unless I am misunderstanding you, you're saying that employers will give some schools higher marks than other institutions simply because they don't know much about those others and don't recruit there.</p>

<p>hoedown,
Ideally, respondents would only comment on and grade colleges with which they are familiar and of which they have some true and direct knowledge, rather than just general hearsay. Will this happen? Who knows? Does this same concern arise with today's PA scoring? Yes. </p>

<p>I recently found a research paper by Alexander Hicks, Professor of Sociology, at Emory University, entitled "Graduate School and College Excellence: Does research reputation influence undergraduate rankings?"</p>

<p>The author performed some statistical analysis and here is a key quote from his findings:</p>

<p>"It appears that when academics at National Research Universities judge the merits of colleges at other NRUs they think of them, in effect, in terms of impressions of those schools’ graduate and research excellence and student selectivity, and not much else."</p>

<p>Following is the link to the paper:
<a href="http://www.emory.edu/ACAD_EXCHANGE/2005/octnov/hicks.html">http://www.emory.edu/ACAD_EXCHANGE/2005/octnov/hicks.html</a></p>

<p>As that pretty much sums up how major universities judge themselves, it seems perfectly sensible. All else flows from the academic respect of the faculty and students, with some minor exceptions.</p>