Part V: Ignore the Rankings and Focus on the Data (Alumni Giving)

<p>
[quote]
Those companies that have kept the best track of their previous customers over the years, communicated with them regularly, and make the greatest effort to solicit responses will naturally get the highest response rate and the highest number of positive responses.

[/quote]
Yes -- and most people would prefer to do business with a company that keeps track of its previous customers, communicates with them regularly, and makes an effort to solicit feedback. Would you rather do business with a company known for ignoring its customers after the sale?</p>

<p>
[quote]
And this is supposed to tell us something about customer satisfaction with the current product? I don't think so.

[/quote]
Past performance is not a guarantee of future results. But most people would regard a long, proven track record of superior performance as a positive indicator, no matter what the product is.</p>


<p>But the problem is we don't get a "long, proven track record of superior performance" from the US News data. The total percentage of all living alumni who give tells us nothing about whether customer satisfaction has improved, held steady, or declined. We know nothing about what recent graduates think, because their data are buried amid data from people who graduated 10, 20, 30, 40, or 50 years ago. Moreover, alumni giving is at least as much a reflection of the effort a school puts into inducing alumni to give as it is a reflection of actual satisfaction rates. </p>

<p>We have, in fact, a much more direct indicator of customer satisfaction: ask current and recent customers -- those who are using the product now, or who have bought and used a similar or related model recently. That's what we do with cars and washing machines. No one would buy a new 2009 model car based on a survey full of answers like "Gee, I really loved my '57 Ford Thunderbird," nor would they buy a washing machine based on someone's belief that Maytag was once a great brand, if there is no way to separate out what current customers think of this year's model, and perhaps the last several years'. What customers thought of the product 10, 20, 30, 40, and 50 years ago is relevant only if we can compare those data to data on present-day customer satisfaction. But we get none of that from the alumni giving rate as US News calculates it.</p>

<p>Alumni giving should not be included in the rankings; it has too many lurking variables -- religious affiliation, LACs and LAC-like environments, large research universities, public vs. private status, etc. I think postgraduate surveys should be used more, along with recruiter surveys.</p>

<p>bc,
Is it possible that you think that replacing Alumni Giving with some type of student opinion would be an improvement for USNWR's methodology? </p>

<p>Or maybe some other stakeholder who also has experience with the product of the university, namely the employer who judges whether the students graduating from XYZ College are better prepared than those from ABC University?</p>

<p>
[quote]
Or maybe some other stakeholder who also has experience with the product of the university, namely the employer who judges whether the students graduating from XYZ College are better prepared than those from ABC University?

[/quote]

Hawkette, while these data would surely be valuable, the logistics of collecting them are the problem.</p>

<p>How big a sample would be needed? How do you compare grads in different roles and capacities? How many managers would have experienced enough grads first-hand to make any sort of accurate assessment?</p>

<p>
[quote]
Is it possible that you think that replacing Alumni Giving with some type of student opinion would be an improvement for USNWR's methodology?

[/quote]
And would this be practical? Would USN&WR pay to survey a statistically significant percentage of students at every one of the hundreds of schools that it rates? Wouldn't this require tens of thousands -- maybe hundreds of thousands -- of responses? It would be very difficult and expensive -- and then the entire process would have to be repeated every 12 months. </p>

<p>And even if this were done, would students answer candidly and truthfully, knowing that their answers could affect the public ranking and prestige of the school issuing their degree? There would be an obvious incentive to exaggerate satisfaction in such a survey, and it would cost nothing to do so. With the giving rate, there's a firmer basis for assuming that positive responses are sincere, since registering approval costs money.</p>

<p>corbett,
The beginnings of such a survey already exist. It's called NSSE, and it has a pretty broad range of questions. The problem, of course, is comparing student opinions across different colleges (I would not envision students passing judgment on schools they don't attend), but this lack of standardization exists in the current PA surveys as well, and a lot of folks on CC seem willing to accept those views as gospel. 

<p>ucb,
I wonder how many schools each academic ranks when filling out the PA survey. If there is even a shred of integrity to these numbers, surely they're opining only on the schools that they have a great familiarity with. I would expect a similar approach to be taken by employers, eg, a Silicon Valley employer is unlikely to have a lot to say about William & Mary or a Wall Street employer is unlikely to have a lot of exposure to Texas A&M and so on.</p>

<p>Wall Street might not be in its current horrible condition if they had more TAMU grads with a lick of common sense than all the Ivy greedheads who are only out to make a faster buck than the next guy. And feel entitled to it.</p>

<p>
[quote]
But the problem is we don't get a "long, proven track record of superior performance" from the US News data.

[/quote]
The best-performing schools have alumni giving rates in the 50-60% range. That means that most of their living alumni write a check, each and every year. It's hard to see how this can happen, unless there are high giving rates across the board. Sure, there could theoretically be extremely high giving rates among alumni more than 25 years out, and extremely low giving rates among younger alumni. It just doesn't seem very likely.</p>

<p>
[quote]
What customers thought of the product 10, 20, 30, 40, and 50 years ago is relevant only if we can compare those data to data on present-day customer satisfaction.

[/quote]
US News could probably get colleges to provide giving rates for (say) the last 5 classes, if that would make you feel better. But I'd bet you that the same suspects would rise to the top. For example, <a href="http://www.dartmouth.edu/~news/releases/2008/06/05.html">Dartmouth</a> has had giving rates in excess of 70% for its last few classes, including a 92.5% rate for the Class of 2008.</p>

<p>
[quote]
The beginnings of such a survey already exist. It's called NSSE, and it has a pretty broad range of questions.

[/quote]
But NSSE is useless for ranking purposes, for one simple reason: the results aren't available to the public (unless an individual school chooses to release them). If they were, the users at collegeconfidential would be all over them.</p>

<p>As <a href="http://nsse.iub.edu/html/students_parents.cfm">stated</a> at the NSSE website: "Our agreements with schools that participate in NSSE prevents us from reporting the results for individual colleges and universities"</p>

<p>So USN&WR would have to do its own survey instead. How much would this cost? Well, NSSE <a href="http://nsse.iub.edu/html/pricing.cfm">charges</a> between $3,375 and $7,500 per school, depending on enrollment. The USN&WR rankings cover some 2,000+ schools, so do the math. Then repeat in 12 months.</p>
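<p>To make the "do the math" concrete, here's a minimal back-of-the-envelope sketch. The 2,000-school count and the $3,375-$7,500 per-school range come from the post above; treating NSSE's published pricing as a proxy for what USN&WR would pay is an assumption, not a known figure.</p>

```python
# Back-of-the-envelope annual cost for a USN&WR-run student survey,
# assuming (hypothetically) per-school pricing comparable to NSSE's
# published range and roughly 2,000 ranked schools.

NUM_SCHOOLS = 2000       # approximate number of schools USN&WR ranks
LOW_PER_SCHOOL = 3375    # NSSE's low end (smallest enrollments), dollars
HIGH_PER_SCHOOL = 7500   # NSSE's high end (largest enrollments), dollars

low_total = NUM_SCHOOLS * LOW_PER_SCHOOL
high_total = NUM_SCHOOLS * HIGH_PER_SCHOOL

print(f"Annual survey cost: ${low_total:,} to ${high_total:,}")
# → Annual survey cost: $6,750,000 to $15,000,000
```

<p>Even at the low end, that's several million dollars, recurring every 12 months, before any analysis or publication costs.</p>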

<p>
[quote]
Or maybe some other stakeholder who also has experience with the product of the university, namely the employer who judges whether the students graduating from XYZ College are better prepared than those from ABC University?

[/quote]
Or namely, the professional school that reviews thousands of applicants from hundreds of different schools every year? How do these schools rate different undergraduate programs?</p>

<p>A possible model might be the Wall Street Journal's "feeder school" ranking. And while perhaps it's just a coincidence, it appears that 6 of the top 10 schools on this list are among those rare institutions with 50%+ alumni giving rates.</p>

<p>"Sure, there could theoretically be extremely high giving rates among alumni more than 25 years out, and extremely low giving rates among younger alumni. It just doesn't seem very likely."</p>

<p>It's a fact. Take Princeton, for example: first-year alumni donated at a 75 percent rate, compared to 60 percent overall. </p>

<p>Princeton University - Annual Giving campaign raises record-breaking $54.1 million</p>

<p>
[quote]
The beginnings of such a survey already exist.

[/quote]
Why do you call this a beginning? Because of the number of schools that have participated? </p>

<p>NSSE isn't really asking students their opinions; it asks about their behaviors. That may be even more valuable, but it's different from a survey of student opinion.</p>

<p>
[quote]
Take Princeton, for example: first-year alumni donated at a 75 percent rate, compared to 60 percent overall.

[/quote]
</p>

<p>Some schools--apparently Princeton among them--are very wise about making the importance of donations known to seniors and new grads. They're easily reachable and freshly nostalgic. Some institutions secure a lot of annual fund pledges before the class even graduates. It's a wonderful way to start alumni on the habit of making a regular gift to their alma mater. Consequently, the newest classes may have a very high giving rate, which may taper off a little as time progresses.</p>

<p>
[quote]
ucb,
I wonder how many schools each academic ranks when filling out the PA survey. If there is even a shred of integrity to these numbers, surely they're opining only on the schools that they have a great familiarity with. I would expect a similar approach to be taken by employers, eg, a Silicon Valley employer is unlikely to have a lot to say about William & Mary or a Wall Street employer is unlikely to have a lot of exposure to Texas A&M and so on.

[/quote]

But the PA is a survey of academics rating their own industry. Which industries or companies would you ask for managers with broad enough experience with graduates, across a wide enough geographic base, to minimize bias? </p>

<p>It's all based on opinion, Hawkette...I respect that you would value an employer's view more than an academic's...</p>

<p>Which employers would you include in your survey?</p>


<p>Believe me, presidents and provosts of colleges and universities pay very close attention to who's above them and who's below them, not only in silly things like US News rankings but in their real bread and butter: faculty recruitment, retention, and scholarly reputation, financial strength and stability, and student recruitment. They may not know every school intimately, but they have a pretty good idea where they stand in the pecking order, who's up and who's down. At the margins they have an incentive to rate themselves a little higher than others rate them, and to downgrade those immediately above and immediately below them. But since everyone has that same incentive, it all comes out in the wash. </p>

<p>Generally college and university presidents and provosts are going to be much more on top of this kind of information across a much broader range of academic disciplines and institutions than are employers who typically recruit only from a narrow band of disciplines, a relatively narrow geographic base, and/or a relatively short list of schools with which they're familiar. They're in competition only with other firms in their industry. For college and university presidents and provosts, higher education IS their industry, and to do their jobs well, they've got to know exactly where they stand vis-a-vis the competition. </p>

<p>Private and public sector employers need only to know where to look to get a reliable stream of qualified, trainable employees. In the most competitive industries, that means getting a steady stream of the best and the brightest. In some areas, the value added by the educational institution itself may not even be all that relevant; essentially, they're just using elite colleges as a sorting mechanism, providing a cheap (for them, not for the students), easy, and relatively reliable heuristic: If Duke, then high probability that candidate is really, really smart. If Ohio State, then somewhat lower probability that candidate is really, really smart. Independent investigation might reveal that the Ohio State candidate is, in fact, just as smart or even smarter than the Duke candidate, but since for many employers that's simply too much trouble to figure out, they'll stick with Duke and similarly selective schools. It may have little or nothing to do with the quality of the education actually delivered by the place; essentially, it's just a four-year, $200k+ validation stamp, certifying that you have high SAT scores and a high GPA---which of course you knew before you matriculated.</p>

<p>I don't know what the answers are for how to get the student and/or employer information, but I do believe that their opinions are highly relevant to how schools are run. Taking their input makes the schools accountable to someone other than themselves and puts the customer, not the institution, in charge of deciding the pecking order. As it stands now, the PA ranking process has a near-unchangeable pecking order that stands in stark contrast to nearly any segment of American society that I can think of. The only thing similar that comes to mind is the dominance of the major broadcast networks, but even their monopoly has been eroded by technological changes, e.g., cable, the internet, etc. </p>

<p>bc,
I am not necessarily doubting that college administrators know a lot about their closest competitors, but I do have problems with how this knowledge gets transferred into a score like the PA. I also have great respect for the views of students and families, who are experiencing and paying for the product directly; of alumni, who have a longtime interest in the school and can put its brand value in perspective as they age; and of employers, who must hire good people or risk seeing the value of their business falter or even fail.</p>

<p>How do presidents and provosts of colleges and universities determine the pecking order with regard to PA, though? I might not be able to question the qualifications of these people, but I can definitely question their methodology. It's so subjective and not based on objective data.</p>

<p>
[quote]
Taking their input makes the schools accountable to someone other than themselves and puts the customer, not the institution, in charge of deciding the pecking order. As it stands now, the PA ranking process has a near-unchangeable pecking order ...

[/quote]
</p>

<p>If employers have strong opinions about how graduates are being prepared, they can--and do--enforce a pecking order all their own, by deciding where they recruit and who they hire. Same with graduate/professional schools and who they admit. It's not as if the world is held hostage by the PA survey. Nor is it the case that institutions are answerable "only to themselves" for lack of a survey such as the one you propose.</p>

<p>
[quote]
How do presidents and provosts of colleges and universities determine the pecking order with regard to PA, though? I might not be able to question the qualifications of these people, but I can definitely question their methodology. It's so subjective and not based on objective data.

[/quote]
</p>

<p>What they know most about is faculty, and that's why PA tracks the NRC rankings of faculty quality so closely. Look, it's a highly competitive market for top faculty; in fact, it's a highly competitive market for quality faculty at every level, from top to bottom, and from entry-level to senior faculty. Presidents and provosts know who they're competing against, who they're consistently losing to, who they're consistently beating, who's taking rising young academic stars away from them, who they're stealing rising young stars from, which faculties the best people in the business most want to be on, department by department. They track very closely how their own Chem E department stacks up against others in the field, whether it's a source of strength or a problem area for the school, who their closest competitors are, which Chem E departments are the envy of the entire field, and which are not strong enough to be a real threat. They know whether they've got shortcomings in their Philosophy Department, what those shortcomings are, how many additional hires it would take to correct them, where they'd go to look for the people they'd need to improve in that field---assuming it's a sufficient priority relative to all the competing demands on scarce university resources. If you're the provost at Michigan State, you know you're not likely to lure a top young philosopher away from Princeton, or from Michigan or UCLA; it would be just too crazy a career move for a scholar with professional ambitions, so you don't even try. But you do know that if there's a rising figure at Wayne State, you might have a good chance at landing her because it's a step up the career ladder for her to go to Michigan State. 
But you also know that for someone at Penn State it's basically a wash in terms of professional prestige, so you're only going to get that person if they have personal or family reasons to be in Michigan, or if you can make a financial offer that Penn State can't match---though of course, that will have ripple effects on your entire faculty pay structure, and you may not want to go there. That's half of it right there. It's the same way people in any industry judge excellence in professional and managerial personnel. "Subjective"? Well, it's a complex judgment about complex real-world facts that can't be neatly reduced to simple hard metrics, but top managers in all fields make these kinds of judgments all the time and we don't think twice about it.</p>

<p>They also know--because this stuff is all public, unlike most other industries--which faculties, and which particular members of those faculties, are the most productive scholars, and which are the most influential in their respective disciplines as measured by word-of-mouth reputation, observable impact on other scholarship, citation counts, and especially in the sciences and some engineering disciplines, ability to attract major research grants. Is this "subjective"? Not really; not any more so than any real-world calculation about complex facts, like who makes a good product and whose is junk, or who's a good manager and who's ineffective. </p>

<p>I've said this before: they know a lot less about the teaching side because it's not out there in the same visible and public way. But they do know some things, and they generally are in a better position than anyone else to know what's to be known. They know which parts of their own institution get the most student complaints, and sometimes which particular faculty members are problems, and which regularly win teaching awards. They know a lot of the same things about people they're trying to recruit, or considering recruiting, and over time that has a cumulative effect on their evaluations of their competitors' faculties. They certainly know objective data like student-faculty ratios, not only for their own institution but for their competitors, which in a certain sense is the entire industry. They also keep a pretty close watch on who they're losing prospective students to, and to the extent the information is available to them, why. So again, the provost at Michigan State knows her institution is really not very competitive against Michigan for highly qualified OOS students, but it can count on getting a respectable fraction, though smaller than Michigan's, of the top in-state candidates. But on the other hand, other things equal, Michigan State is going to be more attractive to more highly qualified students than the "directional Michigans" (Eastern, Western, Northern, Central) or Wayne State. "Subjective"? Again, not really, though here I'd say there's an important market segmentation that goes on. Most publics really are competing for students primarily in an in-state market, and they'll know the most about where they stand vis-a-vis other schools in that market. The top privates are competing in a national market. Below the top 50 or so, most privates really are primarily regional schools; state boundaries don't matter as much as for publics, but they'll draw students primarily from their own state and neighboring states. 
A few top publics---Michigan, Virginia, William & Mary, North Carolina, etc.; less so the UCs, because they're statutorily constrained on OOS enrollment---have a foot in both worlds, competing in their own in-state market where they tend to be dominant, but also competing with the top privates in the national market. What does any of this have to do with teaching? Well, teaching is a major factor in those student decisions, especially at the top of the heap. The provost at Michigan knows his school regularly loses students to places like Duke and Amherst, not because their faculties are better on a scholarly level (though both have a lot of good people on their faculties that the provost at Michigan would like to hire if he could, but he wouldn't trade his own faculty for theirs); the Michigan provost knows he loses students to Duke and Amherst because of their lower student-faculty ratios and their reputations for teaching excellence, so he's going to give them a stronger PA score than, say, Syracuse (not to knock Syracuse in particular).</p>

<p>Finally, while they probably don't know the finances of many other schools nearly as well as their own, they do have a pretty good idea of comparative financial strength. They know who can outbid them for faculty and who generally can't. They know who's providing extra perks to students and who's being forced to cut back. They know who's providing the biggest and best library resources and research labs, something that matters to students but especially to faculty. They keep an eye on how generous the financial aid packages are at the schools they're competing with. And even though most publics are competing for students primarily in an in-state market, they do a lot of benchmarking by keeping an eye on out-of-state institutions: what are the latest trends and innovations, how are they working, are they moving the needle on faculty recruitment and retention, student recruitment, etc. Again, complex facts, not easily reducible to simple metrics, but a matter of expert professional judgment of a kind we see in all competitive fields, not just in higher education.</p>