<p>^ “As it is it’s ridiculous that it’s weighted at 25%”</p>
<p>I suspect that at some time in the past USNWR wasn’t getting “believable” ratings with its data-based results. For better or worse, PA is in for 25%. I mean, if the system is to include stuff like Nobel Laureates and National Academy of Sciences memberships in the mix, the top would be bereft of UG-only schools.</p>
<p>Still, the two areas that stand out as weaknesses (tip o’ the hat to bluebayou and lewmin) are freshman retention and Peer Assessment. I’m a bit ambivalent about methods to increase freshman retention … I’ve always felt that Tulane’s reputation as a “no handholding school” was a considerable feature of the school. But I see the other side of the issue too. I cede my vote on this.</p>
<p>As for the PA issue, something smells rotten in Denmark. How can a school with excellent students, high SATs, plus high Grad school, Med school, and Law school admission rates be so poorly thought of?</p>
<p>Also, on the comments regarding freshman retention, is it really fair to compare freshman retention for a school like Tulane, where the overwhelming majority of students come from over 500 miles away, versus a state school where the majority of students are more “home-grown” and desire to stay close by? I’m thinking that the retention rate at Tulane is somewhat impacted by those students who eventually choose to be closer to home. State schools would not have that problem.</p>
<p>I can think of no industry or enterprise where people really have thorough knowledge of the others in their market, and that would be especially true of universities, where there are so many. Granted, there is a lot of interaction between faculty of different schools, but there is also similar interaction between people that work in hospitals, retail stores, restaurants, etc. At least with restaurants a critic can actually attend the different ones, and the criteria for rating them are limited to food, service, atmosphere and price. The people at other universities have only superficial knowledge of all but a handful of other universities, and it is certainly impossible to get any depth by attending each one. Johnny Utah has it exactly right that PA is bogus on its face, and weighting it at 25% when the quality of the students themselves counts for only 15% is a travesty.</p>
<p>I will repeat myself because I apparently have to: the guy that came up with this system did NOT start from a scientific basis of hypothesizing what would make one school better than another and then accept whatever result came out. He played around with the criteria and their weighting until he got Harvard on top, Yale second, etc., or similar results to what he expected. It is obvious from the very criteria and weighting that are used. Would anyone really come up with that based on scientific studies and common sense? No way. And I cannot emphasize enough that it is well known that some schools cheat on their self-reported statistics, with top 10% of high school class being a prime example. I mean really, when U Miami reports an average GPA over 4.0, you know the whole thing is rigged. That last is from the Princeton Review, but you get the idea.</p>
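<p>To make that concrete, here’s a toy illustration in Python with completely made-up numbers (neither the schools nor the scores are real USNWR data); it just shows how the same raw inputs produce whatever order you want once you start fiddling with the weights:</p>
<pre><code># Two hypothetical schools scored 0-1 on two criteria (invented numbers).
schools = {
    "School A": {"peer_assessment": 0.95, "student_quality": 0.80},
    "School B": {"peer_assessment": 0.80, "student_quality": 0.95},
}

def composite(scores, weights):
    # Weighted sum of the criterion scores.
    return sum(weights[c] * scores[c] for c in weights)

for weights in ({"peer_assessment": 0.25, "student_quality": 0.15},
                {"peer_assessment": 0.15, "student_quality": 0.25}):
    order = sorted(schools, key=lambda s: composite(schools[s], weights),
                   reverse=True)
    print(weights, "->", order)

# Swap the two weights and the "winner" swaps too:
# same data, opposite ranking.
</code></pre>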
<p>With regard to the National Academy of Sciences debate: Being in the sciences, I know some schools push it far more than others, and of course once you have a lot of people in from your own institution, it is more likely you will get others in. And it was also correctly stated that many of these schools are more graduate-student oriented than Tulane, yet these rankings are supposed to be for the undergraduate schools. It makes extremely little difference to an undergraduate how many Nobel Prize winners, Fields Medal winners, NAS members, etc. a school has. Is anyone here seriously going to argue that Tulane’s chemistry, physics and biology professors are not as good at teaching their undergrads as those at Duke, Stanford or any of the others? Don’t try it, because you would just be making a fool out of yourself, to put it bluntly.</p>
<p>It is an anecdote, but I did research in chemistry as an undergrad and got my name on 2 papers, one as the first-named author because the research idea was actually mine. And not in podunk publications, but in the Journal of the American Chemical Society and Inorganic Chemistry (also published by ACS), the most prestigious chemistry journals in their fields. Having then gone to grad school and having interacted with many university departments, I can say with certainty that only a handful of schools have professors that would let an undergrad pursue their own idea like that instead of focusing on the research the prof was already doing. Tulane is one of them. Sorry for the long-winded reply, but this discussion is ticking me off.</p>
<p>So it all gets back to the criteria for rating what makes the “best” (a ridiculous notion to begin with) undergraduate institution for a student. How many winners there are in the most cutting-edge research areas is absolutely not one of them, and for USNWR to use it (indirectly through PA I guess, or do they use it directly?) is just another example of how off base they are.</p>
<p>Not sure if this was posted elsewhere today, but the following article was in Inside Higher Ed discussing the ridiculous use of peer assessment by US News … thought y’all would like to see it in light of your discussion…</p>
<p>Rodney - Thank you, Thank you, Thank you. So right on. I owe you a drink (or two or three). And this was 25% of the rating. What a joke.</p>
<p>To be (ridiculously, I admit) pedantic, is it really hard for anyone to imagine that people in the Northeast or the middle of Ohio or in Oregon know about New Orleans, and by extension Tulane, only what they saw on television? These are human beings, with all the usual flaws and foibles, and bigger egos usually. Then they hear, for example, that Tulane eliminated some engineering programs and superficially think “Well, Tulane must be barely hanging on.” That is what is dragging Tulane down, along with the other data flaws mentioned earlier.</p>
<p>Thanks for the link, Rodney. It looks like the posters here aren’t the only ones skeptical of the PA process and weighting. The L-O-N-G response from the Clemson Public Affairs Officer was a hoot, especially after photocopies of the Clemson PA forms were made public a while back.</p>
<p>I know the USNews ranking is a big deal. Regardless, my son is happy at Tulane and can’t wait to return this Sat. He was accepted by numerous UCs that are ranked quite a bit higher, but in the end he chose Tulane and has no regrets!!! That’s all I care about. In fact, knowing what I know now, I would do it again and send my younger kid to Tulane if he so chooses.</p>
<p>yes, those are just totals per college as found through the NAS website search function. And thus, while it’s not adjusted per capita, IMO it does give a rough idea of what PA is all about – namely prestige, and prestige of (primarily) grad programs. Thus, it partially explains why the mid-tier UC campuses are ranked in the top 50, as is Florida.</p>
<p>In addition to Tulane, undergrad-focused schools like Wake Forest also take a hit on PA, even though they provide an excellent undergrad experience.</p>
<p>fallenchemist: PA is what it is, and it helps sell magazines. There are many on cc who disparage it (see hawkette’s posts, for example). And yes, Miami can’t follow directions – the Common Data Set clearly requests reported GPA on a 4.0 scale. But (unlike Tulane) at least Miami publishes its CDS for all to see (even if it can’t/won’t follow directions). And Miami does readily admit that its 4+ reported GPA is weighted. Thus, folks can interpret the hard numbers.</p>
<p>bluebayou - yes, of course it is what it is. And what it is is wrong on so many levels it is difficult to know where to start. But if you look at the comments section of the article in Utah’s link, you will see that all of them (except Clemson’s) strongly support my criticism of it, and more eloquently than I did. They cite lack of knowledge of these peer institutions, poor methodology, the temptation to game the system, and on and on. These are not just random people but (hopefully they are telling the truth) in many cases people that are actually involved in the process, and a person that has studied this kind of question for years.</p>
<p>So I am not sure what your point is. Mine is that the USNWR rankings were a travesty to begin with, and the fact that so many people have latched onto this highly flawed “analysis” that is indeed intended to sell magazines just makes it more of a travesty. I can discredit virtually every measure they have chosen except SAT/ACT averages, and if those are self-reported with no auditing (which I do NOT think is the case) then I could discredit those too. So I am simply pointing out all these flaws in an effort to get fellow Tulane boosters not to get all kerfuffled about the new results. I respectfully disagree with those that take the position that, since so many students/parents make their choice of school based on rankings, Tulane needs to “play the game.” If these measures are flawed, does it make sense to implement changes just to score better on flawed measurements? I say no, and I especially say it because I follow the changes Tulane makes fairly closely. IMO they are excellent changes, the incoming classes are strong, and things are looking very good. Better, I think, to keep pointing out why the rankings are bull and why Tulane is a fantastic choice for many top students.</p>
<p>It’s brought up so much I have to ask, and I’m sure it’s a ‘stupid’ question, but how long has this Common Data Set thing been around? I don’t remember ever hearing about it back when I was in school. Granted, I don’t have any kids, so I haven’t gone through the college selection process in a long time, but I think it’s great, and it’s a shame that TU doesn’t publish it. The more information for parents and students alike, the better. Of course back when I was in school, TU was tied with NYU and Lehigh in the rankings at #34, so things have changed. I think Tulane was like the 4th or 5th best in value back then too. Tulane also went undefeated in football. Wow, things have REALLY changed, lol.</p>
<p>Another thing I thought of (LOL). Maybe another reason the Cal schools do so well in the rankings (besides the BS top 10% numbers) is that there are so many schools in the system. It is reasonable to assume that someone at UCSD will give higher praise to fellow UC schools than is deserved. Possibly out of more familiarity, possibly out of a sense of fellowship. Either way, just another example of how flawed the methodology is.</p>
<p>^^My point (with no dog in this hunt) was to address JohnnyUtah’s and Lewmin’s earlier posts, as well as NewHope’s question in post #121. And while Johnny’s post raised an issue about the credibility of the PA (or lack thereof), he is intimately concerned about Tulane’s ranking.</p>
<p>As a scientist, I would hope that you would discredit test scores as well (even if audited). :)</p>
<p>Statistically speaking, 100% of USNews rankings is wrong on all levels (alumni giving?). But it is the big elephant in the room. Many people do care. Many people start their college search with USNews, and PR’s Top 370+, and Yale’s Guides, etc. (Whether they should is a different matter.)</p>
<p>I certainly agree with your last post, except for the test scores (well, partially agree). The reason I give some credibility to the test scores (they are not perfect, but nothing is, and at least they don’t suffer from the same flaws as the other areas) is that they have been shown in many studies to correlate pretty well with success in college, especially in the freshman year. Which makes some sense, since those that are better prepared in math and language skills are going to have an easier time initially. As students progress (assuming they make it past freshman year) they tend to “catch up,” although there is always an edge for the students that did better on the tests. To some degree, IN AGGREGATE, they do measure intelligence and quality of education. Anyway, not particularly pertinent to the discussion at hand, but I wanted to respond (big surprise).</p>
<p>On a final note … in reviewing the table of the top 133 schools again by SAT scores, top 10% of HS class, and acceptance rate, I could not find another school that “immediately stuck out” as much as Tulane as being ranked disproportionately lower (with the only possible exception of Tufts, whose stats on the student body could also easily place it many positions higher).</p>
<p>Good thing Tulane was not ranked 52…it would have been on the second page and REALLY stuck out as all the other schools that follow it are significantly lower in these areas.</p>
<p>The Tulane numbers and overall ranking almost make it look like it’s a mistake. </p>
<p>Second final note … Besides the Peer Assessment being low, Tulane scored ridiculously low on the 6-year graduation rate. Can this be correct? Although this category was only worth about 5 points, Tulane probably scored a possible 1.5 to 2 points out of 5 in this area (just guessing) … another 2-2.5 points would have improved Tulane’s ranking a little bit … but yes, everyone is right that the Peer Assessment is the killer here.</p>
<p>I am in this deep, so I am going to try and find out what the deal is on that 6-year graduation rate. Mostly because I am curious; as you say, it is a minor factor. My preliminary research leads me to believe that it is just another statistic that is being manipulated by some schools, because there is no standard (and certainly no audit; again, this is a self-reported number) as to how you count this. Do you count transfer students? I can see an argument for a school to say “Well, we don’t know if this kid that transferred from Vanderbilt to UCLA graduated or not, so we won’t count him. We will only count the kids that we know are still trying for a degree after 6 years.” On the other hand, Tulane might be saying (and I don’t know this) “We are simply taking the people that entered in the freshman class and giving you the percentage of those that graduated from Tulane within 6 years.” I do know for a fact that these kinds of games go on in various athletic programs as they try to stay in compliance with NCAA GPA and graduation requirements. Also, Tulane has a 5-year architecture program, which would definitely skew its graduation rate compared to a school that does not. I bet that is not taken into account by USNWR.</p>
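<p>Since we’re speculating about the counting, here’s a back-of-the-envelope sketch (made-up cohort numbers, and definitely not USNWR’s actual method) of how two perfectly defensible conventions give noticeably different “6-year graduation rates” from the same students:</p>
<pre><code># Hypothetical entering class (all numbers invented for illustration).
entering_cohort = 100   # enrolled as freshmen
graduated_here = 72     # earned a degree from this school within 6 years
transferred_out = 15    # left for another school; outcome unknown

# Convention 1: the whole entering cohort is the denominator.
rate_strict = graduated_here / entering_cohort                        # 72.0%

# Convention 2: drop the transfers whose outcomes are unknown.
rate_lenient = graduated_here / (entering_cohort - transferred_out)   # ~84.7%

print(f"strict: {rate_strict:.1%}  lenient: {rate_lenient:.1%}")
# Same students, a double-digit swing, and no audit to say which is "right".
</code></pre>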
<p>I would also ask the question: Since there are numerous reasons a student might not graduate in the usual 4-5 years, what exactly does it have to do with whether a school is a “best” college? I am sure they have some rationale, because they do for everything. But again, it seems to me there is no strong correlation, and certainly no proven one, between that statistic, however one counts it, and the quality of the undergraduate experience.</p>
<p>^ I’m curious about this as well. I think I understand the reasoning behind including this statistic … colleges should graduate their students. But there’s a world of difference between “We accept almost everyone who applies and most of our students never graduate from any college” (I’m thinking of #96 on the USNWR list) and “We’re a highly selective school with a unique environment, and even though a significant number of our students decide our environment isn’t right for them, virtually all of our students graduate within six years.”</p>