<p>I thought that it would actually go up</p>
<p>why is it falling?</p>
<p>It's not falling; it has been fairly stable in USNWR for a few years, mostly because breaking into the top 20 is difficult and takes time. If you're referring to how Emory is 20th now and used to be 18th for a few years, that doesn't mean anything: Emory's score is 79 while ND and Vanderbilt (tied for 18th) have a score of 80, which is nothing to be concerned about. If you're referring to another ranking, let me know.</p>
<p>We actually should have been 16th in the rankings last year, but the administration misreported our alumni giving rate. No joke!</p>
<p>I think Emory was ranked 9th in 1998(!).
How was that possible back then?</p>
<p>Not sure about that, but really, the rankings are based on so many different factors that there may not be a real, tangible reason why a school rises or falls in the rankings.</p>
<p>I just saw that article about the misreported alumni giving rate.
If that's true, will Emory go up a little bit this year?
BTW, when is the 2007 ranking coming out?</p>
<p>In August.</p>
<p>1998 was a fluke year in which they made graduate research a ridiculously highly weighted component. All of a sudden Caltech shot to #1, Hopkins was #7, and Emory was #9. They realized the ranking flaw, and the ranks quickly settled back to where they have usually been.</p>
<p>In the first year that USNWR published their rankings, Stanford was #1. The ups and downs for different colleges in the past had to do with the way that USNWR kept changing its formula rather than actual changes by the colleges. USNWR has stabilized lately.</p>
<p>You're way off base on this, slipper1234. What makes you think that research was ridiculously highly weighted? I'm pretty sure the only justification for that is that HYP weren't the top three, not any a priori argument from the weights themselves. Indeed, the system in 2000 (your year is off; see <a href="http://thecenter.ufl.edu/usnewsranking.xls">http://thecenter.ufl.edu/usnewsranking.xls</a>) simply changed to using the total spending amount instead of the relative ranking of schools for spending.</p>
<p>That is, here's how it worked before 2000 (the post-2000 change is described below):
[quote]
Consider the data from the 1997 book, the last year the numbers for overall expenditures were posted publicly. Caltech spent the most of any college at $74,000 per student per year, Yale spent the fourth-most at $45,000 and Harvard spent the seventh-most at $43,000. According to the U.S. News formula applied in every single category except for national universities, the absolute rates of spending would be compared and Caltech would be credited with a huge 40-percent category advantage over Yale. Under the formula used solely in this category the difference between Caltech and Yale (first place and fourth place) was counted as essentially the same as the difference between Yale and Harvard (fourth place and seventh place) even with the vast difference in absolute spending.
[/quote]
<a href="http://www.washingtonmonthly.com/features/2000/0009.thompson.html%5B/url%5D">http://www.washingtonmonthly.com/features/2000/0009.thompson.html</a></p>
<p>In 2000, they simply gave Caltech full credit for the dollars per student instead of merely noting that it was the top spender. Now, is that scientifically a more valid way of ranking institutions? It certainly is more intuitive: why would the relative ranking between schools matter as opposed to the actual dollars spent per student? You could still argue that the weights are off, but the point is, there isn't a scientific, methodical system of weighting; it includes quite a bit of arbitrariness. I would argue that the weights are chosen to make sure the "correct" schools end up on top.</p>
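<p>To make that difference concrete, here's a minimal sketch in Python. The dollar figures are the 1997 numbers quoted above; the scoring scheme itself is my own simplified illustration, not US News's actual formula. With only three schools, the ranks come out 1-3 instead of the real-life 1, 4, and 7:</p>
[code]
# Sketch contrasting the two scoring approaches described above.
# Dollar figures are the 1997 numbers quoted from the article; the
# scoring itself is a simplified illustration, NOT US News's formula.

spending = {"Caltech": 74_000, "Yale": 45_000, "Harvard": 43_000}

# Pre-2000 style: only the ordinal rank matters, so each step down
# the list costs the same amount regardless of the dollar gap.
ranked = sorted(spending, key=spending.get, reverse=True)
rank_scores = {school: len(ranked) - i for i, school in enumerate(ranked)}

# Post-2000 style: credit the absolute dollars per student,
# normalized against the top spender.
top = max(spending.values())
value_scores = {school: dollars / top for school, dollars in spending.items()}

for school in ranked:
    print(f"{school:8s} rank-based: {rank_scores[school]}   "
          f"value-based: {value_scores[school]:.2f}")

# Caltech  rank-based: 3   value-based: 1.00
# Yale     rank-based: 2   value-based: 0.61
# Harvard  rank-based: 1   value-based: 0.58
[/code]
<p>Under rank-based scoring, Caltech's edge over Yale (3 vs. 2) counts exactly the same as Yale's edge over Harvard (2 vs. 1); under value-based scoring, Caltech's margin (1.00 vs. 0.61) dwarfs the Yale-Harvard gap (0.61 vs. 0.58), which is precisely the distortion the quote describes.</p>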
<p>From the same article:
[quote]
When Elfin was first charged with creating a ranking system, he seems to have known that the only believable methodology would be one that confirmed the prejudices of the meritocracy: The schools that the most prestigious journalists and their friends had gone to would have to come out on top. The first time that the staff had drafted up a numerical ranking system to test internally--a formula that, most controversially, awarded points for diversity--a college that Elfin cannot even remember the name of came out on top. He told me: "When you're picking the most valuable player in baseball and a utility player hitting .220 comes up as the MVP, it's not right."
[/quote]
[quote]
To Elfin, however, who has a Harvard master's diploma on his wall, there's a kind of circular logic to it all: The schools that the conventional wisdom of the meritocracy regards as the best, are in fact the best--as confirmed by the methodology, itself conclusively ratified by the presence of the most prestigious schools at the top of the list. In 1997, he told The New York Times: "We've produced a list that puts Harvard, Yale and Princeton, in whatever order, at the top. This is a nutty list? Something we pulled out of the sky?"
[/quote]
So my point isn't that the ratings are useless, but that you should be aware that the first 5-10 spots are essentially used to calibrate the rankings to common perceptions. Consequently, when you say that they "realized the rankings were flawed," it had little to do with the methodology being unscientific or flawed and everything to do with the results not aligning with general readers' perceptions.</p>