<p>The new USNEWS database will use the data from the entering class in the Fall of 2004. The data is a year out of date.</p>
<p>Arcadia, thanks for the post. Please note that I used the numbers for the class of 2009 that are available here: </p>
<p>Applications 5,254
Admitted for September 1,241
Admitted Early Decision 261
Enrolling in September 555
Admitted for February 226
Enrolling in February 105 </p>
<p>As for not reporting the February enrollment on the CDS, this would simply mean that the CDS has allowed Middlebury to understate its admission rates in prior years. It seems most logical to view the number of applications (5,254) as covering all successful applicants: the ED, RD, and February admits. There are always small variances, and schools may not report the data uniformly, using different methodologies to report wait-list admits or later admits. </p>
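<p>To make the arithmetic concrete, here is a minimal sketch of the two ways of computing the admit rate from the figures quoted above. Treating the 5,254 applications as the denominator for both September and February admits is the assumption argued for in this post, not an official methodology.</p>

```python
# Middlebury class of 2009 figures quoted above
applications = 5254
admitted_september = 1241   # includes the 261 Early Decision admits
admitted_february = 226

# Admit rate if February admits are left out (as a CDS omission would imply)
sept_only_rate = admitted_september / applications

# Admit rate counting February admits as well (the view argued above)
combined_rate = (admitted_september + admitted_february) / applications

print(f"September only: {sept_only_rate:.1%}")    # about 23.6%
print(f"Including February: {combined_rate:.1%}")  # about 27.9%
```

<p>The roughly four-point gap between the two figures is exactly the kind of "little variance" that makes cross-school comparisons shaky.</p>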
<p>The number of top 10% for 2009 is indeed 84%, but I only listed the numbers for 2008. The focus of this post was to look at the data that will or should show up in the US News 2006 Edition. </p>
<p>However, there is a more noteworthy number in the 2009 academic profile of Middlebury: the reported SAT numbers seem to represent a drop of more than 100 points on a 1600 scale (from about 1430-1440 in prior years to a much lower 1315). </p>
<p>Academic Profile
Top 10% of Class: 84%
SAT 1 Verbal Mid-50% Range 620-710
SAT 1 Math Mid-50% Range 610-690
ACT Mid-50% Range 29-32</p>
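<p>For what it's worth, the ~1315 estimate falls straight out of the mid-50% ranges above. A quick sketch (using range midpoints is my assumption here, not Middlebury's methodology):</p>

```python
# Mid-50% ranges from the academic profile quoted above
verbal_range = (620, 710)
math_range = (610, 690)

# Midpoint of each range, then the composite estimate
verbal_mid = sum(verbal_range) / 2   # 665
math_mid = sum(math_range) / 2       # 650
composite_estimate = verbal_mid + math_mid

print(composite_estimate)  # 1315.0
```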
<p>As I said earlier, I did not see the reason why US News lowered Middlebury in its 2005 Edition. The school had improved in most categories, except for a huge drop in Faculty Resources. Whether this was caused by an erroneous entry is known only to US News. To me, it seems rather strange that faculty resources could change from one year to another, at least to a degree sufficient to cause a drop of 4 positions in an otherwise stable ranking. Well, then there is the peer assessment that explains almost all changes!</p>
<p>Please note that some schools are not very quick in posting data about admissions, and schools such as Carleton are borderline secretive. Others restrict access to their CDS forms to authorized members or do not even bother making the reports available to the public. My own school is one of those culprits! On the other hand, a few schools post extensive and remarkable analyses of the trends of admission at their schools. This latter group should be applauded! </p>
<p>If you happen to have better numbers for 2009, please do not hesitate to send them to me! My most glaring estimates are for Carleton, Harvey Mudd, Colby, Haverford, and Hamilton.</p>
<p>Davidson College 2009 numbers: <a href="http://www2.davidson.edu/admission/apply_icprofile.asp">http://www2.davidson.edu/admission/apply_icprofile.asp</a></p>
<p>Before the rankings come out, I thought I'd share an interesting report I recently read on the pros and cons of the individual measures US News uses to compute its rankings:
<a href="http://www.johnlocke.org/acrobat/pope_articles/collegeranking-inquiry17.pdf">http://www.johnlocke.org/acrobat/pope_articles/collegeranking-inquiry17.pdf</a></p>
<p>We also probably should pull up data on rises in faculty pay, endowments and alumni giving if we're going to predict the new rankings as all are weighted in the US News rankings too.</p>
<p>Thanks for sharing, Carolyn. I agree with their assessment of the rankings in most areas. I will look at the rankings, like most of us, but they don't mean as much to me as they do to some.</p>
<p>"Before the rankings come out, I thought I'd share an interesting report I recently read on the pros and cons of the individual measures US News uses to compute its rankings:
<a href="http://www.johnlocke.org/acrobat/po...g-inquiry17.pdf">http://www.johnlocke.org/acrobat/po...g-inquiry17.pdf</a> "</p>
<p>It would seem to me that, barring output measures that can correct for what students bring into the process (and hence don't represent "value added" on the part of the institution), the best, though imperfect, measures of quality are student surveys such as the NSSE and COFHE surveys (the results of which most schools won't make public), or ones with further flaws such as the Princeton Review's. They are far from perfect, but will likely paint a more accurate picture, at least from the point of view of the students who actually attend. It is very difficult to go beyond that.</p>
<p>I mourn when I see higher and higher selectivity on the part of "top-ranked" institutions because it often means more disappointment on the part of students who could perform as well or better than those who are accepted. Luckily, and due to the huge number of wonderful colleges and universities in the U.S., the disappointment doesn't last all that long.</p>
<p>"We also probably should pull up data on rises in faculty pay, endowments and alumni giving if we're going to predict the new rankings as all are weighted in the US News rankings too."</p>
<p>Carolyn:</p>
<p>My only interest in tracking a few numbers is to understand their relative impact on the rankings. </p>
<p>I am probably one of the harsher critics of the US News rankings. I have posted on this issue ad nauseam, and it is probably a waste of time to rehash my absolute disdain for US News' reliance on subjective data, especially the ultra-manipulated and questionable peer assessment that distorts all other measures. </p>
<p>While I agree that faculty resources, endowments, and alumni giving could have a relative importance to some, those numbers should change at a snail's pace and not have a profound impact on YEAR-to-YEAR changes. I also have little doubt that the figures reported to US News for alumni giving are as subject to manipulation as the previously mentioned peer assessment. Regardless of their relative value, I do not think it worth any of my time to search for these arcane numbers, with the notable exception of the uses of the endowments. For what it's worth, I believe that US News could simply track the year of foundation of schools, give it a weighting of 10%, and it would be more meaningful than half of the current criteria. In the same vein, they could simply assign scores to tuition costs, since for US News higher tuition means higher quality. </p>
<p>It is not hard to see how disingenuous the rankings of US News truly are when the lowest-weighted criteria are:</p>
<ul>
<li>Class Size 50+ 2%</li>
<li>Acceptance rate 1.5%</li>
<li>Percent Full Time 1%</li>
<li>Student/Faculty Ratio 1%</li>
</ul>
<p>Also, do not let the 7.5% weight given to SAT scores fool you: US News made sure to penalize the best-performing schools with their asinine penalty, aka the graduation underperformance numbers. </p>
<p>This is the list of weighted criteria:</p>
<p>Peer assessment 25%
Avg Graduation Rate 16%
Financial Resources 10%
SAT Scores 7.5%
Faculty compensation 7%
Class Size 1-9 6%
HS - Top 10% 6%
Alumni Giving 5%
Graduation Rate Performance 5%
Avg Freshman Retention 4%
Faculty Degrees 3%
Class Size 50+ 2%
Acceptance rate 1.5%
Percent Full Time 1%
Student/Faculty Ratio 1%</p>
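<p>To show how these weights combine, here is a minimal sketch of a weighted composite under the list above. The per-school scores are hypothetical placeholders; US News' actual normalization of each criterion is proprietary and not reproduced here.</p>

```python
# US News criteria and weights as listed above; they sum to 100
weights = {
    "Peer assessment": 25.0,
    "Avg Graduation Rate": 16.0,
    "Financial Resources": 10.0,
    "SAT Scores": 7.5,
    "Faculty compensation": 7.0,
    "Class Size 1-9": 6.0,
    "HS - Top 10%": 6.0,
    "Alumni Giving": 5.0,
    "Graduation Rate Performance": 5.0,
    "Avg Freshman Retention": 4.0,
    "Faculty Degrees": 3.0,
    "Class Size 50+": 2.0,
    "Acceptance rate": 1.5,
    "Percent Full Time": 1.0,
    "Student/Faculty Ratio": 1.0,
}

def composite(scores: dict) -> float:
    """Weighted average of per-criterion scores, each on a 0-100 scale.

    Missing criteria are treated as 0, so a school that maxes out only
    "Peer assessment" still lands at 25.0 overall.
    """
    return sum(weights[k] * scores.get(k, 0.0) for k in weights) / 100.0

# A school scoring 100 everywhere gets a composite of 100;
# one carried entirely by peer assessment gets only 25.
perfect = composite({k: 100.0 for k in weights})
peer_only = composite({"Peer assessment": 100.0})
```

<p>Note how the 25% peer assessment alone outweighs the bottom seven criteria combined, which is the distortion complained about above.</p>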
<p>One of the most "fun" things to do would be to see how well USNWR rankings in the main (I don't just mean in the top 10) correlate with the average family income/assets of attending students.</p>
<p>Colgate WILL move up this year or next... an astounding increase in applications and decrease in admit rate... more than anyone in the top 20. Just unbelievable. Smith and Wellesley too.</p>
<p>Tickle, not so fast. You have to go one year at a time. Let's see if this does not become a parody of the procession of Echternach. </p>
<p>While Wellesley improved in selectivity and increased its number of applications, Smith had the largest decrease in applications (minus 9%) among the twenty schools discussed in this thread. Despite already having the highest admit rate, Smith also had the largest increase in admission rate when its admit rate reached 57%. Colgate also saw a loss in its number of applications as well as a higher admission rate. For those schools, there is nothing to justify a move upwards in the 2006 Edition.</p>
<p>Big "winners" -Grinnell in the top 20, and Rhodes in the top 50. Seat of the pants. No methodology.</p>
<p>What made Claremont McKenna surge in popularity?</p>
<p>"What made Claremont McKenna surge in popularity?"</p>
<p>Could it be the cachet of having Xiggi there? :)</p>
<p>That too, but all of the Claremonts are gaining in popularity; students are applying to 2-3 of them, and the safety schools there, such as Pitzer, are no longer safeties.</p>
<p>I hope Wellesley holds onto a spot somewhere in the top 5- and I think they'll stay where they are at #4 :).</p>
<p>Let's take a different tack for a moment --- if we were on the consulting board for US News' ranking system, what additional or other factors would you consider?</p>
<p>I'd add factors related to OUTCOMES: perhaps the percentage of kids with full-time jobs within 6 months of graduation, the percentage going on to graduate school, average starting salaries of graduates, and the NSSE data mini mentioned (hard to boil that down to one number, though). Maybe a comparison between the average incoming SAT scores and the outgoing graduate-exam scores of graduating seniors would also be interesting.</p>
<p>I'm seeing a lot of colleges y'all have suggested could move up. Which means some colleges will have to move down. Any predictions for downward movement?</p>
<p>How could the US News change?</p>
<ol>
<li> They could recognize that the TITLE of their ranking is a misnomer. Since they are not able to measure the QUALITY of the schools, they have no business pretending to designate the BEST schools in the country.<br></li>
</ol>
<p>Since the book is MOSTLY used by families during the ADMISSION process, the main subject should be exactly that: how difficult it is to gain admittance to a school. How many people REALLY read US News to ascertain the QUALITY of the education after being admitted? </p>
<ol start="2">
<li><p>The pages that do not contain "rankings" are well worth reading. The problems are in the beginning pages. I would prefer them to offer a number of rankings. Since the focus of their rankings is on the subjective area, they could come up with a ranking for the "most reputable" 100 schools divided between universities and LAC's. In this "REPUTATION" ranking, their dubious peer assessment would shine along with a number of other subjective criteria. </p></li>
<li><p>The remaining measurable data on ADMISSIONS could be organized in a manner similar to the current one. </p></li>
<li><p>The data on QUALITY could be presented in a separate ranking, aptly labeled "The US News view on quality." </p></li>
</ol>
<p>This way, everyone could focus on the pages and ranking they like. </p>
<p>I also believe that there is a lot of merit in analyzing the "exit" data, but that should be an entirely different process.</p>
<p>I agree... 57% is quite high to be in that position for USNEWS...</p>