how many of you whose first choice is princeton are applying to Yale early anyway?

<p>just wondering...</p>

<p>How many of you whose first choice is Princeton APPLIED EA to Yale anyway?</p>

<p>well, i just received my likely letter from Princeton, so i'm not worried at all about early deadlines!!</p>

<p>When I first decided to apply early at Yale, Princeton was my first choice. However, after reading so many great things about Yale, and after figuring out that Yale has a better chemistry department than Princeton, Yale became my first choice and Princeton my second.</p>

<p>What makes you think the chem dept is better at Yale?</p>

<p>okay i may sound like a total noob right now, but what is a likely letter?</p>

<p>It is a letter sent to applicants early in the admissions process essentially telling them that they'll be admitted. It's used to lure athletes and other high-priority targets. See <a href="http://www.collegejournal.com/forms/printContent.asp?url=http%3A//www.collegejournal.com/aidadmissions/newstrends/20030127-chaker.html">here</a> for more information.</p>

<p>Weasel, according to ISI/Sciencewatch's University Science Indicators online site, Yale is ranked #1 in chemistry, just above #2 Caltech, #3 Harvard, #4 MIT, #5 UCSD. Harvard, Yale and MIT are the top 3 in the Academic Analytics chemistry ranking as well. Princeton is not in the top 10 in either, although it also has a strong program.</p>

<p>According to ISI/Sciencewatch's most recent University Science Indicators study, released just last month, if you calculate each university's average "placement" across the rankings for all scientific fields (physical and biological sciences taken together), you get the following:
1. Yale (average ranking of 2.67)
2. MIT (average ranking of 3.00)
3. Harvard (average ranking of 3.80)
4. Princeton (average ranking of 4.40)
5. Stanford and Penn (average ranking of 5.00)
7. UCSD (average ranking of 5.33)
8. Caltech (average ranking of 5.60)</p>

<p>The above is a pretty compelling list of the top scientific institutions in the United States.</p>
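<p>In case anyone wants to see the arithmetic behind an "average placement" figure like the ones above, here is a minimal sketch of one way to compute it. This is just my own illustration, not the study's actual methodology, and the per-field ranks in the dictionary are made-up placeholders rather than the real ISI/Sciencewatch numbers.</p>

```python
# Minimal sketch: average a school's rank "placement" across several
# per-field rankings. All numbers below are invented placeholders,
# NOT the actual ISI/Sciencewatch data.
from statistics import mean

field_ranks = {
    # field: {school: rank of that school in the field}
    "chemistry": {"Yale": 1, "Caltech": 2, "Harvard": 3},
    "physics":   {"Caltech": 1, "Harvard": 4, "Yale": 5},
    "biology":   {"Harvard": 2, "Yale": 3, "Caltech": 8},
}

def average_placement(school):
    """Average the school's rank over every field in which it appears."""
    ranks = [by_school[school]
             for by_school in field_ranks.values()
             if school in by_school]
    return mean(ranks)

# Lower is better: sort schools by their average placement.
schools = {s for by_school in field_ranks.values() for s in by_school}
for school in sorted(schools, key=average_placement):
    print(f"{school}: average placement {average_placement(school):.2f}")
```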

<p>thanks, posterx, now I don't have to search for those statistics anymore!</p>

<p>hbart724! WOW how did you get a likely letter from princeton?</p>

<p>In response to posterX's misleading and somewhat inappropriate set of statistics, I highly advise interested viewers to read PtonGrad2000's incisive response.</p>

<p>Here's a link directly to PtonGrad2000's post: <a href="http://talk.collegeconfidential.com/princeton-university/367958-princeton-techie-school-2.html#post4445142">http://talk.collegeconfidential.com/princeton-university/367958-princeton-techie-school-2.html#post4445142</a></p>

<p>If you do not wish to be misled by posterX's statistics, this is a must read.</p>

<p>Readers who are new to these boards should learn to be skeptical of virtually everything certain posters write. In the case of one, you’ll be guaranteed that no matter which ‘statistics’ or ranking systems are touted, Yale will come out on top or close to the top.</p>

<p>Now the only way to guarantee this remarkably consistent result is to be highly selective about which rankings to claim as authoritative and then refer to old rankings when new ones aren’t as favorable.</p>

<p>Our friend has given an excellent example of both of these techniques. </p>

<p>First, ISI/Sciencewatch’s studies are not new as posterX suggests. They were last reported about a year ago and a new analysis won’t be done until the year 2010. This analysis is actually a study of faculty publication rates and the frequency with which those scientific articles are cited by other researchers. You can read more about it here:</p>

<p><a href="http://www.sciencewatch.com/nov-dec2006/sw_nov-dec2006_page1.htm%5B/url%5D"&gt;http://www.sciencewatch.com/nov-dec2006/sw_nov-dec2006_page1.htm&lt;/a&gt;&lt;/p>


<p>Well…not so fast, posterX. Here’s a good example of ignoring the current rankings when the older ones are more favorable to Yale. In the 2007 rankings for Chemistry published by Academic Analytics, Yale is not even in the top ten. The top three THIS year are MIT, Harvard and the University of Illinois at Urbana-Champaign. Here are the top ten:</p>

<p><a href="http://chronicle.com/stats/productivity/%5B/url%5D"&gt;http://chronicle.com/stats/productivity/&lt;/a>
<a href="http://chronicle.com/stats/productivity/page.php?year=2007&primary=4&secondary=40&bycat=Go%5B/url%5D"&gt;http://chronicle.com/stats/productivity/page.php?year=2007&primary=4&secondary=40&bycat=Go&lt;/a&gt;&lt;/p>

<p>2007 Chemistry Rankings from Academic Analytics</p>

<p>1—MIT
2—Harvard
3—U. of Illinois at Urbana-Champaign
4—U. of California at San Francisco
5—U.C. Berkeley
6—Scripps Research Institute
7—UCLA (tied)
7—Stanford (tied)
9—Northwestern
10—U.C. Berkeley</p>

<p>Now, in case anyone is wondering if I made a mistake by listing Berkeley twice, the answer is I did not. Throughout their reported rankings, there are mistakes of this sort, undermining their usefulness.</p>

<p>A good article about this most recent release is provided by the Chronicle of Higher Education. </p>

<p><a href="http://chronicle.com/weekly/v54/i12/12a01001.htm%5B/url%5D"&gt;http://chronicle.com/weekly/v54/i12/12a01001.htm&lt;/a&gt;&lt;/p>

<p>Here are a few excerpts. You’ll need to be a subscriber to see the entire article:</p>


<p>Both this year’s story in the Chronicle of Higher Education and its previous story point out the very questionable nature of Academic Analytics’ work.</p>

<p>As I’ve said before, wait for the imminent release of the newest study of the National Research Council in a few months. It is the only ranking of PhD programs that is universally acknowledged as authoritative.</p>

<p>By the way, in the last National Research Council study, Yale is also not in the top ten in chemistry. We’ll see what they say in February.</p>

<p>(<a href="http://www.grad.berkeley.edu/publications/pdf/nrc_rankings_1995.pdf">http://www.grad.berkeley.edu/publications/pdf/nrc_rankings_1995.pdf</a>)</p>

<p>I should say in regard to the question about chemistry at Princeton that while it’s very good, it’s not considered one of the top programs in the country. It’s stronger in organic and biochemistry but weaker in inorganic and physical chemistry.</p>

<p>Finally, allow me to note that it would be a full time job to follow posterX around these boards to correct bad or badly biased information so I’ll refrain from doing so. As I’ve said before, however, we won’t allow him to get away with these antics on the Princeton board.</p>

<p>where did you get the chemistry ranking? and is that for undergrad or grad?</p>

<p>There is no need for personal attacks. Anyhow, thanks for pointing out that new AA data was released; however, I don't believe the 2007 set was actually published in a front-page, full-issue COHE feature like the 2006 data was. Looking through it, it also seems that there are some serious reporting problems in the 2007 AA data, but maybe it's just that the website is flawed.</p>

<p>In any case, it is interesting and I agree with you that the individual measures aren't necessarily that reliable. I pointed this out in detail in my last reply to you (which someone linked to above), noting that it might be better to aggregate several rankings together -- to take the average of ISI, AA, and NRC, for example -- rather than rely on just one. See my earlier reply: <a href="http://talk.collegeconfidential.com/princeton-university/367958-princeton-techie-school-2.html#post4445427">http://talk.collegeconfidential.com/princeton-university/367958-princeton-techie-school-2.html#post4445427</a></p>

<p>One way that the 2007 AA numbers might be useful is to look at the total number of "top 10" appearances in each category. For example,</p>

<p>Biological and Biomedical Sciences (including biochemistry)</p>

<p>Number of subfields appearing among the top 10, by institution:</p>

<p>Yale 12
Columbia 11
Duke 10
Harvard 10
Johns Hopkins 10
UCLA 10
UW-Madison 9
Stanford 8
UNC-Chapel Hill 8
NYU 7
Michigan 6
Cornell 5
WUSTL 5
UPenn 5
Minnesota 5
Chicago 4
UC-Berkeley 4
Caltech 3
MIT 3
Brown 2
U-Texas 2
Princeton 2</p>
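<p>For what it's worth, a tally like the one above can be reproduced by simply counting, for each institution, the number of subfields whose top-10 list includes it. Here is a small sketch of that counting step; the subfield lists below are invented placeholders, not the actual 2007 Academic Analytics data.</p>

```python
# Sketch: count how many subfields place each institution in the top 10.
# The lists below are invented placeholders, NOT the real 2007 AA data.
from collections import Counter

top10_by_subfield = {
    # subfield: institutions appearing in its top 10 (order irrelevant here)
    "biochemistry": ["Yale", "Harvard", "Duke", "UCLA"],
    "genetics":     ["Yale", "Columbia", "Johns Hopkins", "Harvard"],
    "neuroscience": ["Columbia", "Yale", "Stanford", "Duke"],
}

appearances = Counter(
    school
    for top10 in top10_by_subfield.values()
    for school in top10
)

# Most appearances first, mirroring the tally format above.
for school, count in appearances.most_common():
    print(school, count)
```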

<p>[quote]is that for undergrad or grad?[/quote]</p>

<p>All of these rankings purport to measure the quality of the faculty (in terms of research productivity, citations, reputation, etc.) in each department or subfield. The application of this measure to the quality of undergraduate education is necessarily subjective. For the most part, it's less important per se than the overall climate toward undergraduates. It is important, however, if you want to take seminar or graduate courses or to do significant undergraduate research. In either case, studying under a renowned professor will be of great benefit.</p>

<p>Incidentally, as PtonGrad2000 pointed out, the most respected and reliable ranking of departments is the one from the NRC. That ranking was last published in 1995, so the available data are somewhat obsolete. However, a new NRC ranking will be released very soon, so I believe it would be profitable to postpone this debate until we see what the results are.</p>

<p>In the meantime, I think Brian Leiter offers the most compelling and lucid reasons to be skeptical about faculty citation rankings such as many of the ones PosterX is referring to. Granted, he's discussing law schools, but the general ideas hold.</p>

<p>
[quote]
First, there is the industrious drudge: the competent but uninspired scholar who simply churns out huge amounts of writing in his or her field. Citation practices of law reviews being what they are, the drudge quickly reaches the threshold level of visibility at which one is obliged to cite his or her work in the obligatory early footnotes of any article in that field. The work is neither particularly good, nor especially creative or groundbreaking, but it is there and everyone knows it is there and it must be duly acknowledged. </p>

<p>Second, there is the treatise writer, whose treatise is standardly cited because like the output of the drudge it is a recognized reference point in the literature. Unlike the drudge, the authors of leading treatises are generally very accomplished scholars, but with the devaluation of doctrinal work over the past twenty years, an outstanding treatise writer—with a few exceptions—is not necessarily highly regarded as a legal scholar.</p>

<p>Third, there is the “academic surfer,” who surfs the wave of the latest fad to sweep the legal academy, and thus piles up citations because law reviews, being creatures of fashion, give the fad extensive exposure. Any study counting citations, depending on when it is conducted, runs the risk of registering the "impact" of the fad in disproportion to its scholarly merit or long-term value or interest.</p>

<p>Fourth, there is work that is cited because it constitutes “the classic mistake”: some work is so wrong, or so bad, that everyone acknowledges it for that reason. The citation and organizational preferences of student-edited law reviews exacerbate this problem. Since the typical law-review article must first reinvent the wheel, by surveying what has come before, the classic mistake will earn an obligatory citation in article after article in a particular field, even though the point of the article may be to show how wrong the classic mistake is. True, some authors of classic mistakes may have excellent reputations; but who among us aspires to be best remembered for a "grand" mistake?</p>

<p>Fifth, citation tallies are skewed towards more senior faculty, so that faculties with lots of “bright young things” (as the Dean of one famous law school likes to call top young scholars) won’t fare as well, while faculties with once-productive dinosaurs will. On the other hand, by looking only at citations since 2000, we have reduced the distorting effect of this factor.</p>

<p>Sixth, citation studies are highly field-sensitive. Law reviews publish lots on constitutional law, and very little on tax. Scholars in the public law fields or who work in critical theory get lots of cites; scholars who work on trusts, comparative law, and general jurisprudence do not.

[/quote]
</p>

<p>Which is not to say we should disregard these findings. They are imperfect but still informative. Since we'll soon have the NRC rankings, however, it's somewhat silly to jump to conclusions right now.</p>

<p>The NRC rankings are even more biased, in certain ways. Many researchers have written about exactly how and why they give preference to larger programs. You will see, for example, that Caltech often doesn't show up near the top of the NRC rankings even though it clearly has top-notch departments (as ISI/Sciencewatch shows), probably because it is a smaller school.</p>

<p>The only reason they are "anticipated" is that they were among the first to be released.</p>

<p>I suspect that such criticisms have been taken into account in designing this latest ranking, although, of course, we'll have to wait and see. In any case, I would still argue that the NRC holistic rankings are superior to the raw citation-based rankings, although again I'll wait until the report is released to give a definitive judgment.</p>

<p>Edit: they have revised the methodology, actually.</p>

<p>No personal attacks here, unless you consider correcting distortions and pointing out a pattern of bias to constitute a personal attack.</p>


<p>Here we have another great lesson. Speaking of selective statistics, any thoughtful reader attempting to avoid being misled by the above poster would note that the list he or she cites is obviously heavily weighted toward universities with medical schools. Most of the programs being evaluated in this grouping exist ONLY at those schools, thus the reason for the larger absolute numbers. Princeton, as our friendly poster has failed to point out, has no medical school and thus does not even offer most of the medical school related subfields being evaluated.</p>

<p>On the subject of the NRC rankings, they are considered the gold standard of PhD evaluations, though, of course, no system of evaluation is perfect. Still, there is no serious disagreement within the academic community about this.</p>

<p>As for your attempt to discredit the NRC rankings…</p>


<p>Obviously, posterX, you haven’t looked at the NRC rankings. Caltech, despite its small size and specialization, was ranked second in the nation in the physical sciences and mathematics and third in the nation in engineering. Despite its narrow academic focus, it is so strong in the sciences that it STILL comes in 10th overall on the list of universities with the highest number of distinguished programs, even though it doesn’t even have a chance to compete in most of those fields. Here are the top ten:</p>

<p>1—Berkeley (32 distinguished programs)
2—Stanford (28 distinguished programs)
3—Harvard (25 distinguished programs)
4—Princeton (24 distinguished programs)
5—MIT (20 distinguished programs)
6—Cornell (19 distinguished programs)
6—Yale (19 distinguished programs)
8—Columbia (18 distinguished programs)
9—Michigan (15 distinguished programs)
10—Caltech (14 distinguished programs)</p>

<p><a href="http://www.grad.berkeley.edu/publications/pdf/nrc_rankings_1995.pdf%5B/url%5D"&gt;http://www.grad.berkeley.edu/publications/pdf/nrc_rankings_1995.pdf&lt;/a&gt;&lt;/p>

<p>Once again, I would ask all posters to restrain themselves from the blatant misuse of statistics and the misleading use of rankings.</p>