“The formula used … is … highly suspect.”
You could always make your own list using your own formula.
Btw, “caveat emptor” is plain English to the majority of posters on CC.
The formula is suspect because it relies on data that has no verifiable sources for a number of schools. Ask the OP to show the CDS used for Chicago and I might retract the horse manure part.
Since I know he can’t, my post stands. The OP is playing voodoo with numbers for an obvious purpose.
tk posts are suspect, what else is new
I totally agree with xiggi.
With respect to Chicago, the school is also second on a more official national list, “The 600 Smartest Colleges in America” (Business Insider), though that list is based on SAT scores only. Personally, a Chicago ranking of second versus, hypothetically, ten places lower on either list of selectivity would not change my opinion of the school.
I don’t see Penn State. Why is that?
@tk21769:
I’m not going to go so far as xiggi in characterizing your list as “horse manure” (especially after I complimented your contribution to my recent post about the Claremonts…lol!).
But, there is something about the list that just doesn’t seem right…
I am referring to Claremont McKenna, which I know a little about.
I will admit that, as an alum, I am slightly (ever so slightly) biased.
But, I’m putting my ‘objective’ hat on, and - while I don’t doubt the numbers you used-
there is one fact cited that seems incomprehensible to me.
CMC was 22nd in selectivity in 2009, according to Papa Chicken’s formula.
Then, in your calculations, it emerges as 35th this year?
Sorry, but that is not possible. I don’t know whether your weighting system needs to be changed,
and I understand that acceptance rate is not the only measure of selectivity
(my thread highlighted the fact that CMC and Pomona were tied for most selective LAC in the country this year,
using - admittedly - only acceptance rate as a barometer).
But, one has to realize a couple of facts here. First, CMC is by far the youngest college on any list
of top LACs (except for the two tech schools, Mudd and Olin).
Its rise in the ranks of top colleges has been nothing short of meteoric, ESPECIALLY over the last couple of years,
when, by every measure, be it scores, acceptance rate, etc., the College is light years ahead of where it was even a year or two ago. That is one reason why it is unfathomable that it could go from 22nd in '09 to 35th in '15.
Let’s take as examples two great universities with which I am quite familiar, both in California, both on your list, and both ranked higher than CMC in selectivity according to your formula: UC Berkeley (#26) and USC (#34). Now, I am a fan of both of these terrific schools. UC Berkeley is arguably the #1 research university in the country, and my brother, who is smarter than I am, transferred from Pomona to Berkeley. The same goes for USC; in fact, it has made great strides in recent years, similar to Claremont, in the academic standards of its undergraduate students, and I understand it is becoming quite selective.
But, reality is reality. And the reality is, as nearly everyone who is familiar with colleges in these parts knows,
that Claremont McKenna is simply more selective for undergraduate admission than any of the UCs (including Berkeley) and USC. I could probably make the same argument regarding 10-12 others on your list, but decided
to limit this to these two California schools.
I will leave it to others, more skilled than I am in this methodology, to suggest specific changes to your
numbers/formula (PapaChicken? I believe you had kids at Claremont. xiggi? A Claremont grad, no?).
My main point in writing this is not to get into a ‘pissing contest’ over Claremont vs. other schools, because (a)
that is pointless, and (b) every one of these is a great college.
My point is simply: CMC at #22 in '09 to #35 in '15? NOT POSSIBLE!
Finally, below is part of a letter to the editor of CMC Magazine that I wrote, which will appear in the next issue.
Dear Editors:
Recently, I had the honor of being invited to sit in on an all-day meeting
of the Admissions Committee, and it was a real eye-opener! …What really stood out was
the ‘holistic’ nature of CMC’s admissions process…the group spent anywhere
from 5-15 minutes discussing EVERY applicant. Since this was the 2nd or 3rd
round, these applicants had already been ‘vetted,’ and certainly not all
7,500+ applicants were subjected to this level of scrutiny. But the quality
of each person who had made it to this stage of the process was truly
astounding.
I thought about this as I looked at the last issue of the magazine, which
describes the College’s “imperative” to increase access for students.
It became clear to me that, having turned down its fair share of
valedictorians and perfect 800 SAT applicants, CMC is about more than just
the numbers.
There’s a thoughtful process going on to make sure that the best students
are here. I saw it for myself. That group kept pushing each other to
consider the ‘intangible’ qualities of each: Which would be a better fit?
Which one shows ‘leadership potential’ sufficient to gain one of those
‘coveted admit spots’? Clearly, CMC is still attracting the ‘best of the best.’
Xiggi, what I think you are trying to express is that the sources are suspect.
If you also think the formula is suspect, please explain.
Yes, you and I both know there is no CDS for Chicago. I have used the USNWR “Ranking Indicators” entry for each college, except as noted. The Ranking Indicators are posted to pages that do require a subscription. However, as I’ve indicated, the numbers on those pages come from the 2013-14 Common Data Sets. If you have any questions about specific entries, check the 2013-14 CDS. Or I can do it if I have reason to believe an error has occurred (as it did with my Cornell entry).
There may be other entries (besides UChicago’s) that have no CDS behind them. I haven’t noted those, but Xiggi, if you’d like to perform a public service by identifying them, knock yourself out. If you have any other reason to believe that some of the source numbers are factually wrong (as one poster did in the case of Trinity College), you’re welcome to point out the suspected errors and help track down the correct information.
What “obvious purpose” would that be?
In another recent thread, a poster suggested constructing a list such as the one I’ve offered.
http://talk.collegeconfidential.com/college-search-selection/1766153-top-ten-lacs-are-their-safeties-p3.html
(post #37)
I suggested that such a list already had been developed, by PapaChicken in 2009.
Another poster responded,
“The information is too out of date to be useful.”
http://talk.collegeconfidential.com/college-search-selection/1766153-top-ten-lacs-are-their-safeties-p4.html
(posts #50 & #51)
See post #1 (the note under “Included Colleges”).
Penn State is tied for #48 in the USNWR National Universities ranking.
I did calculate a score for it.
It would have been #104 in my list (behind Soka and Yeshiva).
Regarding Claremont McKenna, I think the 2009 list would have included data from a year for which the school has since admitted it was reporting false data. Bucknell and Emory were also reporting falsely or unreliably around this time. Washington & Lee currently reports its acceptance rate using questionable standards.
Pretty sure the Penn State poster knew and reluctantly accepted that the school simply got boxed out of this list of only 101 very selective colleges.
Re: Claremont McKenna (post #26 above).
stagalum, CMC’s individual sub-ranks are as follows:
55th for % of students in HS top 10% (74%, identical to Middlebury)
30th for combined SAT-M + SAT-CR (1320-1510, ~= Georgetown, Wellesley)
14th for admit rate (11.7%, between Caltech and UPenn)
I took the numbers in parentheses above (74%, 1320-1510, 11.7%) from CMC’s USNWR “Ranking Indicators” page
(which was more convenient for me to use than hunting around for each CDS). I’ve now checked them against the CMC 2013-14 CDS. I do not see any discrepancies in these figures between the CDS and the “Ranking Indicators”.
I’ve used the same weighting PapaChicken described in 2009 (50% for SAT, 40% for class rank, 10% for admit rate).
I ranked each of these factors separately (1-100), then summed up the 3 weighted ranks, then sorted the results.
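For anyone who wants to reproduce or tweak the list, here is a rough sketch of that calculation in Python. The three schools and their figures below are placeholders, not entries from my list; to rebuild the actual ranking you would substitute the CDS / “Ranking Indicators” numbers for all 101 colleges.

```python
# Sketch of the PapaChicken-style selectivity score.
# Weights: 50% SAT, 40% HS top-10% share, 10% admit rate.
# The school names and numbers here are placeholders, not real data.

def rank_positions(pairs, higher_is_better=True):
    """Map each school to its rank (1 = best) on a single factor."""
    ordered = sorted(pairs, key=lambda kv: kv[1], reverse=higher_is_better)
    return {school: pos for pos, (school, _) in enumerate(ordered, start=1)}

schools = {
    # school: (SAT CR+M midpoint, fraction in HS top 10%, admit rate)
    "College A": (1480, 0.95, 0.080),
    "College B": (1415, 0.74, 0.117),
    "College C": (1390, 0.97, 0.180),
}

sat_rank   = rank_positions([(s, v[0]) for s, v in schools.items()])
top10_rank = rank_positions([(s, v[1]) for s, v in schools.items()])
admit_rank = rank_positions([(s, v[2]) for s, v in schools.items()],
                            higher_is_better=False)  # lower admit rate is better

# Weighted sum of the three per-factor ranks; a lower total means more selective.
scores = {s: 0.5 * sat_rank[s] + 0.4 * top10_rank[s] + 0.1 * admit_rank[s]
          for s in schools}

for school, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{school}: {score:.2f}")
```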
The following schools dropped 10 or more positions since 2009:
Swarthmore, Claremont McKenna, Davidson, Brandeis, Oberlin, Grinnell, Lehigh, Mt. Holyoke, Whitman, USNA.
Note that almost all of them are LACs.
Your methodology puts too much weight on high school class rank. The schools that rank are mostly public schools, where a high rank is much easier to attain. William & Mary reports that 32% of students score between 24-29 on the ACT, yet 80% of students are in the top 10% of their class. This suggests that the high schools its students come from are not that competitive on average. I would also assume that a much larger proportion of its students come from public schools, where a top 10% ranking is not so hard to get on average.
Likewise, UCLA reports an average ACT of 28 and a whopping 97% of its students in the top 10% of their high school classes. What does that tell you?
Yet you have it ranked alongside Washington & Lee, which has a 31 ACT average.
To put so much weight on such a dubious metric doesn’t make a lot of sense.
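To make the concern concrete, here is a toy comparison using the same 50/40/10 weighting applied to per-factor ranks. The ranks below are invented purely for illustration, not taken from the actual list:

```python
# Toy illustration (invented per-factor ranks, not real schools):
# with 40% of the weight on class rank, a big class-rank edge can
# outweigh a clear test-score edge.

school_X = {"sat_rank": 10, "top10_rank": 60, "admit_rank": 20}  # stronger scores
school_Y = {"sat_rank": 35, "top10_rank": 5,  "admit_rank": 30}  # near-universal top-10%

def total(r):
    # Lower total = ranked as more selective.
    return 0.5 * r["sat_rank"] + 0.4 * r["top10_rank"] + 0.1 * r["admit_rank"]

print(total(school_X))  # 0.5*10 + 0.4*60 + 0.1*20 = 31.0
print(total(school_Y))  # 0.5*35 + 0.4*5  + 0.1*30 = 22.5 -> Y finishes ahead of X
```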
The following 8 colleges, as far as I can tell, do not publish Common Data Sets:
UChicago
Columbia University
Boston College
NYU
US Air Force Academy
US Naval Academy
Bard College
Soka University
PapaChicken would have had the same issue with his 2009 sources (presumably for all/most of these colleges, and for others as well.)
Certainly it is very desirable to have scores and data presented in a common format and subject to a single set of reporting guidelines. However, my understanding is that all CDS data is reported by the colleges. It is not (as far as I know) independently verified.
W&L apparently has been including incomplete applications in its admission rate calculations.
www.washingtonpost.com/local/education/washington-and-lee-counts-incomplete-applications-amid-debate-over-college-data/2013/09/16/17fa0a88-1714-11e3-be6e-dc6ae8a5b3a8_story.html
Several other colleges cited in the report apparently have followed similar practices.
These issues are among the many reasons why one shouldn’t put too much emphasis on one or a few metrics. Even without nefarious intent, it is easy for errors (or varying interpretations of the reporting guidelines, or the calculation methods) to distort the results. In my opinion, the appropriate response to these problems is to keep plugging away to improve the information. Look for apparent anomalies in the patterns (for the current year or across time), just as we’re doing here; try to track them down and correct as needed. Then compare various metrics to see if they are mutually corroborating.
Here’s an alternate ranking by SAT scores alone:
http://www.stateuniversity.com/rank/sat_75pctl_rank.html
Here’s one by ACT scores alone:
http://www.stateuniversity.com/rank/act_75pctl_rank.html
@tk21769 you need to use averages, not such a narrow band. That is the only way to evaluate. This is why the US News rankings have so many ties in the top 25 of the two rankings.
Most high schools (and, as you note, in particular private schools) don’t report class rank, which makes it hard to assess the statistical significance of small differences in this stat.
At Williams, only 31% of applicants report a class rank. At Pomona, this number is 23%. At Caltech, 33%.
A few additional institutions that may be worthwhile exceptions to consider:
Coast Guard
Merchant Marine
Webb
Villanova
“UCLA reports an average ACT of 28 and a whopping 97% of its students from the top 10% of their HS classes. What does that tell you?”
I am not entirely familiar with the University of California system, but I think that for some of the campuses, standing within the top 10% of one’s high school class is a virtual requirement for admission. If this is the case, then other factors, whether standardized or holistic, may carry less weight.