<p>Wildlion, </p>
<p>If they have three separate admissions processes, shouldn't they submit the data separately? If they have one admissions process, I would think they should pool the data. Am I oversimplifying?</p>
<p>Understood, WildLion82. I was just correcting the impression that Barnard's statistics aren't reported. I don't want to get into the "Barnard v Columbia" debate at all!</p>
<p>
[quote]
I understand that the term "selectivity" might not please everyone, but the USNEWS uses it to clearly represent ONE thing: the selectivity of admissions. They measure the statistical "quality" of the freshman class for standardized tests AND high school performance, and then the admission ratio. Nothing more, nothing less. It is USNews' criteria, but a reader has the luxury to pay attention to it or ... simply dismiss it. Its meaning, however, is very, very straightforward.
[/quote]
I wasn't referring to USNews' system when I opined that SLC was extremely selective, but rather to my daughter's own experience in not being admitted, and to how many of us use the term in the vernacular. Even if you use other rankings (and I don't have access to USNews' since I refuse to pay them), SLC is rated as much more selective than its admission offers would suggest. According to Princeton Review, SLC admits nearly half its applicants (46%) and yet carries a selectivity rating of 90. PR uses a different set of variables for its selectivity rating, one which does account for the self-selecting nature of the applicant pool:
[quote]
Admissions Selectivity Rating
This rating measures how competitive admissions are at the school. This rating is determined by several institutionally-reported factors, including: the class rank, average standardized test scores, and average high school GPA of entering freshmen; the percentage of students who hail from out-of-state; and the percentage of applicants accepted. By incorporating all these factors, our Admissions Selectivity Rating adjusts for "self-selecting" applicant pools. University of Chicago, for example, has a very high rating, even though it admits a surprisingly large proportion of its applicants. Chicago's applicant pool is self-selecting; that is, nearly all the school's applicants are exceptional students.
[/quote]
Also, I wasn't suggesting that one student's stats could be considered data, but simply that my daughter's experience in talking with others at her school is that while SATs did not enter into the admissions process, that did not mean that kids with high SATs did not apply to---or attend---SLC. The reference to an anomaly was to suggest that my D's SATs would not be thrown out as an extreme for the sake of statistical "accuracy"; but since the school elects not to record SAT data, anecdotal evidence is all we've got to go on. It ain't data, but it does strongly suggest that what USNews has passed off as data for the past three years is itself highly suspect---and perhaps even lower than the actual figure, though this cannot be proven either way. But to suggest that USNews' assumption of a 200-point (33%!) penalty for non-reporting is somehow accurate is not only disingenuous but also pretty obviously false.</p>
<p>Curious,</p>
<p>You might be oversimplifying, although you're not necessarily wrong. Unless I misunderstand, it seems as though what you're proposing is that the central administration take a "hands-off" approach and allow each separate undergraduate school to report or not report as it pleases. I'm not sure about Cornell, but I know that Columbia does not give SEAS and GS the option to report. CC alone has that privilege. </p>
<p>In this case, it would still be necessary to standardize exactly who reports and what they report, and it would have to be noted in the rankings when data is "incomplete." Personally, I would like to see the central administration (or the school's public relations department, even) be responsible for gathering the data, and US News should specify that data is due for all undergraduate colleges. </p>
<p>Essentially, it comes down to defining what, exactly, is a "separate" admissions process. In my view, the nature of multi-school undergraduate programs--which describes most of the National Universities category--is that there are slight differences in the standards and criteria among the different schools, but applicants are ultimately part of the university in question. </p>
<p>It seems as though ranking a university based on only one of its undergraduate schools is a bit like ranking a football team based only on its secondary: there's nothing wrong with that sub-group, but it may not be an accurate representation of the big picture. The magazine claims to rank universities, not simply undergraduate liberal arts divisions of universities. </p>
<p>Chedva,</p>
<p>Haha--yeah, I've made the mistake of engaging in that Columbia-Barnard debate one too many times. Always a mess. :-)</p>
<p>
[quote]
I am surprised that no one has mentioned the separate rankings for universities and LACs. There is clearly a lot of overlap and there are some institutions which are in between.
[/quote]
That's because LAC's are often invisible and the subject of another furious debate. However, I think self-i.d. works pretty well... I have no problem accepting Tufts (5,000) as a U. and Smith (2,600) as an LAC. I remember digging into this once, and there were few if any institutions in the middle zone between the two. And the Roman Empire fell in A.D. 476, give or take a few years, depending upon how you define it.</p>
<p>Like many here, I used USNWR - especially when researching schools that had been recommended to us that I had previously known nothing about. Finding a school on the list was reassuring - they must be doing something okay.</p>
<p>I just checked out the link someone posted on page 1 of this thread, with the PhD rankings that let you weight the criteria. First, I tried it for linguistics, my D's potential major. It came up with a lot of schools I wouldn't have expected, but it included quite a few that are ranked high by other rankings.</p>
<p>Then I tried it for music, which is S2's major. Interestingly, the criteria they gave me to weight had nothing to do with what he's looking for. He does not care whether his teachers even graduated from undergrad, let alone earned advanced degrees. (His current teacher at Juilliard did not graduate from college.) He does not care whether they did research. They didn't ask about the quality of the orchestra, or the number of students gaining employment after graduation.</p>
<p>So the resulting list had no schools whatsoever that are even on his radar at this point.</p>
<p>This exercise to me was eye-opening in that rankings are often done based on criteria that mean nothing to me or my kid or what they want out of college.</p>
<p>Binx, your point is an important one and is something that is easily overlooked. These rankings are for mass consumption but are by no means the exhaustive list of excellent schools for every graduating senior.</p>
<p>Obviously, somebody looking to major in music at the Juilliard level is not going to give a hoot about almost any of the top 100 "National Universities." </p>
<p>I think another point that gets missed is that the majority of college-bound seniors--many of them extremely talented and realistic candidates for top schools--don't have designs on anything other than their state's flagship university, if not their local branch campus. The regional school rankings, where a lot of local schools and hidden gems lie (many of them typically more generous with merit aid than their bigger counterparts), could probably be more helpful to most people than the big national rankings.</p>
<p>That being said, the target audience for US News is the College Confidential crowd and its somewhat narrow group of peers, not John or Jane HS Senior in Sioux Falls, SD, who is content to take a full ride at his or her local state campus.</p>
<p>ProudDad, I'm afraid that understanding how the Princeton Review ratings are collected and composed might lift the veil on the integrity of that outfit. It is one thing to publish a methodology and another to be able to sell the pseudo-science behind it. </p>
<p>To understand the absolute idiocy of the PR rankings, one has to go to their foundation: a series of uncontrolled, unscientific, and unverifiable surveys. In keeping with the democratic mantra, one could vote early and ... often. </p>
<p>Here's an example from a couple of years ago: the PR surveys saw nothing ridiculous in assigning a rating of 99 to UC Davis and a 98 to Chicago. </p>
<p>This said, a ranking of 90 (equal to Kalamazoo and between the 92 of Agnes Scott and the 88 of Earlham) is about in line with USNews ranking that places SLC around the 70th LAC position, or at the start of the second third of the 200+ schools in that category.</p>
<p>PR's methodology is deeply flawed and yet its qualitative descriptions of schools as gleaned from the aggregate comments seem to map closely to what I've experienced when visiting. Imperfect, certainly...but then so are overnight visits.</p>
<p>Wildlion,</p>
<p>I know there is no right answer on this because the USNWR data serves multiple purposes. But to me, one big value of the data is giving a quick fix on the probability of admission and the academic quality of the other students.
If the colleges run independent admissions processes, and if the students tend to go to classes primarily with others from the same college, then having each college submit separate data to USNWR seems the way to go. My analogy, and I know it's not perfect, is the Claremont Colleges. It's basically one campus and many students cross-register, but they run separate admissions offices and most classes are taken with other members of their college, so all the Claremont Colleges submit data separately.</p>
<p>That's actually a very good analogy and one that I've considered before. I see the Claremont Colleges, however, more analogous to a state school system, wherein you have a number of independent, self-sufficient schools with varying strengths, weaknesses, and selectivity. </p>
<p>It's a valid argument, but I still have a hard time viewing Columbia, or any other single-administration school, that way. Plus, my classes at Columbia were filled with students ranging from General Studies transfers to dissertation-level PhD students and everything in between (and I was in the Graduate School!). One of my courses even had a random Juilliard student (it was an advanced foreign language course). </p>
<p>I'm not so sure that you get this same overlap at a system like the Claremont Colleges. I'm sure there is some, though.</p>
<p>
[quote]
I don't think it will hurt SLC at all. The type of student who makes decisions primarily on the basis of the rankings isn't the type of student that would fit at SLC. SLC is unique, and it has nothing to do with rankings.
[/quote]
I think a fall off the first page of the LAC rankings could cause SLC to be overlooked by students who are not familiar with the program but would be interested, simply because it is not on that page (I think US News now lists the top 100 on one page). I know that 7 years ago my son's college search began with him requesting that I compile a list for him of "all" the LAC's in the country. Obviously I couldn't list "all" and I had to narrow it down -- but I remember the initial list I gave him included about 75 schools, and he insisted on reading something about every single one. Fortunately we weren't relying on the US News list in any case... but it would have saved me some work if I had known about it. </p>
<p>However, that's why I think a "Could Not Rank" listing would be fine -- it also might be missed, but I think that it is more likely the SLC "types" would look at the "could not rank" list, and presumably all colleges would be on one page.</p>
<p>
So by that measure, Juilliard would not be "extremely selective" because they don't focus on GPA and SAT scores? </p>
<p>US News can use whatever methodology it wants, but it can't rewrite the dictionary. A college which turns away a significant number of applicants for every one it admits is "extremely selective" no matter what standard it uses, as long as it is using some standard relevant to distinguishing among its applicants. </p>
<p>I'd agree that SLC doesn't fit that definition in any case -- as in recent years it has accepted 45% of its applicants -- I just take issue with the idea that US News can define selectivity by interposing criteria that the college may opt not to use, like test scores or class ranking.</p>
<p>
But that just illustrates how ridiculous the "rankings" are, because Barnard and Columbia students are getting essentially the same educational experience. </p>
<p>There are two significant differences between the colleges, which affect academics:
1) Columbia has a strong core curriculum which all students are required to take. Barnard has very broad distribution requirements ("9 Ways of Knowing").
2) Barnard has a stronger faculty advising system; its first-year students are able to work directly with faculty members in planning their programs, and the faculty are very accessible to students. </p>
<p>Everything else is the same: course enrollment is fully integrated; while some departments are separate, many are integrated and work closely together in shaping courses and curricula; student life is essentially the same, with shared athletics and shared clubs & organizations; students at both colleges have equal access to dining facilities, athletic facilities, etc. </p>
<p>So -- while there are subtle differences in the experience of students at each campus, overall the academic and social experience is pretty much the same. If rankings provide information about the quality of the educational experience and/or quality of life, then it's hard to see how two partner colleges with shared resources could be "ranked" much differently from one another. If Columbia is a top 10 college, then Barnard should be very near to that. (I can see an argument that the core makes Columbia's academic program a tad stronger, so a few ranking points' differential might be supportable... but it makes no sense that students taking the same courses from the same profs at the same physical facilities in the same city can be attending a top 10 university on one list, and a college below the top 25 on another.)</p>
<p>This never made any sense to me, either (Barnard's ranking). </p>
<p>Isn't the bottom line here supposed to be <em>quality of education</em>? </p>
<p>The academic experience for Barnard students is indeed almost indistinguishable on a day-to-day basis from that of Columbia students. They enjoy the same access to faculty as well as to libraries, research facilities, and other resources. </p>
<p>I would guess that Barnard's faculty resources score would be lower than Columbia's, because only Barnard faculty (who also teach Columbia courses) are included, whereas Columbia's faculty resources score reflects the strength of the entire university. It's still one of the most selective LACs--more selective than schools like Grinnell, Wellesley, Macalester, and Carleton--despite the limited applicant pool.</p>
<p>Columbia's handling of the rankings is completely unethical, in my view. This is a stark indicator of that.</p>
<p>
[quote]
US News can use whatever methodology it wants, but it can't rewrite the dictionary.
[/quote]
</p>
<p>Calmom, that is why I added the caveat in my first sentence, "Unfortunately, when it comes to USNEWS, this is mixing apples and oranges."</p>
<p>Since this thread was about the USNews rankings, I solely looked at the definition used by USNews, when I wrote, "The selectivity index comprises three components: admission ratio, percentage of top 10%, and SAT scores. In the USNews report, SLC's selectivity index is absolutely correct." </p>
<p>Lastly, I also added a final caveat in the form of "I understand that the term "selectivity" might not please everyone, but the USNEWS uses it to clearly represent ONE thing: the selectivity of admissions."</p>
<p>It is a given that USNews uses terms that are questionable, starting with the ridiculous "Best" colleges and the intimation that their methodology has become a magic measuring stick for ... quality. </p>
<p>However, can we really blame USNews for its choice of the term "selectivity," especially since it really means "the state or quality of being selective"? And how could we define "selective" differently from "of or characterized by selection; discriminating; empowered or tending to select," or "tending to select; characterized by careful choice; characterized by very careful or fastidious selection," as in "the school was very selective in its admissions"? </p>
<p>Maybe we all read way too much into a simple word.</p>
<p>
[quote]
The academic experience for Barnard students is indeed almost indistinguishable on a day-to-day basis from that of Columbia students. They enjoy the same access to faculty as well as to libraries, research facilities, and other resources.
[/quote]
</p>
<p>It seems that being compared to its peers in the LAC category costs Barnard a good number of points, because that category is a lot less forgiving when it comes to faculty resources, class sizes, etc. </p>
<p>It so happens that classes over 50 students are not so common in LAC's; hence Barnard earns a lower comparative rank. Barnard would probably earn a better ranking among the national universities. </p>
<p>I'm not sure why this is so surprising!</p>
<p>It's USNews' right to take a bunch of numbers (admit rate, SAT scores, % from top 10%), plug them into an arbitrary formula, and call the result X. It looks better in a magazine if they call it "selectivity." It's up to the <em>reader</em> to decide how well USNews' definition corresponds to his/her own definition of the word.</p>
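<p>To make that concrete, here is a minimal sketch of what such a composite "selectivity" formula might look like. The weights and normalization below are hypothetical -- nothing in this thread establishes USNews' exact formula -- so treat it purely as an illustration of how an index built from admit rate, SAT scores, and top-10% share can reward a self-selecting applicant pool:
[code]
# Hypothetical composite selectivity index (Python sketch).
# The weights (w_admit, w_sat, w_top) are assumptions, NOT USNews' actual values.

def selectivity_index(admit_rate, avg_sat, pct_top_tenth,
                      w_admit=0.10, w_sat=0.50, w_top=0.40):
    """Combine three inputs into a single 0-100 score.

    admit_rate    -- fraction of applicants admitted (0.0-1.0)
    avg_sat       -- average SAT of entering freshmen (400-1600 scale)
    pct_top_tenth -- fraction of freshmen from the top 10% of their HS class
    """
    admit_score = 1.0 - admit_rate                # lower admit rate -> higher score
    sat_score = (avg_sat - 400) / 1200.0          # rescale 400-1600 to 0-1
    top_score = pct_top_tenth
    combined = w_admit * admit_score + w_sat * sat_score + w_top * top_score
    return round(100 * combined, 1)

# A self-selecting pool (high admit rate, strong stats) vs. a school that is
# merely hard to get into (low admit rate, weaker stats):
print(selectivity_index(admit_rate=0.46, avg_sat=1400, pct_top_tenth=0.75))  # 77.1
print(selectivity_index(admit_rate=0.15, avg_sat=1150, pct_top_tenth=0.40))  # 55.8
[/code]
Note that under these made-up weights, the 46%-admit school scores well above the 15%-admit school -- exactly the University of Chicago effect the Princeton Review quote describes.</p>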
<p>Xiggi,</p>
<p>Your post--which states the truth--is just more support for the notion that the ranking system is flawed. </p>
<p>This segues dangerously into the Barnard/Columbia debate, but it also supports the case for Columbia to just swallow its pride and absorb Barnard as an undergraduate college, because it really doesn't fit into either major category. I understand that it's not as simple as that, but boiling it down to its basic parts, it seems like the most logical course of action. </p>
<p>OK, I'll step off my soapbox now, as I've probably said too much already.</p>
<p>I don't want to revive the Barnard/Columbia debate either -- Barnard does not want to be "absorbed" into Columbia in any case -- the schools have a unique partnership that is mutually beneficial. But the point is that the "rank" of either is quite arbitrary. </p>
<p>The GS population is also unique -- I think that Columbia is the only Ivy-caliber school that offers a selective program for nontraditional students that is so strongly integrated into its undergraduate program -- it really is quite different from the Harvard Extension. (For what it's worth, my d. says that the smartest students in her classes are the GS students; of course, those students come in with greater life experience and maturity... an SAT test taken at age 17 probably doesn't say much one way or another about the abilities of a 25-year-old.) </p>
<p>I don't think any of the Columbia U schools should be "absorbed" into another -- I think one of the very special qualities about Columbia is that it offers a lot of alternatives, that its courses are open to all the students enrolled at its various schools, and that it is extremely flexible about enrollment. Other than normal prerequisites, courses at all levels are generally open to students at all levels -- and I think that benefits all the students there, simply because exposure to ideas is broader when the range of ages and experiences of the students is a little wider. </p>
<p>But this is entirely OT to the rankings... so I'll leave it there for now. The main point is that the rankings are arbitrary and don't tell much about the school.</p>