Rigorous small LAC near/in a city

<p>I have to say, I think Barnard would be perfect (I may be biased, applying there ED). I wanted a school in a city, but one with a campus, and Barnard fits that. Barnard is not numbers-oriented at all, and essays are much more important. Its median SAT is only about a 2050, so a 198 PSAT is right around there. I had a 204 PSAT in junior year and ended up with a 2300 SAT, so you never know! </p>

<p>As more of a safety/match, Providence College might be good. I know someone who goes there and she loves it; she can’t say enough good things.</p>

<p>Thanks everyone! Barnard is on the list, it’s actually less interesting to her than NYU Gallatin at this point (maybe because Grandma was a Barnard girl and talks about it nonstop) but I’d be thrilled to have her go there. (And good luck, Treehugga, awesome school!)</p>

<p>TK, I liked your reasoning and I did find that top-100 LAC list to be interesting if only as a starting point.</p>

<p>^ That’s all it is, a starting point. annasdad frequently comments that all rankings (USNWR in particular) are meaningless and arbitrary. I appreciate his skepticism, but in my opinion he’s being overly cynical. US News uses criteria (like class size, faculty compensation, and student qualifications) that aren’t meaningless. The weights are of course somewhat arbitrary, so it does not make sense to use the rankings as a precise decision-making tool. I’d say the LACs it ranks in the 30s, for example (Kenyon, Trinity, Whitman, etc.), are in fact good but less selective alternatives to Amherst or Haverford.</p>

<p>None of those criteria except class size have any evidence to support the claim that they have ANYTHING to do with the quality of undergraduate education at a college. And, as has been discussed many times on CC, the measurement and reporting of class size make even that metric effectively meaningless.</p>

<p>Again, here’s my evidence:
if the USNWR criteria were completely meaningless and arbitrary, then they wouldn’t be replicable. We’d see open-admission schools we’ve never heard of jumbled up randomly with the most competitive and popular schools. But in fact, there is considerable overlap between the top schools chosen by USNWR and the top schools chosen by other rankings using different criteria. That was my point in the exercise above using the tool you recommended. I could easily replicate more than half the schools in the USNWR LAC list. The 7 that did not intersect were, from my perspective, inappropriate picks, and that list missed many appropriate picks that the USNWR list contains. So the USNWR LAC rankings would have been a better starting point for my hypothetical searcher. Your mileage may vary (especially if you emphasize highly personal “fit” criteria).</p>
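<p>To put a number on “considerable overlap,” here’s a rough sketch of the kind of comparison I mean. The school names are hypothetical placeholders, not the actual published lists:</p>

```python
# Rough sketch: quantify the overlap between two top-10 lists.
# Both lists are hypothetical placeholders, NOT actual published rankings.
usnwr_top10 = {"School A", "School B", "School C", "School D", "School E",
               "School F", "School G", "School H", "School I", "School J"}
other_top10 = {"School A", "School B", "School C", "School D", "School E",
               "School F", "School G", "School K", "School L", "School M"}

shared = usnwr_top10 & other_top10
print(f"{len(shared)} of 10 schools appear on both lists "
      f"({len(shared) / 10:.0%} overlap)")

# If each list were a random draw from ~200 ranked LACs, the expected
# overlap would be only 10 * 10 / 200 = 0.5 schools, not 7.
```

<p>The point is simply that 70% agreement between two independently constructed lists is wildly unlikely under the “completely arbitrary” hypothesis.</p>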

<p>Here’s how I think the USNWR system works. Mr. Morse started with a pre-conceived notion of what he thought the “best” schools were (the Ivies, the Little Three … all the usual suspects). Then he tried to come up with a basket of objective metrics that would accurately reproduce that set of schools, plus similar schools. He came up with criteria that do just that. Is this not just reinforcing his own tastes and biases? Yes, it is. But the fact is that many people share the same tastes and biases. </p>

<p>The same modeling process can be applied to choosing other things besides colleges, with the same strengths and weaknesses. The choice of criteria cannot be precisely scientific, because that is not the nature of the choice. It involves human judgement. In my judgement, faculty compensation, endowment size, etc., all have some correlation with college quality. I don’t know how to prove that … and I admit, if you put Socrates at one end of a log with any willing, well-prepared student, you may well get an excellent learning experience despite what the USNWR metrics would tell you. Unfortunately, I don’t know how to find Socrates in the haystack of 3,000+ colleges and universities.</p>
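<p>If it helps to see the mechanics: a ranking of this kind is just a weighted sum of normalized metrics. Here is a minimal sketch; the metrics, values, and weights are all invented for illustration and are not USNWR’s actual formula:</p>

```python
# Minimal sketch of a rankings-style composite score.
# Metrics, values, and weights are invented for illustration;
# this is NOT USNWR's actual formula.
schools = {
    # name:      (% classes under 20, avg faculty pay in $K, 6-yr grad rate)
    "College A": (0.72, 140, 0.93),
    "College B": (0.55, 110, 0.85),
    "College C": (0.40,  95, 0.78),
}
weights = (0.30, 0.30, 0.40)  # a human judgement call -- exactly the point

def normalize(column):
    """Rescale one metric to 0..1 so different units can be combined."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

columns = list(zip(*schools.values()))          # one tuple per metric
rows = zip(*(normalize(c) for c in columns))    # back to per-school rows
scores = {name: sum(w * m for w, m in zip(weights, row))
          for name, row in zip(schools, rows)}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
```

<p>Change the weights and the order can shift; that is the arbitrary part. But no plausible weighting of inputs like these will rank an open-admission school we’ve never heard of above the usual suspects.</p>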

<p>I second the Macalester and Barnard suggestions. I also agree with tk21769. If the USNWR rankings were so arbitrary, they wouldn’t match up so well with almost every other statistical study of American universities out there. If the WSJ Feeder Rankings, USNWR, Times Higher Education, the NRC, and Payscale all agree that Harvard is at the very least top 5 in their rankings, it’s almost statistically impossible that Harvard isn’t a top-5 American university.</p>

<p>You essentially have a multitude of surveys all coming up with the same conclusion about the pecking order of American universities. They can’t all be lying.</p>


<p>Ah yes, statistics.</p>

<p>There’s an old saying - perhaps you’ve heard it - “garbage in, garbage out.” If the criteria used to establish the rankings are basically the same parameters, and are mostly garbage, then they can agree with each other ’til the cows come home, and all you have is garbage.</p>

<p>If you want to base your college decisions on what high school guidance counselors think are the best colleges, that is, of course, your business.</p>

<p>Someone mentioned Reed before, and I’d like to second that recommendation. It’s really not far from the city (within biking distance; it IS in Portland, after all), but far enough away that you can still appreciate the beauty of the campus. It has maybe a 40% acceptance rate, but that’s mostly because the applicant pool is self-selecting.</p>

<p>OP, if your D is at all interested in a Great Books curriculum, there’s St. John’s in Annapolis. Also in the Baltimore area, though not as rigorous as the schools you’ve already listed, is Goucher.</p>


<p>To some extent they are similar. In that case, you have to ask why different ranking designers (and apparently, consumers) agree that those criteria are significant. Of course it’s always possible that the crowd is wrong. However, I think the burden is on the naysayers to show it is possible to build and run high-quality colleges without small classes, well-paid faculty with advanced degrees, selective admissions, and high graduation rates.</p>

<p>Besides, there are some very significant differences among rankings. The Washington Monthly criteria are very different from the USNWR criteria. As a result, they rank some schools (like Berea, Morehouse, and Spelman) much higher than USNWR does, due to WM’s emphasis on “social mobility” and “service” (and zero emphasis on selectivity). Nevertheless, there is still considerable overlap. Both rankings rate Swarthmore, Amherst, Pomona, Williams, Harvey Mudd, Wesleyan, Carleton, and Haverford very highly.</p>

<p>Why, if they are applying such different yet equally meaningless and arbitrary criteria? I think it comes down to money. Whatever your standard of excellence, running a successful college at any significant scale usually takes money. The richest schools have the wherewithal to deliver an excellent product, somewhat regardless of how you measure quality.</p>


<p>Because (1) they make money selling rankings (or publications that publish rankings), (2) there is no valid way to rank colleges by the quality of education delivered, at least not with current publicly available data, and (3) they therefore pick things that sound good.</p>


<p>If that money is being spent on undergraduate education. Know any rankings that measure that?</p>

<p>Are you suggesting that the rankers simply toss in some famous names at the top to make their results seem plausible? If that were true, then their results, using their formulas, would not be repeatable. But they are. </p>


<p>There’s a problem of definition here. What do we count as “money spent on undergraduate education”? Financial aid? Building a new science center? Hiring a gourmet food service to deliver wood-fired pizza and sushi? The richest schools can afford to do it all, but they wouldn’t stay rich for too many years if they did not spend money in ways perceived as beneficial to undergraduate education. </p>

<p>USNWR’s “faculty resources” is one indicator of institutional wealth. It covers faculty compensation, percent of faculty that is full time, percent of faculty with terminal degrees, class size, and student-faculty ratio. It is expensive to perform well against these measures. Now, there is no guarantee that (in the context of a research university, not a LAC) all such investment directly and fully benefits undergraduates. On balance, I think it is likely to raise the overall quality of research and instruction. Certainly, spending less on these factors is not a plausible formula for excellence. You could choose various other proxies (such as endowment per student, library size, or per capita research expenditures) and still predict similar sets of top-ranking schools.</p>
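<p>As a quick illustration of that last sentence, you could rank the same schools by two different wealth proxies and compare the orderings. All figures below are invented placeholders, not real institutional data:</p>

```python
# Sketch: do two different wealth proxies produce similar orderings?
# All figures are invented placeholders, not real institutional data.
endowment_per_student = {"A": 2.1, "B": 1.4, "C": 0.9, "D": 0.5, "E": 0.2}  # $M
avg_faculty_pay       = {"A": 155, "B": 140, "C": 125, "D": 110, "E": 118}  # $K

def order(metric):
    """Schools sorted from highest to lowest on one metric."""
    return sorted(metric, key=metric.get, reverse=True)

r1, r2 = order(endowment_per_student), order(avg_faculty_pay)

# Spearman's rho on the two orderings: 1.0 means identical rank order.
n = len(r1)
d_squared = sum((r1.index(s) - r2.index(s)) ** 2 for s in r1)
rho = 1 - 6 * d_squared / (n * (n * n - 1))
print(f"top 3 by endowment: {r1[:3]}; top 3 by pay: {r2[:3]}; rho = {rho:.2f}")
```

<p>With numbers like these, the two proxies disagree a little toward the bottom but pick the same schools at the top, which is the pattern I’m describing: different wealth measures tend to predict similar sets of top-ranking schools.</p>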

<p>Anyway, I do hate to sound like a shill for some magazine. For those readers who share annasdad’s skepticism about rankings, maybe it would be helpful to suggest other resources. One alternative to metrics-driven rankings is a narrative guide. One I happen to like is Loren Pope’s <i>Colleges That Change Lives</i> (the book and web site). It lists many similar but less selective alternatives to schools like Amherst and Haverford.</p>

<p>Many of the CTCL colleges, by the way, coincide with schools on the USNWR LAC list, #50-100 or so :)</p>

<p>Appreciate both views here–and my d is using the info to compile a nice list! Thanks to all.</p>