Well, OK. But if many measurements (from many data sources) all indicate high academic quality, and if those measurements seem to corroborate not only each other but also one’s own personal experience (as well as other anecdotal reports and professional peer assessments), then I think the burden of proof begins to shift to the critics. So far, @nrtlax33, your best counter-evidence seems to be the WSJ/THE “engagement” scores. I’m a little skeptical of those engagement scores, for reasons I suggested in post #477. WSJ/THE evidently generates a score based on as few as 50 responses. You yourself took issue with the Chicago Maroon article I cited above because it relied on only 544 responses. Also, my impression is that the questions are poorly framed. If a Dordt College student and a Harvard student both claim their colleges promote “critical thinking”, how do we know they’re both honestly applying the same standards?
If we’re trying to measure (per WSJ/THE) “the extent to which students immerse themselves in the intellectual and social life of their college,” then I have to say, I’m surprised UChicago (along with Williams, Princeton, Kenyon, Bryn Mawr, William & Mary, Amherst, Caltech, Grinnell …) would get a low score (especially w.r.t. the “intellectual” life). For something like professor-student meetings outside of class, yes, I can believe UChicago might score low. As for quality-of-life issues, USNWR doesn’t really even purport to measure them, but I’ll concede that most HS students would prefer the QOL at many other colleges.
@tk21769: I believe the beauty of WSJ/THE’s rankings is that if you suspect some schools are playing games, you can do your own research and make your own judgment. USNWR is like a religion, or the Soup Nazi: schools will be punished if they don’t like it. They do not make the data available for inspection; instead they give you unverifiable results. It might be a total fraud from day one; you just don’t know. BTW, @tk21769, I saw someone post a video on YouTube saying she is stuDYING there. I thought it was pretty funny.
@nrtlax33
Thank you for the link. While some of the criticism in this 2000 article is still valid, many points in it have been addressed or simply no longer apply.
My main criticism of this and many other rankings is that they all seem to measure how closely schools resemble the ones that many people already believe should come out on top.
LACs are strong in undergraduate teaching, but they are also quite small. If a school has three times as many smart students, the campus conversation is more likely to be vibrant. There is really no “perfect” school out there. It is very important to weigh each person’s priorities and seek balance. WSJ/THE ran a survey without a preset conclusion, unlike USNWR. You might not like the results, but they are what they are. I suspect that in the future some schools might start trying to game the surveys, but the firm they are using is a big-name player in the field, so they should be aware of this possibility.
“I have to say, I’m surprised UChicago (along with Williams, Princeton, Kenyon, Bryn Mawr, William & Mary, Amherst, Caltech, Grinnell …) would get a low score (especially w.r.t. the ‘intellectual’ life).”
I still don’t get how this is easier to do with the WSJ/THE rankings (or supporting data) than with the USNWR rankings (or supporting data). Suppose for example you suspect that some schools are playing games with their S:F ratios (which is one of the numbers USNWR uses). You can go into the Common Data Sets, section I, and examine whether a college is or is not including graduate students in the “S”. If you decide (as I have) that there are significant discrepancies, then you can estimate (from the factor weightings) how much this might be affecting the overall ranking. Then you have to decide whether an issue like this poisons the well for all surrounding data.
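To make that concrete, here is a rough back-of-the-envelope sketch. Every number in it is a hypothetical stand-in, not an actual CDS or USNWR figure, and the weighting scheme is assumed purely for illustration:

```python
# Back-of-the-envelope: how much could excluding graduate students from
# the "S" in the S:F ratio move an overall score?  Every number here is
# hypothetical, for illustration only.

def student_faculty_ratio(undergrads, grads, faculty, include_grads):
    """S:F ratio, with or without graduate students counted in the 'S'."""
    students = undergrads + (grads if include_grads else 0)
    return students / faculty

undergrads, grads, faculty = 6000, 3000, 900

honest = student_faculty_ratio(undergrads, grads, faculty, include_grads=True)
gamed = student_faculty_ratio(undergrads, grads, faculty, include_grads=False)
print(f"S:F with grads: {honest:.1f}:1, without: {gamed:.1f}:1")

# Assume (hypothetically) the ratio is worth 1% of a 100-point overall
# score, scaled linearly between a best case of 3:1 and a worst of 20:1.
WEIGHT, BEST, WORST = 0.01, 3.0, 20.0

def sf_points(ratio):
    frac = (WORST - ratio) / (WORST - BEST)  # 1.0 = best, 0.0 = worst
    return WEIGHT * 100 * max(0.0, min(1.0, frac))

print(f"Score shift from the discrepancy: {sf_points(gamed) - sf_points(honest):+.2f} points")
```

A shift of a fraction of a point sounds small, but near the top of these rankings, where overall scores are tightly bunched, it could plausibly move a school a place or two.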
As for the WSJ/THE expenditure numbers, for any given college(s), I can go into IPEDS and pull Instructional Spending per FTE Student. But then I’m pretty much at a dead end. I would not know how to verify whether $85K is a reliable number for UChicago or $111K is reliable for Yale. About the best I know how to do is to look at some of the cost drivers (such as class sizes and faculty salaries) and decide whether it’s plausible that these schools have much higher ISPS numbers than some other colleges.
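For what it’s worth, here is the kind of crude plausibility check I mean, as a sketch; all inputs are assumptions I made up, not IPEDS data:

```python
# Crude plausibility check for Instructional Spending per FTE Student
# (ISPS).  All inputs below are assumed stand-ins, not IPEDS figures.

avg_faculty_comp = 200_000   # salary + benefits per faculty FTE (assumed)
students_per_faculty = 6.0   # student FTEs per faculty FTE (assumed)
overhead_multiple = 2.0      # non-salary instructional costs, as a multiple (assumed)

implied_isps = avg_faculty_comp / students_per_faculty * overhead_multiple
print(f"Implied ISPS: ${implied_isps:,.0f} per FTE student")
# ~$67,000 under these assumptions.  If a school reports $111K, either
# its cost drivers (salaries, ratios, overhead) are much richer, or the
# allocation of shared costs (e.g., medical-school faculty) is inflating
# the figure -- which is exactly what I can't verify from the outside.
```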
I believe (maybe naively) that it is in a college’s self-interest, properly understood, to compile accurate, appropriate performance metrics. As the saying often attributed to Lord Kelvin goes, “If you can’t measure it, you can’t improve it.” That said, any particular number might be subject to confounding factors, errors, or misinterpreted instructions. Yes, sometimes it might even result from outright fraud.
^Unfortunately, Common Data Sets are not public unless a college chooses to publish its own. The rankings companies should make all the underlying data available for inspection.
IPEDS is public, but it doesn’t have all the data that the CDS does.
They gathered responses from as few as 50 students per college. Apparently each response consists of a 1-10 rating for each of these four questions:
to what extent does the student’s college or university support critical thinking? For example, developing new concepts or evaluating different points of view;
to what extent does the teaching support reflection on, or making connections among, the things that the student has learned? For example, combining ideas from different lessons to complete a task;
to what extent does the teaching support applying the student’s learning to the real world? For example, taking study excursions to see concepts in action;
to what extent did the classes taken in college challenge the student? For example, presenting new ways of thinking to challenge assumptions or values.
These questions strike me as completely subjective, and probably impossible to normalize across colleges.
How many students simply gave a “10” response to every question?
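Even setting aside honesty and calibration, 50 responses is not many. A quick sketch of the sampling error alone (the standard deviation here is an assumption; I don’t know the actual spread of responses):

```python
# Sampling error in a mean 1-10 rating estimated from only n = 50
# responses.  The SD of 2.0 is an assumption, not a reported figure.
import math

n, assumed_sd = 50, 2.0
se = assumed_sd / math.sqrt(n)    # standard error of the mean
half_width = 1.96 * se            # half-width of an ~95% confidence interval
print(f"95% CI on the mean: +/- {half_width:.2f} points on a 1-10 scale")
```

That works out to roughly +/- 0.55 points, which is easily enough noise to reshuffle schools whose true means differ by a few tenths of a point, before we even get to response bias or differing standards across campuses.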
I think there’s evidence of the existence of what I’ll call “Nondorf’s Law”: as the length of a CC thread increases, the probability of UChicago being mentioned (usually in a derogatory manner) approaches one.
suzyQ7, sometimes information about a school is available but not obviously so. Once you are on the website of a school you are interested in, use the search field if necessary (although most schools list this as a department or link) to find “Institutional Research”. That is usually a gold mine for accessing more information about a school. There you will see things like the school’s accreditation reports, sometimes even the reviewers’ comments and the school’s replies, and more specific information that you might not find elsewhere. This is especially true for schools that are data-driven: you can often find things like admission rates by planned major. It is often a great resource.
@tk21769 CLEARLY NSSE’s survey is better than WSJ’s, and it actually has a huge and comprehensive dataset. But, for its own pretty good reasons, NSSE will not let it be used in any rankings.
There are other student surveys besides the WSJ/THE and NSSE surveys.
Niche has one.
For UChicago,
94% of responding students “say professors are passionate about the topics they teach”;
90% “say professors care about their students’ success”;
85% “say professors are engaging and easy to understand”;
90% “agree professors are approachable and helpful when needed”;
87% “agree that professors put a lot of effort into teaching their classes”;
68% “agree that it is easy to get the classes they want”;
30% “agree that the workload is easy to manage”.
Response numbers for the above topics range from 68 to 76.
That’s not a lot (but is more than the WSJ/THE minimums).
The number of responses and the percentages are roughly similar to Brown’s.
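As a rough sense of the precision at that sample size (a normal-approximation sketch, using the 90% figure quoted above and n = 70):

```python
# How precise is a reported percentage based on ~70 responses?
# Normal approximation to the binomial; p is one of the Niche
# percentages quoted above.
import math

n, p = 70, 0.90  # e.g., 90% "say professors care about their students' success"
se = math.sqrt(p * (1 - p) / n)
print(f"95% CI: {p:.0%} +/- {1.96 * se:.1%}")
```

That comes to about 90% +/- 7 percentage points, so many college-to-college differences in these numbers could be within the noise.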
I have the same issues with this survey as I do with the WSJ/THE survey questions.
When two different students say professors are engaging/approachable/passionate, how do we know they both mean the same thing and apply the same standards? The survey concepts need to be operationalized so they don’t call for speculation.
Nevertheless I think this kind of information, in principle, is a good complement to what USNWR and Forbes track.
Some of the questions they ask are like “If you could start over, would you still choose this college?” “Does your college provide an environment where you feel you are surrounded by exceptional students who inspire and motivate you?” “Do you think your college will be worth what you and your family are paying?” etc.
I do believe most students are able to adjust their expectations and survive the environment they are in. Those who do not realize that happiness is a choice, and not a consequence of circumstance, are hopping onto a roller coaster that they will never get off.
@tk21769 Unfortunately, I don’t think university rankings give much credit to schools that actually do emphasize undergraduate teaching and education. USNWR has had an undergraduate teaching ranking, and as I recall, a number of the schools on it are ones I might cite myself, but recently a few have crept on that make me wonder. That aside, it is a separate ranking and is not used in the calculation of the national university rankings. Part of it almost seems like rating a car based on the size of the engine, etc., without seeing what it is like on the road. I understand the WSJ/THE includes some survey data, but I have not gone through it in detail. Some of it seems out of line for the schools mentioned (UChicago, Williams, Princeton, Kenyon, Bryn Mawr, William & Mary, Amherst, Caltech, Grinnell) compared to what you see in Princeton Review and Niche, which are also based on surveys.