<p><em>“What we found is that these rankings are kind of arbitrary,” Pachter says. “If you care more about student-faculty ratios than alumni giving, you’re going to get a different ranking. It’s very biased to give only one view.” The pair argue that the magazine should release several different rankings, based on choices of a few representative sets of priorities.</em></p>
<p><em>“But that doesn’t sell magazines,” says Kevin Rask, an economist at Wake Forest University in Winston-Salem, N.C., who has studied the impact of the U.S. News rankings. “People want to see who’s Number One and who’s Number Two, we want to see who’s going up and who’s going down.” The study shows nicely, he says, how that interest can be at odds with a true evaluation of quality.</em></p>
<p>There is no way for anyone here to talk about the actual quality of professors once the door is closed and the class goes on. Far too many individual professors, far too many different schools to compare, and no one who has sat in enough of these rooms to make any sort of even qualitative guess as to the actual quality of teaching.</p>
<p>Really, the best you can get is a series of questions relating mostly to policy. Which public schools have demonstrated unparalleled focus on undergraduate teaching and on improving undergraduate teaching? Where does undergraduate instruction, and instruction in general, mean the most for tenure? Where is there an abundance of support for teachers throughout their careers as they develop? Where are faculty mentorship programs used effectively to help younger professors become acculturated and to give feedback on teaching? Where have resources been allocated for curricular and pedagogical development? How do students feel about their own learning experiences (a loaded question because student population's expectations are not consistent and are partially formed from their experience)? How is student feedback collected and how does the university respond to this feedback?</p>
<p>The only thing that can be objectively discussed is commitment to strong undergraduate instruction, because truthfully, after all of these things, you still may not be able to attract the best teachers or "create" the best teachers. For ten years you may be stellar in this area, and then new hires and retirement patterns result in a great decline. You may have more excellent teachers than anyone else, but more horrific ones as well. It's really impossible to measure actual teaching quality objectively across such a huge system with any data currently available.</p>
<p>EDIT:</p>
<p>I don't want to get into the "value of rankings" discussion (none, in almost all cases), but it's ridiculous to think that the USNWR has figured out an end-all, meaningful formula to assess college/university quality. Especially because their model was specifically designed to produce results which, by and large, match popular perception, combined with the ability to tinker so that there are slight changes every year no matter what, which sells more magazines. The problem with rankings in general, and especially commercial rankings, comes well before all of the major methodological issues. The first hurdle, and perhaps the most daunting, is realizing that no ranking will be accepted with any kind of authority by the general public unless it closely aligns with existing perception. The validity of rankings is constantly challenged by the layperson who thinks, "Well, if HYPSM are not at the top, they must be measuring things wrong. They're unquestionably recognized as the best." Whether they are or aren't, I have no idea. I only have experience with a limited number of schools, but right away, the rankings are always going to be flawed when they're being created for a reader who will compare perception to the results and expect a reasonable match to confirm the methodology's merit.</p>
<p>How about you can't learn how to be a great teacher? I've had great, amazing teachers in my life and none of them had to get their degree at fancy, expensive private schools or even big state flagships.</p>
<p>90% of being a good teacher is about having a natural ability to teach, along with being able to react appropriately to sudden changes without really thinking. You must also have a good, caring heart that will make it "not just a job, but a mission." The worst teachers are like the worst nurses; they only do it to have a job.</p>
<p>The best indication of quality of teaching is the US News over- and under-performance graduation rate percent. If students don't like the teaching, they leave in numbers greater than you would expect given the quality of the student body.</p>
<p>Excellent article about that truth. All of these reasons are why quality cannot be measured in any dependable way and is difficult to either qualify or quantify-- we know good teaching when we see it but it's impossible to see enough of teaching at enough places to make any meaningful comparisons. The best you can do is look for where there is an environment and culture of support, a place with qualities that make it where good teachers would want to be.</p>
<p>Nonsense. There are so many other things that factor into these numbers to a greater degree, things that have nothing to do with quality of instruction, that this comparison would be specious at best.</p>
<p>modestmelody-
The over-under performance is an indication of how well the school does with the students it gets. Most of that has to be the faculty. Some of it might be grading standards or difficulty of the curriculum but that is also faculty-related.</p>
<p>Sure, some students might leave because they don't like the dorms but I don't think that is a major factor.</p>
<p>
[quote]
The best indication of quality of teaching is the US News over- and under-performance graduation rate percent.
[/quote]
</p>
<p>Graduation rate is highly dependent upon what majors your undergrads choose. Humanities majors are much more likely to graduate than engineering majors, for example.</p>
<p>Majors, study abroad, policies about leave, internship and co-op policies, ease of switching majors, curricular difficulties, financial support, fifth year programs that delay graduation but not in a negative way, difference in expectations versus reality, number of older students, supply of night classes for people who have to work, medical leave policies...</p>
<p>There are about a billion things that graduation rate X years out cannot really account for. Teaching quality is far from the deciding factor in people staying or leaving.</p>
<p>But, modestmelody, why would those factors you named not average out among schools? Why would one school have +5 overperformance and another school -5 underperformance? Academic issues are the number one reason for dropping out by far.</p>
<p>Perhaps the most important point is that almost all schools fall within +5 to -5 percent of expectation. That tells me that the quality of teaching is not bad enough anywhere to make more than a 10% swing in graduation rate (at most). And, the percent variation in graduation rate due to quality of instruction is probably less than 10% because other things affect grad rate, as modestmelody points out.</p>
<p>Collegehelp, why would they average out? That doesn't make any sense to me. You're making an assumption I'm not seeing the basis for at all. Why would those things not all affect an individual as much as, if not more than, teaching? Where is the evidence that people are leaving because they think the teachers are bad? If academic reasons are cited somewhere as the most common, why is it not the graduation requirements, flexibility of course scheduling, or even that college is too hard, rather than that the teachers are not good enough, that makes students leave?</p>
<p>I don't see where you can leap to not graduating means teachers are bad and graduating means teachers are good. That's a ridiculous simplification as far as I can tell.</p>
<p>I'm not sure how this question can be answered unless the data is collected from students who took similar classes at two different schools. Even then how would a student who transferred from say, Penn St. to Colgate, be able to compare the intro classes taken at Penn St. to the upper division classes taken at Colgate?</p>
<p>You'd really have to have students who took a few intro, intermediate and advanced classes in the same subject area at two different schools.</p>
<p>I agree with modestmelody on the over under issue. State schools typically have a much higher % of working students, which affects graduation percentage 10x as much as teacher quality, by my estimation. Then grading policies. When Brown's <em>average</em> gpa is 3.55, and Berkeley's is 3.2, you know there's a huge cohort at Berkeley struggling at 2.1 where Brown's struggling students are at 2.5. Who is going to quit? Does that have anything to do with the quality of the classroom teaching experience?</p>
<p>nysmile -- do you mean "teachers college"? I think that refers to non-research-oriented public universities under state charter to produce teachers for their schools. Could be wrong about that, but I thought in CA the Cal State universities had this mission, whereas the UCs had the mission of research and leadership.</p>
<p>DunninLA-
US News actually uses a separate formula for public universities, private universities, and LACs when calculating their over- and underperformance numbers. </p>
<p>modestmelody-
It actually is pretty simple:
college = students + teachers
college - students = teachers
The over- underperformance number subtracts students from the formula.</p>
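<p>For what it's worth, the "subtract the students" idea can be sketched as a regression residual: predict each school's graduation rate from student quality alone, then treat the leftover (actual minus predicted) as over/underperformance. This is a minimal illustrative sketch with made-up numbers, not US News's actual formula (which is not public here):</p>

```python
# Hypothetical sketch: over/underperformance as a regression residual.
# All school names and numbers below are made up for illustration.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (no external libraries)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# (made-up) median SAT and actual 6-year graduation rate, in percent
schools = {
    "A": (1450, 95),
    "B": (1300, 88),
    "C": (1300, 78),
    "D": (1150, 70),
}

sats = [sat for sat, _ in schools.values()]
grads = [grad for _, grad in schools.values()]
a, b = fit_line(sats, grads)

for name, (sat, grad) in schools.items():
    predicted = a + b * sat          # grad rate expected from student body alone
    residual = grad - predicted      # the "over/underperformance" number
    print(f"{name}: actual {grad}%, predicted {predicted:.1f}%, "
          f"over/under {residual:+.2f}")
```

Note that schools B and C have identical (made-up) student quality but different outcomes, so one lands well above the fitted line and one well below, which is exactly the +5/-5 pattern being debated. Whether that residual is really "teachers" or a mix of everything else modestmelody listed is the open question.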
<p>If someone told me they were having trouble scheduling a course and so they were going to drop out, I would think it was BS and that there was some other real reason. Same goes for most of the other things you named.</p>
<p>Faculty affect student satisfaction more than any other factor, aside from characteristics of the students themselves.</p>
<p>The point I am trying to make is that the quality of instruction is probably best where the students are the best and, after you control for student quality, the quality of instruction is about the same everywhere.</p>
<p>DunninLA--you're correct. Teacher's College or Normal School. The SUNY schools are still very popular with New York kids who want to go into education. They offer a 5-year BA/Master's Program in Education.</p>
<p>I categorically disagree with this assertion, but I've listed all the factors I think would affect these numbers and I'll let others decide how they fall on the issue.</p>
<p>DunninLA is correct about working students, although on the GPA argument, I think there is far, far more nuance. For instance, Brown not having +/- tends to be a big part of the grade inflation here-- B+ and C+ are often bumped up depending on how the distribution is around that student's grades.</p>