<p>hawkette vs. hawkette</p>
<p>
It’s interesting that you choose to ignore the complete lack of objective basis for USNWR’s 1995 teaching excellence ranking, and incorporate it with 25% weight in what you call “Ranking What Counts”.</p>
<p>I’ve taken several of the same classes at Harvard, MIT, and Johns Hopkins.</p>
<p>I mean, I can see why economics classes might be better at Harvard or MIT… but on the whole, at least the classes that I’ve experienced have been pretty much the same… Harvard actually videotapes its Java programming lectures while MIT does not. Calculus 1/2 and Linear Algebra at Harvard and Hopkins are the same (they use the same Stewart and Otto Bretscher textbooks), and Java programming is the same at Harvard and MIT… Art of the Ancient Americas at Hopkins was only slightly different from, and IMHO equal to, Spanish American Art and Architecture: European and Islamic Elements at Harvard… I’ve never taken Multivariable at MIT or Hopkins (YET!) but I have at Harvard. I’ll be retaking it since I do not want to transfer the credits.</p>
<p>Based on my personal experience… Harvard shouldn’t be ranked that much higher than Hopkins or MIT. The differences in quality of instruction among the three universities are minimal, indistinguishable at best. Why? Nobody cares about undergraduate education at major research universities! I once sat in on a class with Gregory Mankiw (EC 10) alongside another famous Harvard professor. It was amazing compared to Macroeconomics with Louis Maccini at JHU. I have never taken an economics class at MIT so I am unable to comment on MIT’s economics offerings…</p>
<p>Undergraduate education across all the major research universities is regurgitation from the textbooks.</p>
<p>Hawkette, unless the statistical data provided by universities measuring the 4 magical criteria you insist are most important to determining the quality of undergraduate education are adjusted and adequately weighted, they are meaningless. As of now, such data has never been adequately cleaned and, as such, is completely useless.</p>
<p>Another thing, Hawkette: if you insist on using the commitment-to-teaching rating from the 1995 USNWR, you must by definition agree with the peer assessment rating, since the same people were responsible for both. The commitment-to-teaching rating supposedly measures commitment to instruction, whereas the peer assessment score supposedly measures the quality of overall undergraduate education. Both concepts are nebulous to be sure, but there you have it.</p>
<p>I think it’s a good idea to try and create rankings (for undergraduates) based on the actual undergraduate education. Unfortunately, as mentioned already, most of the data for teaching quality at the undergrad level is out of date. I also agree that there should be less focus on graduation and retention rates and more on faculty and possibly selectivity. </p>
<p>Hawkette, I appreciate your opinions on this board. It’s nice once in a while to read threads that don’t just talk about how Hopkins is actually as good as HYPSM (and it might be!) or how UC Berkeley deserves a higher ranking because its public status has held it down (and it does!), and yes I chose Hopkins and Berkeley because I know Phead and UCBchemE love to talk about their respective schools :P. </p>
<p>I also agree with a post about how it’s almost useless to try and rank these schools. I’m sure we can all agree that all of the ones listed here are the very best (in all fields!) and that trying to determine which one has a better engineering program is almost futile. Whenever you find yourself arguing why Hopkins deserves this ranking, or Berkeley that ranking, just remind yourself how lucky you are to have the privilege to attend such a school and how many thousands of others would love to take your place in a heartbeat.</p>
<p>In 1996/7:</p>
<p>-Computers ran Windows 95 (and Mac OS 7) at a blazing 200 MHz
-Cell phones were huge, heavy, and expensive
-We used CDs and Walkmen (CD burning was still a few years out, let alone iPods)
-Using AOL was socially acceptable
-The high school class of 2003 was on the verge of puberty
-Bill Clinton was reelected President</p>
<p>-And US News made the survey that is being cited now.</p>
<p>Things change.</p>
<p>midatlmom,
Thank you for your post and for recognizing that there are more than a few snags in the use of subjective assessments, all of which present problems in practice. That has been my longstanding contention and a major reason for my ongoing disapproval of the Peer Assessment rankings and my argument for separating them out from the USNWR rankings. </p>
<p>Thank you also for appreciating and documenting that I make a fair presentation of the Teaching results, even pointing out their shortcomings. I know the survey is not perfect and is outdated. I encourage students to use it as a starting point in investigating their prospective colleges, and as one of many sources in looking at how important teaching is to an institution vis-à-vis its peers.</p>
<p>^ Hawkette, your rankings seem to contradict your personal belief that the undergrad experience is much more than what goes on inside the classroom. Given your love of sports, do you really think schools such as Yeshiva, Rochester and NYU provide a better undergrad experience than Michigan, or other publics at the bottom of your list?</p>
<p>I know your response will be to list your favorite publics and privates…but these threads are deja vu.</p>
<p>Your rankings don’t support your previous statements. How can you stand by this ranking?</p>
<p>here-to-help,
In answer to your question about institutional resources and the availability of state funds to the publics, I think that the size and all of the sources of the financial resources are important, particularly as it relates to their sustainability. However, I think even more telling is how an institution chooses to spend its capital, regardless of where it came from. That often tells me more about the institution’s priorities and the actual impact that the institution’s prioritization will have on the typical undergraduate student.</p>
<p>Consider a typical college (it could be public or private) and ask how that school uses its resources. Then compare the answers with schools said to be in its peer group. Starting questions might include:</p>
<p>One can think of many more questions, but I would hope that you would agree that the answers to these questions would be indications of how strongly a college will dedicate its financial resources to assist undergraduates. If you do a little digging, I know that you will find that there are differences among even the most highly-ranked colleges… and sometimes the differences are sharp and highly consequential for the prospective undergraduate.</p>
<p>don’t you guys get it, she doesn’t care if it doesn’t make any sense, she’s just going to let this one die, and create another thread a week later tweaking a few percentages… how many of these ranking threads have we seen from hawkette?</p>
<p>Ok, I was worried you were just measuring financial resources in the bank, rather than where money is spent.</p>
<p>^ Hawkette is just using USNWR’s financial resources rank. She didn’t produce her own numbers for this purpose. USNWR financial resources rank is flawed since it takes into consideration spending per student… which seems natural, but it doesn’t attempt to separate graduate student spending. This measure favors campuses with medical schools. Also, this measure rewards frivolous spending. </p>
<p>There is a misconception in America that we need to throw more money at our problems, e.g. education and health care.</p>
<p>Indeed.
That’s not good at all.</p>
<p>kb,
Thanks for the constructive response. :rolleyes:</p>
<p>I encourage you to open your mind to this whole ranking discussion. A ranking is only as good as its methodology, and no single ranking will ever be the appropriate measuring stick for all students. By subtracting the Peer Assessment scoring and adding the Teaching scoring, I am hoping to spur discussion and thought about the actual experience that an UNDERGRADUATE will have on a college’s campus. My new “ranking” above is not automatically better or worse than the original USNWR, but for many students, it might be a more pertinent listing as it relates to the undergraduate environment that they will encounter. </p>
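<p>For the curious, the kind of reweighting described above is easy to sketch. This is a hypothetical illustration only, not USNWR’s actual formula: the component names, scores, and weights below are invented placeholders, and the only assumption taken from the thread is that Peer Assessment (25% of the score) is dropped and a Teaching score is substituted at the same 25% weight.</p>

```python
def reweighted_score(components, teaching_score, teaching_weight=0.25,
                     drop="peer_assessment"):
    """Drop one component and substitute a teaching score at a given weight.

    components: dict mapping component name -> (score, weight), with scores
    on a 0-100 scale and weights summing to 1.0.
    """
    kept = {name: sw for name, sw in components.items() if name != drop}
    # Rescale the kept weights so that, together with the new teaching
    # weight, the total weight is still 1.0.
    total_kept = sum(w for _, w in kept.values())
    scale = (1.0 - teaching_weight) / total_kept
    return teaching_score * teaching_weight + sum(
        s * w * scale for s, w in kept.values())

# Made-up example: a school strong on peer assessment but weaker on
# teaching sees its composite score drop under the substitution.
components = {
    "peer_assessment": (100, 0.25),
    "grad_rate": (80, 0.50),
    "faculty": (60, 0.25),
}
print(reweighted_score(components, teaching_score=40))  # -> 65.0
```

<p>The renormalization step matters: if the dropped component’s weight differs from the new one’s, the remaining weights are scaled so everything still sums to 1.0.</p>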
<p>As for the use of resources, I agree with ucb that the Financial Resources is a flawed measurement and not always the best indication of how a college’s resources are used to assist undergrads. I’d love to hear comment from you or others about how to best measure a college’s institutional resources and how they spend those resources for the explicit benefit of undergrads.</p>
<p>Personally, I’d be more interested in a value rating. Perhaps take the number of academic programs a university offers that are Rated #1-10 and then divide the total by the average out-of-pocket cost for undergraduates (for publics I guess you can use OOS cost). :)</p>
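<p>For what it’s worth, that value rating is a one-liner once you pick your inputs. A minimal sketch, with made-up numbers (the program counts and costs below are placeholders, not real data for any school):</p>

```python
def value_rating(top10_programs, avg_out_of_pocket):
    """Top-10-ranked programs per $10,000 of average out-of-pocket cost.

    For publics, pass the out-of-state cost, per the post above. Raw
    programs-per-dollar is a tiny number, so scale per $10,000 so the
    ratings are readable.
    """
    return top10_programs / (avg_out_of_pocket / 10_000)

# Hypothetical comparison: 12 top-10 programs at $30,000/yr versus
# 8 top-10 programs at $15,000/yr.
print(value_rating(12, 30_000))  # -> 4.0
print(value_rating(8, 15_000))   # higher value despite fewer top programs
```

<p>Note the obvious bias: cheap schools with even a handful of top-10 programs would dominate expensive ones, which may or may not be the intent.</p>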
<p>^ And I wonder which school would come out on top in that ranking? I’m sure you have absolutely no idea. :p</p>
<p>^ It’d be close - hard to say if we’d beat Stanford.</p>
<p>45, I’m sure Penn and Texas would do well too. ;)</p>
<p>Hawkette</p>
<p>I am somewhat upset by your willful misrepresentation of my position. For the record, here is what I actually believe:</p>
<ol>
<li><p>I believe that your position on the teaching excellence and peer assessment surveys is inconsistent and illogical. The same academics are being questioned in both surveys–either they are knowledgeable or they aren’t. You attempt to have it both ways, i.e. they know what they’re talking about when it comes to teaching excellence, but they don’t know what they’re talking about when it comes to overall academic quality. </p></li>
<li><p>I have no particular problem with the Peer Assessment survey. The reason that I don’t use the Teaching Excellence survey is that it is outdated and I believe that the current PA surveys cover the topic. As USNWR notes in its discussion of factors
</p></li>
<li><p>Far from “not insisting on the use of the subjective Teaching survey results”, you insist on using these results all the time. By my count you have started approximately 200 threads, with a vast number of them attempting to rerank schools. In addition to this thread, you have included the teaching excellence results numerous times. For example, on 1/5/09 you noted
However, in your 7/21/08 thread entitled “USNWR Rankings Adjusted for Teaching Excellence”, you went back to the 25% formulation for teaching excellence. On 6/18/2008, you used the teaching excellence numbers in your thread entitled “Among the USNWR Top 30, which are the most undergraduate-friendly colleges?” And let’s not forget your 10/17/07 thread entitled “Choosing a College for Classroom Teaching Excellence: Can we judge quality in 2008?” There have been numerous other times you have used these survey results as the basis for your arguments (but I’m not going to check all of them, because my eyes start to glaze over).</p></li>
</ol>
<p>1995?? </p>
<p>Is it reliable to use data collected about 14 years ago to rank the schools today?</p>
<p>It’s sort of like going to the zoo and thinking you are in the wild. You aren’t. The rankings are much the same. Good for gazing and contemplating and superficial gatherings. But not reality.</p>