<p>"A's don't require a 90% anymore? Grade inflation?"</p>
<p>I attended UVa and the University of Florida as an undergrad and MIT in grad school about 20 to 25 years ago, and even then, in math, engineering, and science courses, the cutoff for an A was usually well below 90 percent. As a faculty member at Michigan in the early 1990s, I tried to design tests so the average was about 60 to 70 percent. This doesn't mean that average students only learned 60 to 70 percent of the important material, but rather that I had designed a difficult test so the top students would not be killed by minor math errors and would be rewarded for answering some questions on the subtle aspects of the material. At UofM I was instructed to target an average grade of C+/B- for my course, which was easily achieved using a curve.</p>
<p>I always wanted to write tests that were relevant to the latest news (e.g., engineering problems drawn from the Persian Gulf War), and it would have been too difficult to come up with new exams that were both scaled exactly right and let the students see the relevance of what they were learning. Class averages varied from about 60 to 75 on my exams, and frequently one student would score at or near 100 percent on a given test, but students knew they would be ranked primarily relative to their peers. If I had sensed that the whole class had mastered the material particularly well relative to previous terms, I could have shifted all grades upward, but I never encountered this.</p>
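<p>For concreteness, here is a minimal sketch (in Python) of the kind of rank-based curve I mean. The percentile bands and grade points below are illustrative assumptions, not my actual cutoffs; the point is only that letter grades come from a student's standing relative to peers, so a class average in the 60s on a hard exam can still curve out to roughly C+/B-.</p>
<pre><code># Minimal sketch of rank-based curving: grades follow class rank,
# not absolute percentage. Bands and grade points are hypothetical.
from statistics import mean

GRADE_POINTS = {"A": 4.0, "B+": 3.3, "B": 3.0, "C+": 2.3, "C": 2.0, "D": 1.0}

# Illustrative percentile bands: top 10% get an A, next 20% a B+, etc.
BANDS = [(0.90, "A"), (0.70, "B+"), (0.45, "B"),
         (0.20, "C+"), (0.10, "C"), (0.00, "D")]

def curve(raw_scores):
    """Assign letter grades by class rank rather than raw percentage."""
    n = len(raw_scores)
    # Sort student indices from lowest to highest raw score.
    order = sorted(range(n), key=lambda i: raw_scores[i])
    grades = [None] * n
    for rank, i in enumerate(order):
        pct = rank / n  # fraction of peers scoring below this student
        for cutoff, letter in BANDS:
            if pct >= cutoff:
                grades[i] = letter
                break
    return grades

# A hard exam with a class average around 65 still curves to about B-/C+.
scores = [42, 55, 58, 61, 63, 66, 70, 74, 81, 98]
letters = curve(scores)
print(list(zip(scores, letters)))
print("mean GPA:", round(mean(GRADE_POINTS[g] for g in letters), 2))
</code></pre>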
<p>As for the AP exams, I do not know whether a 5 really represents mastery of the material, but if you wanted different schools to be able to use this information for admissions and class credit, couldn't the College Board release the raw scores as well? Then individual colleges could assign credit based on something other than the 3/4/5 scale.</p>