What grade would the College Board AP Boss get in AP statistics?

<p>Trevor Packer, College Board's executive director for the AP program, sent out a memo forwarded to various AP teacher email lists uncritically citing a statistical analysis reported in Jay Mathews' article in today's Washington Post.</p>

<p>You can see Mr. Packer's entire email in the public archives of the AP statistics teachers list at:
<a href="http://mathforum.org/epigone/ap-stat/drothunggler%5B/url%5D"&gt;http://mathforum.org/epigone/ap-stat/drothunggler&lt;/a&gt;&lt;/p>

<p>In his email, he writes: </p>

<p><< Especially profound are the findings demonstrating the impact of AP on students who attempt an AP course, but then only score a 1 or 2 on the AP Exam. The journalist [Jay Mathews] reporting this new research from the National Center for Educational Accountability goes so far as to say: "Every AP teacher in the country should copy that chart, blow it up to three-by-four foot size and tape it to the wall of his or her classroom." I encourage you to take a look at this article, and to share it with those who need more information about the impact AP is having on students, even those who may not succeed on the AP Exam.>></p>

<p>and he points the AP teachers to Jay Mathews' article containing the following table:</p>

<p>AP's Impact on Texas Students
Percent of Texas high school students receiving bachelor's degrees from Texas colleges and universities within five years of graduation:</p>

<p>Group (number of students): scored 3 or better on an AP exam / took an AP exam but scored 1 or 2 / did not take an AP exam
Anglo (47,647 students): 57% / 43% / 22%
Hispanic (19,868 students): 47% / 26% / 8%
African American (7,813 students): 42% / 36% / 11%
Low-Income (22,028 students): 40% / 24% / 7%
Total (78,079 students): 57% / 37% / 17%</p>

<p>The first column lists the figures for students who passed an AP exam with a score of 3 or better, the second column lists the figures for students who took an AP exam but did not pass (scored a 1 or 2), and the third column lists the figures for students who did not take the exam. Thus, for example, among Hispanic students, 47% of those who passed the AP exam got their bachelors within 5 years of high school graduation, compared to 26% of those who got 1's or 2's, and only 8% of those who did not take an AP exam.
(Source: Jay Mathews' article at
<a href="http://www.washingtonpost.com/wp-dyn/articles/A6900-2004Nov23.html">http://www.washingtonpost.com/wp-dyn/articles/A6900-2004Nov23.html</a>)</p>

<p>Thus, what Jay Mathews and Trevor Packer are claiming is that these figures clearly demonstrate that the impact of taking AP is to greatly raise the probability that a given student will complete college within 5 years of high school graduation. </p>

<p>Not surprisingly, the AP statistics teachers are rather skeptical of this simplistic analysis. Some of their responses to Mr. Packer's email point out that this is a classic illustration of how "correlation does not establish causality." Clearly students who CHOOSE to take AP exams differ in many important respects from those who choose not to take the exams. On average, those who choose to take the exams will tend to be more motivated and to have stronger academic backgrounds to begin with. </p>
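<p>To make the teachers' point concrete, here is a small, purely illustrative simulation (the numbers are invented, not drawn from any real data). A single lurking variable, call it "motivation," drives both the decision to take an AP exam and the chance of finishing college, while taking AP is given zero causal effect; the exam-takers still finish at roughly twice the rate of everyone else.</p>

<pre>
# Illustrative only: a lurking variable (motivation) produces a large gap in
# completion rates even though taking an AP exam has no causal effect here.
import random

random.seed(0)
students = []
for _ in range(100000):
    motivation = random.random()               # latent trait between 0 and 1
    takes_ap   = random.random() < motivation  # motivated students take AP exams more often
    graduates  = random.random() < motivation  # ...and also finish college more often
    students.append((takes_ap, graduates))

for label, keep in [("Took an AP exam", True), ("No AP exam", False)]:
    group = [grad for took, grad in students if took == keep]
    print(f"{label}: {100 * sum(group) / len(group):.0f}% finished college")
</pre>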

<p>One imagines that if Mr. Packer and Mr. Mathews submitted such a superficial analysis on an AP Stats free-response question, these AP Stats teachers would not give them a very high grade!</p>

<p>Now Jay Mathews freely admits that he is befuddled by statistics, graphs, and charts. He states right up front in his article: "I do not use many charts and graphs in this column. They can be confusing, and I often don't understand them anyway." Like many journalists, he is clearly more comfortable with words than with numbers. I wouldn't necessarily expect him to ace the AP statistics exam himself.</p>

<p>But the College Board is supposedly run by "psychometricians," who should, in theory, be careful in their use of statistics to support their conclusions. At a minimum, one would expect the head of the College Board's AP program to be capable of statistical analysis that would pass the AP statistics exam! </p>

<p>So I was pretty struck by Mr. Packer's failure to note that there are important statistical caveats that should be taken into account in drawing any conclusions about "the impact of AP on students who attempt an AP course."</p>

<p>Well, this might not be perfectly statistical, and I'll be damned if I understand stats at all, but it's an interesting correlation. Of course, there are so many situations that could contribute to this, especially because APs are not free. But with common sense, some reasonable inferences, and the realization that the chart is not absolutely correct, I think this chart demonstrates how APs impact students' performance in college.</p>

<p>To wade into this discussion, homeschoolmom is objecting to the idea that the stats presented "demonstrate how APs impact students' performance in college." I agree that there is a correlation between students taking APs and their performance in college; what is NOT demonstrated is the impact of AP on college performance. As homeschoolmom points out, students who take APs are more motivated to begin with, and probably feel better prepared to take APs than those who do not. One would thus assume that, in college, they would benefit from this greater motivation and better preparation than their peers who did not take APs. This would take the form of better performance in college. But this better performance does not demonstrate the impact of AP.</p>

<p>You are all correct.</p>

<p>I'm a big fan of Jay, but, in this case, correlation does not equal causation due to numerous externalities. My S is taking AP stats, and just had that type of question on his recent mid-term. I think I'll print out the article and see if he can find the logic error. :-)</p>

<p>I also like Jay Mathews a lot, and while I agree that the chart he printed does leave a lot of variables out, I think the chart does support Jay's general argument (and it's an argument that he makes a lot): the best way to prepare kids for college is to challenge them in high school. He happens to believe that AP/IB is the best way to do that, and he strongly believes that schools should be encouraging, and allowing, ALL their students to take these challenging courses. Jay makes a good point that if we let college-bound kids slip through HS without getting a taste of college-level work, they are far more likely to fail when confronted with a college workload. I think the chart, IN GENERAL, supports his argument, even though it doesn't address a number of important issues. </p>

<p>Jay has a good habit of re-printing e-mail reactions to his columns, even when (as in one case) they compare him to Hitler. So if you really have issues with the chart, go ahead and e-mail him. I've found him to be remarkably accessible for a big-city journalist. </p>

<p>On a related note, the WSJ had an interesting article last week on how "elite" high schools are opting out of the AP program -- I look forward to Jay's response to this.</p>

<p>*So if you really have issues with the chart, go ahead and e-mail him.*</p>

<p>I don't have issues with the overall thrust of Jay's message that challenging high school classes are the best way to prepare for college. </p>

<p>And, as a practical matter, for the typical public high school, AP or IB courses are probably the most accessible route for challenging students. (The private schools that are abandoning AP have faculty with very high levels of education and a low student-teacher ratio that frees up time for inventing their own advanced curricula, tailored to the needs, interests, and backgrounds of their students. It's also interesting to note that their students typically take the AP exams ANYway, even though they may not take AP courses to prepare for them. Their generally strong intellectual preparation, along with some self-study, often enables them to do quite well, despite not taking classes targeted specifically at the test. Homeschoolers often do the same thing, by the way. They often read very widely and write very prolifically in a self-designed program that doesn't particularly relate to APs; then, down the road, a bit of structured study aimed at filling in a few gaps may enable them to do very well on the exams.)</p>

<p>And I don't have a huge problem with Jay's use of the chart--he's a journalist, not a psychometrician, and his readers will presumably take that into account. </p>

<p>What I DO have a problem with is the head of the College Board's AP program uncritically endorsing this analysis in his memo, publicly posted on the web in several places, without any mention of the statistical caveats that any responsible psychometrician should attach to such an analysis.</p>

<p>It seems to me that the College Board has a responsibility to set a better example of the responsible use of statistics to draw conclusions from data!</p>

<p>Again, I want to stress that I don't necessarily disagree with the end conclusion that APs (or IBs) are a valuable preparation for college, just the uncritical use of statistics to support that conclusion.</p>

<p>And, by the way, it is entirely possible that a more complex study might demonstrate that students who take IBs have higher success rates than students who take APs (or vice versa)....and I suspect that whichever organization came out on the bottom of such a study would be the first to point out the statistical caveats that need to be applied in interpreting the data. The populations of students who CHOOSE to take a given course or exam may be different a priori.</p>

<p>Oh I absolutely agree with you on that score, homeschoolmom! I do get tired of these organizations that take any and every positive snippet about their program/school/test and make a big promotional press release out of it. Funny how when a school drops from, say, 45 to 49 in USNWR the school never says a peep (or at the most discusses the inherent weaknesses in all ranking systems, etc. etc. ), but look what happens when they go from 45 to 41!</p>

<p>Jay is back with an update:</p>

<p><a href="http://www.washingtonpost.com/wp-dyn/articles/A30972-2004Dec28.html%5B/url%5D"&gt;http://www.washingtonpost.com/wp-dyn/articles/A30972-2004Dec28.html&lt;/a&gt;&lt;/p>

<p>Great reading for those that need support to expand/maintain honors/AP courses.</p>

<p>Thanks, bluebayou, for the update.</p>

<p>One of many things Jay says is this:</p>

<p>"At the very least, the UC undergraduate admissions system will have to rethink its 22-year-old policy of adding a full grade point, making every A worth 5 points rather than just 4, to any grade in any course labeled college-level or honors, even if no college-level exam is given to verify its rigor. "</p>

<p>Our high school does NOT award any weight for an AP course UNLESS the AP exam is taken. I assumed every school in the country (of the ones that weight) did this. I may be wrong. But it certainly seems wrong of the UC admissions system to add a point if the exam wasn't taken.</p>
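<p>Just to spell out how much a full extra grade point is worth, here is a toy calculation of the UC-style bump Jay describes (any grade in a weighted course counts one point higher, so an A becomes a 5). The course list and grades are hypothetical, and I'm assuming a straight unweighted average, which may not match how any particular school or the UC system actually computes GPA.</p>

<pre>
# Hypothetical example of a one-point bump for "honors"/AP-labeled courses.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa(courses, apply_bump):
    # courses: list of (letter grade, course is labeled honors/AP) pairs
    points = [GRADE_POINTS[grade] + (1.0 if (labeled and apply_bump) else 0.0)
              for grade, labeled in courses]
    return sum(points) / len(points)

schedule = [("A", True), ("A", True), ("B", True), ("A", False), ("B", False)]
print(f"Unweighted GPA: {gpa(schedule, apply_bump=False):.2f}")   # 3.60
print(f"With the bump:  {gpa(schedule, apply_bump=True):.2f}")    # 4.20
</pre>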

<p>Thanks for posting this article, bluebayou. It's much better than the previous one.
I've read about schools that have AP classes where very few students take the AP exam; I've been rather perturbed by this phenomenon. Without the validation of the exam, there is no way to gauge the quality of the class or the level of preparation of the students. Unfortunately, since seniors take AP exams long after admissions decisions have been made, there is currently no way for colleges to evaluate the AP classes seniors are taking. The solution, however, could be for the school profile to include more information about AP classes: the AP exams taken (and the proportion of students taking them) and the scores achieved by previous exam-takers. Since scores are sent to schools, the schools could easily compile such data and include them in the school profile. Then everyone could see whether an AP class is worth a lot more than a non-AP class at that particular school.</p>
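<p>Since the scores do come back to the schools, compiling the kind of profile data suggested above would be straightforward. Here is a rough sketch of what the tabulation might look like; the class names, enrollments, and scores are invented placeholders, not figures from any real school.</p>

<pre>
# Rough sketch: for each AP class, report enrollment, how many took the exam,
# and the score distribution, suitable for inclusion in a school profile.
# All data below are made-up placeholders.
from collections import Counter

ap_records = {
    "AP U.S. History": {"enrolled": 120, "scores": [5, 4, 4, 3, 3, 3, 2, 2, 1]},
    "AP Chemistry":    {"enrolled": 30,  "scores": [5, 5, 4, 4, 3, 3, 3, 2]},
}

for course, rec in ap_records.items():
    took = len(rec["scores"])
    dist = dict(sorted(Counter(rec["scores"]).items(), reverse=True))
    print(f"{course}: {rec['enrolled']} enrolled, {took} took the exam "
          f"({100 * took / rec['enrolled']:.0f}%), scores {dist}")
</pre>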

<p>nedad:</p>

<p>Our school does not weight. Period. Many schools add part of a point for honors and one point for AP classes. Since AP scores come back in July, the class grade is not dependent on the exam score. Your school must be in a minority in taking the AP score into account, and could only do so for junior year and before, not for senior year, surely?</p>

<p>nedad:</p>

<p>Our school adds a grade point for UC-approved classes, which include honors and APs, regardless of whether the kid takes the AP test (UC does not require a kid to take the AP test). </p>

<p>My S had great frustration last year in his AP course since less than half of the class took the AP test (most were seniors who had already gotten into first choice U), and, therefore they didn't want to participate in study groups or extra credit sessions to prep for the test. Ditto for this year: his AP teacher actually jokes that he 'hates' seniors since they tend to get senioritis, skip the test and the work to prepare for it.</p>

<p>Marite: I think our profile provides info on the % of kids who take the AP tests, but I'm not sure -- I'll have to check. (I've seen the data in other sources, but have yet to ask for the official profile.)</p>

<p>My son's school also does not weight for transcript purposes (they do weight somewhat for National Honor Society). Their position, and I tend to agree, is that colleges don't pay attention to a HS weighting anyway -- the college Adcoms all have their own way of looking at an application, including AP classes, AP tests, etc.</p>

<p>The final calculation of weighted grades is done at the end of the first semester of senior year - for purposes of class rank only, which is the only reason weight matters here anyway - and seniors must sign a paper saying they WILL take the test. So I believe they do weight the senior APs. Also the student must pay in advance for the test.
I don't know if there are any ramifications for those who pay and sign the statement, but then don't take the test, as I have never heard of anyone doing this. </p>

<p>However, as with other schools, the exam grade of course does not figure into the course grade, since, as you point out, it comes back too late.</p>

<p>Thanks, nedad. I am really ambivalent about both the exams and the weighting.
On the one hand, without the validation of the exam, the quality of AP courses can vary hugely. It is known that in some schools practically no one takes the exams, yet students can still lay claim to being in AP classes.<br>
Our school has decided to encourage as many students as possible to take APs. This means not demanding that everyone take the exam and allowing in some students who are not as well prepared as they should be. My S's history teacher mentioned this as a concern one month into the school year; she told me that she would no longer try to accommodate the slower students so that she could cover all the material in the class. Next year, the school is launching AVID to help those slower students without barring them from APs.
When my S took the AP Chem exam, all the students in his class took it, and only 2 from the other class did--evidence of the difference in quality of the teachers.
Students are not required to pay the exam fees until some time in the spring--which would be after the end of the first semester, and of course, well after college applications have been sent out. I believe that class rank is calculated at the end of junior year and then again some time in senior year--I'm not sure exactly when.</p>

<p>Wow...our school not only weights AP classes 1.3, but also weights pre-AP classes 1.15. It is calculated into the GPA and the class rank standings. I really wish they didn't do that. While we have many students who take upwards of ten AP classes between their junior and senior years, we have on average 3-4 AP Scholars. I question what kind of preparation they receive.</p>

<p>While our school does have open enrollment for AP classes, I find that more often than not this is abused to inflate rank into the top 10% in order to secure an auto admit to UT or A & M. There are students who do not want the extra rigor, just the grade inflation, and unfortunately there are teachers willing to go along with this game. The amount of extra credit given to students with low AP Government grades this last term was ridiculous. My son reported that students who had been making below a C on the practice AP exams, which are part of the grade, still made low As for the term. There is absolutely no correlation between the grades received and success on the AP exam. Last year this teacher had 2 5s, a handful of 4s, and mostly 1s and 2s. So theoretically she should have given 2 As, a handful of Bs, and the rest Cs or failing grades.</p>

<p>More examples: over 300 students took AP U.S. History, but only 68 took the exam, and while the "pass" rate was up slightly over the previous 10% figure, there were only 3 "5s". Over 60 students enrolled in AP Psychology, 4 took the test, and all of them received 1s.</p>

<p>What really helped my sons with their AP exams was independent study and debate. They learned more about writing and constructing arguments, using supporting research, etc., from participating in speech and debate events. Maybe they should also look at how many students in that "pass" group bought study guides and studied independently, or participated in extracurriculars such as music, writing, academic team, chess, debate, etc.?</p>

<p>nedad:</p>

<p>Does your school make any accommodation for kids who want to self-study for an AP exam, i.e., not take the school's class, or study for and take an exam that the school does not offer?</p>

<p>Texastaximom:
The situation you describe is the one that makes me so ambivalent about the AP program. At least in our school, since grades are not weighted, the struggling students are not in APs in order to inflate their grades; kudos to them.</p>

<p>Marite: Exactly! It's become a game here. I fault the top ten percent rule for a lot of it. Since it is virtually impossible to be admitted to UT Austin or A & M College Station from outside the top 10% if you are a Texan, people will pad the GPA in any way they can. Our remediation rate has skyrocketed, so the college prep isn't happening.</p>

<p>texastaximom, it sounds like the AP program and the 10% rule could be the poster child for the law of unintended consequences... </p>

<p>The more I read this thread the more I appreciate the way my S's school approaches it. They acknowledge the more difficult level of AP for purposes of National Honor Society membership, but don't really sweat it for other situations like the transcript. My understanding is that they don't weight GPA for class ranking, either. I suppose that can ruffle a few feathers when it comes time to determine the valedictorian, but to be honest I've never had much sympathy for the participants in the annual "valedictory battles" that pop up around the country like weeds every May. I remember watching a classmate of mine reduced to tears as he argued that his GPA had been calculated incorrectly. He was right -- and as a result he was allowed to sit in the third chair on stage rather than the fourth chair.</p>