<p>The Montgomery County (MD) school district is using easily available data to track its graduates' success in college, and to try to identify which educational program elements make a positive contribution to college success.</p>
<p>Everyone should be doing this. I'm certain not everyone will, but I bet this sets off a real competition among elite private schools and school districts to use this data both to convince others (parents, public, politicians, admissions officers) that they are doing a good job, and to improve the job they are doing.</p>
<p>Much of the data is depressing. About half of all students who go to college have to take remedial math. The figure is much higher at community colleges. The CC where I work had a big pow wow with our public school system math and English department chairs a few years ago and the consensus from the high school teachers was that the colleges should change. </p>
<p>If you look at high school English curricula across the country, almost all writing assignments (if students are lucky enough to have teachers with time to grade any relevant assignments) are literature-based. Yet many college students may never take a literature class. Business, science, and engineering students take writing classes specific to their majors after they finish the basic comp class. There is a huge disconnect between what students are getting and what they need. </p>
<p>Maryland is very grateful for Montgomery County because it brings up the rest of the state in nearly every ranking.</p>
<p>Our high school would probably rather NOT have this data. Or, if they already have it, they would not wish to disclose it because it would only make them look bad. It's much better PR for them to publish numbers showing 90% of graduates go on to college than any data revealing how many need remedial courses or how many graduate in four years. They also like to promote largely irrelevant data about class sizes and teacher credentials, neither of which has been shown to be significant in improving student achievement.</p>
<p>Unfortunately, public schools care more about inputs than outputs.</p>
<p>Apparently, it costs just $425 per high school per year. Every high school should be using this data.</p>
<p>^ Good point. Especially for the parent shopping for high schools. But maybe the Clearinghouse doesn’t sell the reports to individuals, although it looks like you’ve discovered a market for them.</p>
<p>The University of Washington shares its data with Seattle Schools and has been warning the district about student math performance. The district responded by implementing a new curriculum that the math department dislikes and which resulted in a successful lawsuit blocking its implementation. Data are nice, but using data wisely is another matter.</p>
<p>I’ve been thinking a little more about this.</p>
<ol>
<li> How the heck do they match up the college data and the AP data like that? Wouldn’t they have to have the kids’ names to figure out how many college graduates had taken but failed an AP test?</li>
</ol>
<p>I continue to think this is a great idea, but from a privacy standpoint I wonder how many of the students included in this data understand that their high school’s superintendent has access to important information from their college records, with their names attached?</p>
<ol start="2">
<li> Of course, it isn’t completely clear to me that the data (which of course I haven’t seen) support a recommendation that more kids take AP courses in high school. Sure, kids who take AP courses in high school do better in college than kids who don’t, even if they don’t pass the AP test. But is that because taking the AP course provided important college preparation, or because the population of students who decide to take AP courses on their own, or who are encouraged to do so by their teachers without administrative prodding, is a generally more successful population of students than the population of students who don’t take APs at all? Anyone want to place a bet?</li>
</ol>
<p>I suspect at least part of the reason for improved college performance by students with AP courses is due to the AP experience, but part is due to the self-selection factor. However, I disagree with the “APs for everyone” crowd because that inevitably leads to watered down AP courses.</p>
<p>To track academic outcomes like that, data must be collected at the individual student level (which we know the Clearinghouse does for its original purpose of tracking students who transfer to other colleges). But it should be possible to produce aggregate reports like that while still following industry-standard data-privacy practices, so that no individual student records are revealed to persons who have no legitimate need to see them. I don’t know what this organization’s actual data-privacy practices are, but I’m glad they are gathering their data and reporting interesting findings like that.</p>
<p>In Colorado we have a statewide student identifier for all students in public schools preK-16, and there is quite a bit of analysis being done on college performance broken down by individual high school. Just this past February, the Colorado Commission on Higher Education released its newest Remedial Education report, which shows, for every public high school in the state, what percentage of its recent graduates who went to Colorado public colleges needed remedial reading, remedial writing, or remedial math, as well as the percentage requiring remediation in more than one subject (reading and writing, reading and math, writing and math, or reading, writing and math). A separate cut of the data shows the percentage of students at each of the public colleges who need each of the remedial subjects. </p>
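<p>A report like that is essentially a group-by over linked student records. A minimal sketch of the computation, with hypothetical field names and toy data (nothing here reflects the actual Colorado schema):</p>

```python
from collections import defaultdict

# Hypothetical student-level records, linked by a statewide ID upstream:
# each record carries the graduate's high school and which remedial
# subjects (if any) the student was placed into in college.
records = [
    {"high_school": "East HS", "remedial": {"math"}},
    {"high_school": "East HS", "remedial": set()},
    {"high_school": "East HS", "remedial": {"math", "writing"}},
    {"high_school": "West HS", "remedial": set()},
]

def remediation_rates(records):
    """Percent of each school's college-going graduates needing
    remediation in each subject, plus in more than one subject."""
    totals = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))
    for r in records:
        school = r["high_school"]
        totals[school] += 1
        for subject in r["remedial"]:
            counts[school][subject] += 1
        if len(r["remedial"]) > 1:
            counts[school]["multiple"] += 1
    return {
        school: {subj: 100.0 * n / totals[school] for subj, n in subs.items()}
        for school, subs in counts.items()
    }

rates = remediation_rates(records)
# East HS: math remediation in 2 of its 3 college-going graduates
```

<p>The point is only that once records are linked by a common identifier, the per-school percentages the report publishes fall out of a simple aggregation.</p>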
<p>It is important information, and as higher ed gets more and more financially constrained, they’re finally beginning to exert some real pressure on the K-12 system to fix the problem.</p>
<p>There’s a huge difference between reporting data that would be in one institution’s database (e.g., a student’s college grades, and whether he needed remedial instruction, as well as his high school, would all be in the college’s database), and matching up granular information from two different institutions (e.g., college performance vs. high school curriculum). The latter would be much more powerful, but essentially requires a lot of individual tracking. My understanding about initiatives such as Colorado’s is that eventually the state or independent researchers – who would not have access to names, just numbers – would be able to analyze the data across institutions, not that high school administrators would have access to individual information about how their graduates did in college.</p>
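<p>One standard way to allow that kind of cross-institution matching without handing researchers names is pseudonymous linkage: each institution replaces the student identifier with a keyed hash before sharing, so extracts join on the token but the name or raw ID never leaves. A minimal sketch of the idea (the key and field names are hypothetical; this is not a description of what Colorado or the Clearinghouse actually does):</p>

```python
import hashlib
import hmac

# Hypothetical secret held by the linking authority, never given to researchers.
SHARED_KEY = b"hypothetical-state-linkage-key"

def blind(student_id: str) -> str:
    """Keyed hash: the same student yields the same token at every
    institution, but the token cannot be reversed to a name or ID."""
    return hmac.new(SHARED_KEY, student_id.encode(), hashlib.sha256).hexdigest()

# High school's extract: blinded ID -> curriculum information.
hs = {blind("CO-123"): {"ap_calculus": True}}
# College's extract: blinded ID -> outcome information.
college = {blind("CO-123"): {"remedial_math": False}}

# A researcher can join the two extracts on the token alone.
linked = {tok: {**hs[tok], **college[tok]} for tok in hs.keys() & college.keys()}
```

<p>Whoever holds the key could still re-identify students, which is why the key stays with the state or a trusted third party rather than with the analysts.</p>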
<p>Not that I have so much trouble with that, but I sense lots of people would.</p>
<p>As for the study in the AP book, I think it stops one level short of what I think would be important. It’s not a question of tracking, or academic background. In many schools, the students with worse academic backgrounds and little prior science who suddenly turn up in AP science courses are students whom the teachers have identified as “diamonds in the rough”. That’s certainly how it worked in my kids’ school. And it’s not so surprising that those students would later outperform their peers who were not so identified and encouraged.</p>
<p>arabrab–Wow. Are the people in Colorado happy, sad, shocked with the numbers that have been reported? I don’t think Maryland will ever do anything like that because the numbers would not be pretty. There would be some very good numbers, but some of the others would be awful. Here people would be shocked that taxpayer dollars pay for mediocre public high schools and then they subsidize the state colleges as well for students to learn what they should have learned in sixth grade. Most people around here don’t seem to associate public colleges with tax money because there is tuition involved. I would love to see the numbers for Maryland that Colorado is publishing.</p>
<p>I’m an ed policy person, so I was pretty unhappy, though it wasn’t really news to me. It did get some good press in the Denver Post, and I’ve been showing the data off at presentations for middle school parents. But before you can fix a problem, you need to know that there IS a problem. This does a good job of identifying certain problem areas. The top five public high schools in Colorado had college remediation rates in the 6-10% range. Typical high schools were in the mid 30-40% range, and many of the urban high poverty high schools were in the 60% and up range. One change that came out of a prior analysis was to require an additional year of math in high school as a graduation requirement, and to eliminate “business” or “consumer” math as a qualifying option. </p>
<p>IMO, while the urban high poverty numbers are likely to be very hard to move, a little sunshine on how poorly many of the suburban middle-class high schools did is going to do a world of good as educated parents begin asking the questions about why Amanda’s school has a 40% math remediation rate when Stephen’s school in the next town over has a 25% rate. (The other thing I remind folks is that these are kids who both graduated from high school AND matriculated in a Colorado public college soon thereafter. It doesn’t include dropouts or students who didn’t go on to public college.)</p>
<p>One of the things that I like to point out to middle school parents is how expensive remediation is in college, how it increases the chance that the student won’t actually finish college, and how it is strongly associated with not being able to graduate within four years, because of the need to take the remedial coursework before being allowed to take the regular required sequence of classes. The pocketbook issues are pretty significant. The other thing is that when you’re talking about a middle class school where 40% of the kids need math remediation, you’re talking about a LOT of kids – not something that can be attributed to “those” kids with the poor parenting, special needs, or other frequent explanations for why the data isn’t really meaningful.</p>
<p>With respect to privacy, my understanding is that the data is blinded, and neither the colleges nor the high schools see the results for any specific student. On other state data, if there are five or fewer students in a given reporting category, the data is asterisked out and instead aggregated at a higher level.</p>