Devaluing of Class Rank ... A Good Move? Alternatives?

<p>Class rank should only be calculated on an even playing field. While I think the kids who had the money and time to take online AP courses over the summer, while our kids were on mission trips, at band camp, or in Governor’s Honors programs, should get their college credit, I don’t think classes taken outside of the usual stressful classroom environment should be used to pump up GPA and boost class rank; you’re not in that class.</p>


<p>Scattergrams don’t cause cheating…Students cause cheating! ;)</p>

<p>That said, my son’s GT school didn’t rank, but did provide distribution information.</p>

<p>I do think, however, that when you combine an UNWEIGHTED GPA with rank and known school rigor, you can pretty much guess whether the student fits the U’s profile.</p>

<p>What I think is more meaningful to an adcom is a method of determining HOW MANY students perform at a given level (e.g., a flag for grade inflation…)</p>

<p>Ranking often fosters a cutthroat environment that promotes strategically playing games with weighted rank and choosing classes so that one’s unweighted/weighted rank will be highest. Of course, you can’t fault students for playing the game, as many other driven students feel compelled to as well, but what kind of environment is high school becoming? It seems so contrived… kids bending over backwards, cheating, taking shortcuts just to show that they can be the most skilled hoop jumper. I don’t get it. In the end, I suppose you can look at this as a game theory situation in which colleges want to see an (arbitrary) ranking of students, and so students must follow suit and position themselves in the best light possible, but the optimal outcome might be to eliminate class rank outright. I can’t say this with much authority, though, as we don’t know for sure how selective colleges prefer ranked vs. unranked applicants, in reference to rodney’s point about college rankings (the game they themselves have to play).</p>



<p>But why would scattergrams incite the grade-grubbers or cheaters any more than the grades themselves already do? </p>

<p>The idea behind the scattergrams is that colleges can see students in the context of their individual classes. Given that I hear students fret all the time about how colleges won’t know what a hard grader a particular teacher is, the scattergram should reduce more stress than it causes. It will reassure these students that the colleges will indeed know how few A’s were awarded in some classes or how many students ended up at the B- side of the graph. </p>

<p>Keep in mind, too, that although students should be able to access these charts via their guidance counselor, just as they can access their transcript upon request, the scattergrams are not going to be posted on the classroom bulletin board. So students won’t even see them unless they ask to see them … and many won’t.</p>

<p>For the scattergram method, I’d like it if schools were required to report average test scores for each GPA range as well as the class-rank distribution across those ranges. My school has an average ACT of 30.5, with our top 10% (GPA range of 4.40-4.52) having scores of 35 or 36. The system would be flawed because it would make the school look like it focuses more on test scores than on coursework, and it may encourage more fraud in admissions, but I feel it would validate the rigor of a particular program more than AP scores alone would. Having said that, though, I’m fairly biased, since other public schools in my county offer higher GPAs (our highest is a 4.5; the highest at my base school is a 4.7) but don’t have quite the rigor of curriculum or depth of students ours does (the 4.7 person, for instance, has a 29 superscore).</p>
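<p>For what it’s worth, here is a minimal sketch (in Python, with entirely made-up numbers) of the kind of report described above: the average ACT score within each GPA decile of a graduating class. The data and the decile cut are assumptions for illustration only.</p>

<pre>
# Sketch of an "average test score per GPA range" report (all numbers made up).
from statistics import mean

def decile_report(students):
    """students: list of (gpa, act) pairs; returns {decile: average ACT}."""
    ranked = sorted(students, key=lambda s: s[0], reverse=True)
    n = len(ranked)
    report = {}
    for d in range(10):
        chunk = ranked[d * n // 10:(d + 1) * n // 10]
        if chunk:
            report[d + 1] = round(mean(act for _, act in chunk), 1)
    return report

# a tiny hypothetical class
print(decile_report([(4.52, 36), (4.45, 35), (4.20, 32), (4.00, 30),
                     (3.80, 29), (3.50, 27), (3.20, 26), (3.00, 24),
                     (2.80, 23), (2.50, 21)]))
</pre>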

<p>Also, to add to the debate above, it seems students would be better off not being able to see their ranks at all, but rather having scattergrams that demonstrate where they fall in a class. If one were to take the hardest class in the school and only come up with a high B, but that grade happened to be the highest in the school, then that should be reflected in the transcript. That is, as long as you hear “Oh, we don’t rank,” I seriously doubt you’d feel the need to sabotage other students when tutoring or anything to that extent.</p>


<p>This is exactly what is happening at my school! Three friends and I each took three AP classes our sophomore year. However, two of them took pre-calc and chemistry over the summer and hence were in the level above me. But, since they had the extra credits, they ended up being fourth and fifth while, by the same fluke, I ended up being tied for first.</p>

<p>This is why I don’t really believe in the idea of class rank. I know that my rank does not reflect the effort that I put in school compared to my classmates. Hence, it’s unfair that they worked harder and have a lower rank. The school’s solution? Make even more classes weighted (like orchestra) so that people can take these classes. To me it seems like more grade inflation.</p>

<p>I like OP’s idea of a scatterplot, or at least some sort of image or graph to show the distribution. The suggestion to report by decile (personally I favor 5% bands because of my large school) is also a good idea.</p>

<p>Don’t the colleges all do their own calculations of GPA giving weight to various classes as they see fit? If so, class rank seems less important.</p>

<p>Anyway, lots of good posts about class rank anomalies that suggest colleges should place less emphasis on sal/val.</p>

<p>Some actually do, and that’s a fairly valid point when you consider that a student who gets all 100s except in one bad class could have a lower rank than a student with 90s across the board, but when the student with 100s applies to Harvard he or she would have a higher GPA. Also, if you look at the Common Data Set for several select universities (Harvard and Berkeley come to mind), class rank isn’t as important as GPA. I think what schools try to do is establish a context of your surroundings that would otherwise be vacant without a class rank; however, in assessing rank, there could be more efficient methods, such as a scatterplot.</p>


<p>Harvard’s CDS demurely says that everything is “considered”. :)</p>

<p>Class rank is like the tax code. No matter what scheme you come up with, it’s unfair to someone. :D</p>

<p>Umm, according to this: <a href="http://www.provost.harvard.edu/institutional_research/common_data_set.php">The Office of the Provost | Common Data Set</a></p>

<p>The only things “NOT considered” are class rank, state of residence, and religion. Everything ELSE is just “considered” :p</p>

<p>My high school - thankfully - does not weight. Students aren’t penalized for taking courses outside of high school, taking fine arts, or carrying over the middle school math grades some students are eligible for. It can be irksome when a student who has taken few to no AP courses is ranked highly, but overall I think it causes less stress. Students who took lots of APs will be rewarded when colleges look at their courses.</p>

<p>What bothers me, though, is that last year I had class rank 7 (out of ~470). I received a 4.0, and yet this year I’m 8th. I would accept this if a student had taken a 7th-hour course or summer courses to jump over me - it wouldn’t be perfect, but it would be pretty fair. However, this jump is caused by a student in the Running Start program. All of our regular and AP teachers, and students, generally accept that Running Start (taking college courses at local colleges) is easier (the colleges around our school aren’t too difficult). These kids take 2-3 college courses (~5 hrs each) a week, have very little homework, and receive a massive number of credits for it. My 5 AP and 1 honors courses will grant me 3 credits in a semester. Running Start students gain 5 credits per class per semester. That allows for massive GPA inflation. These kids aren’t in my classes, and there is no way I could take a number of classes equal to the credit they receive. I don’t think ranking them on the same scale as the rest of the students makes sense.</p>

<p>Losing one spot in the ranking doesn’t bother me too much. It is what it is. But for the other hundred or so college-bound kids in my class, having dozens of Running Start students leapfrog over them doesn’t seem fair.</p>
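<p>To make the credit arithmetic above concrete, here is a minimal sketch (Python, with made-up grades and credit values) of a credit-weighted GPA. It only illustrates the mechanism being complained about: a 5-credit course moves the average five times as much as a 1-credit course.</p>

<pre>
# Credit-weighted GPA arithmetic (all grades and credit values are made up).
def weighted_gpa(courses):
    """courses: list of (grade_points, credits); returns the credit-weighted mean."""
    total_credits = sum(credits for _, credits in courses)
    return sum(points * credits for points, credits in courses) / total_credits

base = [(3.7, 1.0), (4.0, 1.0)]                      # two 1-credit classes, one A-
print(round(weighted_gpa(base), 2))                  # 3.85
print(round(weighted_gpa(base + [(4.0, 5.0)]), 2))   # 3.96; one 5-credit A nearly erases the A-
</pre>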

<p>My high school doesn’t do class rank. Think about it: I take 10 APs and get a 93 average while another kid takes 2 APs and gets a 98 average. So I take a rigorous course load and end up lower on the spectrum? Let’s face it, getting a 98 or A+ in a regular class is a joke and doesn’t require much studying.</p>

<p>" My highschool doesn’t do class rank. Think about I take 10 APs and get a 93 average and another kid takes 2 APs and gets a 98 average. So I take a rigorous course-load and end up lower on the spectrum? Let’s face it getting a 98 or A+ in a regular class is a joke and doesn’t require much studying. "</p>

<p>Ah. I take less issue with no weighting because we use a 4.0 scale that drops in increments: 3.7 (A-), 3.3 (B+), 3.0 (B)… So once you’re over a 94 you’re safe, and there’s less of an issue with regular classes being easier.</p>

<p>I think weighting is a good policy. Admissions officers probably already know that an AP/IB/Honors course is more difficult than a regular/academic course, but I think the incentive that GPA weighting gives students to take more challenging courses “outweighs” (haha :) ) the extra stress it puts on them to take as many APs as possible. In either case, I think class rank is not a very good ‘tell’ for admissions; however, admissions officers today probably understand that rank 10 and rank 1 are probably no different, except that maybe rank 10 decided to sleep more and, in that case, is probably a more sane human being :)</p>

<p>Do colleges actually use your number, or do they use your percentile? The Common App asks for a percentile. That makes sense, I guess, as there could be multiple people tied in a percentile, and it adjusts for school size. That is, being number 1 in a school of 100 is the same as being number 10 in a school of 1,000. It would be even better if the schools sent over some statistical data like the mean, median, standard deviation, and quartiles to give admissions a better idea. Kind of similar to a scatter plot, but more quantitative.</p>
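<p>Here’s a quick sketch (Python, with hypothetical numbers) of the percentile idea and the summary statistics mentioned above; the GPA list is invented purely for illustration.</p>

<pre>
# Converting a raw rank into a "top X%" figure, plus the summary statistics
# (mean, median, standard deviation, quartiles) a school could report.
from statistics import mean, median, pstdev, quantiles

def top_percent(rank, class_size):
    """Percent of the class ranked at or above this student (rank 1 = best)."""
    return 100 * rank / class_size

print(top_percent(1, 100), top_percent(10, 1000))  # both 1.0 -> "top 1%"

gpas = [3.95, 3.9, 3.7, 3.6, 3.4, 3.2, 3.0, 2.8, 2.5, 2.1]  # made-up class
print(round(mean(gpas), 2), median(gpas), round(pstdev(gpas), 2), quantiles(gpas, n=4))
</pre>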

<p>My high school has something like “number 5 out of 500” on the transcript. We have a 4.0 weighting scale, and AP/honors classes are weighted up to 5.0. I am actually guilty of taking fewer electives to raise my GPA. Furthermore, we do not use the A- = 3.7 thing because it is not used by the University of California.</p>

<p>“I think class rank is only meaningful when you compare within a school or at least very similar schools. Getting an A average at some is equivalent to a B or even C average at other, more demanding schools.”</p>

<p>This sums up everything.</p>


<p>This. Leniency in grading at my school is DEFINITELY unequal.</p>

<p>Here is a tale of two schools. One is the all-female version of the all-male school. Similar facilities, lots of sibling overlap, same SES, same size, and of course the same cost. The female school clings to a non-ranking policy to protect the self-esteem of its pupils and avoid competition. The boys slug it out.</p>

<p>Verdict? The boys collect admissions to the most prestigious schools, collect scholarships, and are seemingly happy. Last year TEN percent of the boys earned a Gates Millennium award. The girls have sent only TWO students to HYPS in the last decade and very few to the schools considered darlings on CC. No meaningful national awards in the past three years. In addition, very few even attend the state flagships. The students are not particularly happy, and there is a lot of infighting.</p>

<p>Bottom line? The lack of ranking has been a tremendous handicap. That’s what happens when the school answers requests for rank with QUARTILE info. Tell HYPS that your valedictorian is in an estimated top 25 percent and the consequences are real.</p>

<p>PS This is a real story.</p>

<p>Here’s another situation where rank hurts. In a compromise move, my DS’s high school decided to eliminate rank, but it will rank by decile if the student wants it on their transcript. So the top students are only listed as top 10%. If you calculate the Academic Index using Michele Hernandez’s formula, a student who is in the top 10% will fare far lower than one who omits rank and just uses their GPA, by a wide margin. A student with a 2300 SAT and 4.4 GPA who doesn’t include rank may grade out at a 233 (9/9) on the formula, while the same student who lists his/her rank as top 10% will score out at 220 or so (6/9). It was counterintuitive, but many high-ranking students at my son’s high school chose not to include rank at all rather than be lumped in at top 10% when in actuality they were top 2% or top 3%. The school, in an effort to please all, pleased few.</p>

<p>Scattergrams at the individual class level are impractical. OTOH, a scattergram of the GPA (to a minimum of 2 decimal places) would be quite easy to create, intuitive to interpret, and would provide a true sense of a student’s class rank.</p>

<p>Similarly, the total number of APs offered by a school is a good measure, but it can be augmented with histograms of the number of APs taken and the number of honors courses taken across the graduating class, which would shed additional comparative light on a student’s academic rigor relative to his or her own peers.</p>
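<p>As a rough illustration of the two summaries suggested above (again Python, with invented numbers), a school could ship a GPA distribution and a histogram of APs taken per student alongside the transcript:</p>

<pre>
# Text-mode "scattergram" of class GPAs plus a histogram of APs taken
# (all data below is hypothetical).
from collections import Counter

gpas = [3.98, 3.95, 3.91, 3.80, 3.72, 3.60, 3.41, 3.30, 3.05, 2.80]
aps_taken = [10, 8, 8, 6, 5, 5, 4, 3, 1, 0]

gpa_hist = Counter(round(g, 1) for g in gpas)   # bucket to one decimal for readability
ap_hist = Counter(aps_taken)

for bucket in sorted(gpa_hist, reverse=True):
    print(f"GPA {bucket:.1f}: {'*' * gpa_hist[bucket]}")
for n in sorted(ap_hist, reverse=True):
    print(f"{n:>2} APs: {'*' * ap_hist[n]}")
</pre>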

<p>(posting w/o the benefit of reading the Higher-Ed premium content article)</p>