Grade inflation at colleges: good thing? or bad?

Are the standards lower now? I have never been a fan of curves. Why penalize someone for being in an amazing cohort, or reward them for being in a lousy one? What work has been done to put exams taken 20 years ago to the regrading test?

I can think of many reasons that grades might be legitimately higher. Some are good - more transparent info sharing, better access to expectations, additional materials, help, etc. Some less so - avoiding classes where one might not do well. The latter is mentioned in Excellent Sheep, for example.

If grades will matter for jobs, students who are thinking about that may not take an intro class in CS for the exposure, or may not brush up on Spanish even if it’d make them more marketable. This seems to me to be the bigger worry.


My experience: undergrad in the early 80’s. STEM at a large public university.
Massive grade deflation.

It’s all about how a university wants to build its reputation.

“Wow, if THAT guy has a 2.3 GPA, then can you imagine how good someone with a 3.2 GPA must be… that sounds like a really good school”, etc.

Most elite colleges allow students to take some courses on a pass/fail basis, if the courses are non-essential to the students’ majors or the colleges’ core requirements. So I don’t think grade inflation is necessary for students to sample courses outside of their main interests in college.

Not my kid’s experience!

But it might even keep a kid away from something beyond sampling. I knew kids in college who hung in there with Chinese and Arabic because they were interested. And I know kids today who have walked away from “harder” languages because it’d hurt their GPA.

When I was in college, the grade you might get wasn’t a factor in course selection. How much reading, whether there was a big paper, how engaging the prof, time of day – that might have been a consideration!

This^. I know some pretty stellar students at Harvard. In my opinion, there should be no concern that the high GPAs there might reflect less learning: students there are, on average, extremely bright AND working unbelievably hard and learning amazing amounts. I’m sure you can find studies/reports of the number of hours students study at various schools, and I’m confident you’ll find the students at H working exceptionally hard and at the library on the weekends much more than at many schools that have lower GPAs. I do not believe the fact that average GPAs are high there means that students are not pushing themselves, working hard, and excelling in learning.

The students I know did have several classes with the curve situation described above, with median exam scores around 35%. This is with a student body made up primarily of exceptional high school students (yes, some small percentage may not be as strong, since their hooks contributed heavily to their acceptance, but the vast majority are extremely impressive). They didn’t face this situation by saying, “Well, I’m not going to study, because I can count on a strong curve and I’ll probably wind up with a good grade.” Rather, the thinking is, “Holy cow, that exam is going to be unbelievably difficult. If I don’t work my butt off, I won’t be able to answer ANY of the questions, and I’ll get a zero.” The difficulty of these exams is unlikely to be similar to exams at schools that are far less selective.

I’m not sure why median grades keep rising, but I agree that it’s likely a combination of factors: pressure to help students get into fields/grad schools where certain GPAs are required, but also a rising caliber of students at top schools (perhaps as a larger percent of students go on to college over time, making admission more competitive; I also think the intense competition to get into top schools has trained kids from a young age to study like crazy). To me it seems clear that students at top schools are working harder (on average) than when I went to school. So they may be brighter (due to increased competition while the number of seats hasn’t really grown) AND working harder. So I guess some amount of rise in GPA shouldn’t be that surprising.


I know it’s anecdotal, but it certainly matches my own experience and the realizations I started having while watching my daughter go through high school and then college.

I had always been among, or at, the top of my class, in high school and beyond - but I was floored when I saw the discipline and work hours that my daughter (and her high-achieving peers) put in (without me standing there with a whip).

I’m not exaggerating when I say I probably put in a third of that effort, if that much. I suspect my class rank today would be barely “middle range,” while excelling in just a few subject areas.

Also, the quality and depth of the material my daughter studied in HS was way beyond anything I touched (in chemistry and biology, English literature and language analysis, foreign language proficiency, …).

At least in my own little sampling group that is certainly true.

And yes, her grades outpace the “great” grades I always thought I had - and it’s well deserved.


So - something becomes real by the mere fact that its existence is being suggested?

Not every “increase” is “inflation”. If the price of a good has increased because it now has added qualities, then there was no “devaluation”. It was simply an increased price reflecting an increased value.

I can honestly say that today’s students I’ve met as my daughter’s peers are better educated, in breadth and in depth, than I was. I could not pass tests (and not because of forgetfulness) that students today are acing - in subjects that I used to do very well in at school.

So maybe the steady increase in grades does represent the better education level of students in top schools.

On the other hand, that doesn’t mean the grading system at colleges couldn’t be revisited and improved upon. Maybe it’s time for colleges to agree to adopt weighted/unweighted GPAs nationwide, as we know them from high school, as a condition of accreditation?

An “A” in a standard-level class would NOT be worth as many GPA points as an “A” in an “honors” or “genius-level” class. Unweighted GPAs would range up to 4.0, but after a “boost” for difficulty level, the weighted scale would range up to 5.0.

This way a B- in a difficult class would have a similar GPA effect to an A+ in an easy class - thus rewarding those willing to explore a more “rigorous” course level.
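To make the arithmetic of that idea concrete, here is a minimal sketch in Python. The grade table, the one-point “honors” boost, and the example courses are illustrative assumptions for this thread, not any college’s actual policy:

```python
# Hypothetical weighted-GPA sketch: unweighted points top out at 4.0,
# and an assumed one-point rigor boost pushes the weighted scale to 5.0.
GRADE_POINTS = {
    "A+": 4.0, "A": 4.0, "A-": 3.7,
    "B+": 3.3, "B": 3.0, "B-": 2.7,
    "C+": 2.3, "C": 2.0, "C-": 1.7,
    "D": 1.0, "F": 0.0,
}
RIGOR_BOOST = {"standard": 0.0, "honors": 1.0}  # assumed boost values

def weighted_gpa(courses):
    """courses: list of (grade, rigor, credits) tuples."""
    points = sum((GRADE_POINTS[g] + RIGOR_BOOST[r]) * cr for g, r, cr in courses)
    credits = sum(cr for _, _, cr in courses)
    return points / credits

# A B- in an honors course counts as 2.7 + 1.0 = 3.7 weighted points,
# comparable to (though not identical to) an A+ in a standard course at 4.0.
print(weighted_gpa([("B-", "honors", 4), ("A+", "standard", 4)]))  # 3.85
```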

I don’t disagree with any of your observations. My kid is a lot more advanced than I was, and she studies very hard. She is also attending a highly demanding, top ranked school; and maybe I am using that to help justify her results.

However, this raises some questions.

  1. How does one know that students at “lesser” schools aren’t working harder and aren’t taking more challenging courses than peers at top, name-brand schools?
  2. How does one know whether a student attending a “lesser” school is less or more capable than those same peers? If college students are universally receiving similar high grades that fall within a very narrow band, it becomes more difficult to discern any difference. Prospective employers and admissions officers will have to rely even more heavily on the reputation of the university and not the individual student’s performance.

If everyone is receiving similar grades, there is less incentive to study hard and try to do better.
Is the US higher education system going to evolve to resemble the (stereotype of the) Japanese system, where in the end it doesn’t matter what you do during your college tenure, and only the name printed on the diploma really counts?


Fair questions, but I am not sure how they become any less difficult to answer with a wider spread in grades. A wider spread helps you better parse between students at the same university, but it still doesn’t inherently help you compare applicants from two different colleges.

Grad schools tackle this problem, at least in part, via standardized testing. Would it be feasible for the big players in tech to get together and design something similar for, say, graduating CS majors? I’m not sure how else you would address this, absent some sort of absolute system of subject grading across all colleges. And that does not seem realistic to me.

It’s standard for CS (and other tech) employers to include many technical questions within the interview, partially in an effort to evaluate technical ability. They do not just assume that a 3.x GPA from college A means he/she is an expert in the desired CS-related field and will be successful on the job. At some companies, it may be a full day of interviews with hours of tech questions before the hire. Some do a pre-screen with more basic tech questions over the phone before bringing the candidate in. When I graduated, I had interviews where the interviewer barely said “hi” before asking me to solve technical problems.

You can read example tech interviews on many online sites. For example, Glassdoor reports make it sound like Google sometimes gives CS applicants 4 rounds of tech/coding interviews and one round of soft-skills interviews.

When comparing resumes or applicants, most employers of new grads do not focus on either college name or college GPA, beyond a basic 3.0-threshold type of screen. For example, in the survey at https://chronicle-assets.s3.amazonaws.com/5/items/biz/pdf/Employers%20Survey.pdf , employers ranked the following criteria from highest to lowest importance for hiring decisions among new grads:

  1. Internships – 23
  2. Work Experience During College – 21
  3. College Major – 13
  4. Volunteer Experience – 12
  5. Extracurricular Activities – 10
  6. Relevance of Coursework – 8
  7. College GPA – 8
  8. College Reputation – 5

Depends on the “grad school”. Many professional schools (e.g. MD, JD) rely heavily on standardized testing. But PhD programs may not rely so much on them; note that GRE subject tests are only available in four subjects now, down from more in the past, presumably due to lack of use.

It is fairly common in the computing industry for interviews to be structured somewhat like standardized tests (within the employer, not across employers), where interviewers (or at least some of them) ask the applicants to solve various types of CS problems. This is presumably an attempt to more consistently measure technical skills, although technical skills are not the only criterion in hiring. But it also has spawned a “CS interview prep” industry (books, web sites, etc.).

In the course of interviews, we often give applicants problems to solve to test mastery, knowledge, and reasoning. My sense is that this has become quite common, but I think it has less to do with education and more to do with the fact that companies in general are less prepared to train employees.


I’m not trying to suggest that our grading system 20-30 years ago was better than what we have now. I’m just suggesting that change brings about different as well as unanticipated challenges.
One thing I noticed is that the recent change in the undergraduate admissions process has caused a huge increase in applications to highly competitive universities. In part, fewer high school applicants now feel that they are unqualified to apply to those schools.

Will a change in the undergraduate grading process, where the vast majority of students earn high grades, have a similar effect? I remember when I was a university student, it was not uncommon to hear from classmates that they were no longer considering applying to medical school or law school because their grades weren’t strong enough. It seems to me that in a school where 80% of the graduating seniors have a 3.7+ average, very few students will face that problem. If this trend spreads across the entire US college landscape, grad schools and employers will all receive an avalanche of “qualified” applicants. Given the finite resources available to devote to hiring or admissions, how will they decide whom to choose? Will it be some kind of “holistic method” like the one used for undergraduate admissions? Could there eventually be a public backlash against graduate programs and employers that exclusively choose applicants from prestigious schools, because applicants from the lower part of the higher-education pyramid are effectively and consistently shut out?

Professional schools may simply raise their GPA minimums, further incentivizing grade-grubbing and GPA gaming (avoiding hard courses, taking them pass/no-pass, etc.).

PhD programs already do holistic evaluation (but focused on achievement and research in the field of study, not myriad non-academic aspects), as do employers. Indeed, most employment hiring is generally far more subjective and far less of a semi-regulated semi-fair competition than college admissions is (nepotism / referrals / connections, the employment analogue of legacy preference in college admissions, is common and widely accepted / encouraged in employment hiring).

I take self-reported surveys with a grain of salt. If one looks at the incoming class profiles at Google and Apple, it does not appear that in practice they prioritize volunteer activities over GPA and college reputation in hiring.


I’m sure most Harvard students have a high cumulative GPA, but I very much doubt that 80% of the graduating seniors have a 3.7+ average. A newspaper survey is likely to overestimate GPA for a variety of reasons, including bias in which students choose to tell the newspaper their GPA, students listing a GPA on their resume that may be in-major rather than across all classes or rounded up, students exaggerating/lying, etc. A better source for a rough estimate might be the GPA thresholds for cum laude honors, which are given to roughly half the class. These thresholds were as follows. Note that I separate pre- and post-COVID in an attempt to separate the effects of remote learning.

Pre-COVID (2019) – 3.623
Post-COVID (2021) – 3.736

Among Ivy+ colleges, I’d expect Brown to have the highest cumulative GPA (see the earlier post about Brown’s use of S/NC), followed by Harvard and Stanford. Among non-tech private colleges, there is a general loose correlation with selectivity: the more selective the college and the greater the concentration of top academic students, the larger the portion of students receiving A grades. The highly selective private colleges that get the most discussion on this site generally have cumulative GPAs somewhere above 3.5 and below the Harvard numbers above - in the A- range.

Professional school admission is a contributing factor in why GPA tends to follow selectivity like this, rather than the average GPA at Harvard being the same as the average GPA at a less selective private college. Suppose a student interested in law school or med school was deciding between a scholarship at Sienna and a higher cost at Harvard, and was considering which college would offer the best chance at med school admission. GradeInflation.com says that in 2014 the average GPA at Sienna was 3.23. Harvard’s may have been a little under 3.6 during the same year.

I don’t think it’s obvious at which of these 2 colleges it is easier to achieve the desired high GPA for med/law school. Instead it likely depends on the particular student. Some students will likely find it easier to achieve the desired GPA at Sienna and others at Harvard. If Harvard instead had a 3.23 average GPA like Sienna, then choosing Harvard over Sienna would more likely be a disadvantage for students targeting med/law schools after college since it would likely be easier to rank in a higher percentile of the class when the class has a smaller portion of top academic students.

I take looking at incoming class profiles without considering correlation vs. causation with a grain of salt. For example, there are a lot of reasons why Google/Apple might be more likely to hire high-GPA students than low-GPA students besides focusing on the student’s GPA as listed on the resume. Even if Google/Apple went GPA-blind and did not know students’ GPAs, I’d still expect that hires would be far more likely to have a high GPA than a low GPA. I’d make similar comments about college name.

That said, I think some might be surprised by the college names that are most represented at certain tech companies. For example, LinkedIn suggests San Jose State is the most represented college at Apple, and #2 for jobs with “engineering” in their title. It’s not just the “grade inflation” colleges that have been the focus of this thread.


I’ll take that as your long-winded way of admitting that tech companies are as full of it about their stated priorities in hiring as they are about their stated commitment to diversity in employment.

San Jose State has 27,000 undergrads and is right next door so I don’t know why that would surprise anyone.

Just a reality check for those who seem to believe that GPAs are the be-all and end-all in who gets hired:

One datapoint. Just one, and for companies that ask (and not every company asks, or cares) it is used in the very early stage of the screen. It’s not as though you get to a decision meeting after a round of interviews and someone says “Wow, Julie showed a lot of creativity in her problem-solving” and someone else says “But Joshua had a 3.9 and Julie only had a 3.7 so clearly Joshua is smarter, let’s hire him”.

One datapoint, which if it is used, gets ignored once a decision has been made to progress to an interview. IGNORED. If it’s used, it’s as a screening device, NOT as a tool for making a decision.
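As a purely illustrative sketch of that kind of funnel (the 3.0 cutoff, the field names, and the scoring below are assumptions, not any real employer’s process): GPA acts once as a pass/fail gate, and the actual hiring decision afterwards never looks at it.

```python
# Hypothetical hiring-funnel sketch: GPA is only an early screen, then ignored.
GPA_CUTOFF = 3.0  # assumed screening bar

def screen(candidates):
    """Early screen: over the bar or not; the margin above it is never kept."""
    return [c for c in candidates if c["gpa"] >= GPA_CUTOFF]

def decide(candidates):
    """Final decision uses only interview performance; GPA plays no role here."""
    return max(candidates, key=lambda c: sum(c["interview_scores"]))

applicants = [
    {"name": "Julie",  "gpa": 3.7, "interview_scores": [9, 8, 9]},
    {"name": "Joshua", "gpa": 3.9, "interview_scores": [7, 7, 8]},
]

# Both clear the bar; Julie is hired on interview performance despite the lower GPA.
print(decide(screen(applicants))["name"])  # Julie
```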

Re: tests, case interviews, etc.- I’ve worked for many companies which have used them, and like any other tool, there are positives and negatives. When developed and used correctly they are often highly predictive and are an effective assessment tool. When used incorrectly, it’s like the old “garbage in, garbage out”.

But companies which develop their own assessments are MUCH more confident in those results than in GPA, which is subject to all sorts of things (not just grade inflation, but also selective cherry-picking of classes; the phenomenon of kids retaking subjects they already took in HS is rife at some colleges). I’ve administered all sorts of tests which “qualified on paper” candidates could not pass. But again - passing is passing. Nobody at the final round of interviews asks, “How did he score on XYZ?” It’s a bar - and if you’re over it, you’re over it.

My very first job out of B-school a million years ago was at a company which administered their own test. I was sure I had flunked the math-- and was thrilled to get the call for an interview (so clearly I had passed). I asked someone in HR after I started, “By the way, how close a call was it?” and she laughed- passing is passing. What happens AFTER that- three rounds of interviews-- is what matters once you’re over the bar.

Just so you guys don’t think that a grade-inflated college graduate gets to rest on his or her laurels for their entire existence. One datapoint.


I’m not sure how you’d draw that conclusion from the posts. The survey numbers from the earlier post are an average of hundreds of employers from a variety of fields – not just tech employers. However, when filtered for only science + tech companies, the order was similar, with tech employers marking internships + relevant work experience as by far the most influential on average, and GPA + college reputation among the least influential evaluated criteria.

I would not assume that a particular tech employer seeming to have a lot of high-GPA hires, or a lot of hires from a particular college, means tech employers as a whole are “full of it.” Before drawing such a conclusion, you need some degree of control for correlation vs. causation, to better judge whether the GPA/school name is the primary reason for the job offer, or whether GPA/name is primarily correlated with something else the employer values, such as having the desired technical skillset to be successful on the job, or performing well in the long series of tech interviews. You’d also need to control for how many apply from that college, since some schools have far more applicants to particular tech companies than others, as in the San Jose State → Apple example.

One way to review might be to compare employment results for similarly selective colleges whose grads share similar individual characteristics but whose average graduating GPAs differ notably. For example, one might compare Harvard vs. Princeton during the period when Princeton had a max 35% A policy and a 3.2x GPA average. Were tech companies more likely to hire Harvard CS grads than Princeton CS grads during this period? Did Princeton’s tech employment outcomes notably change during the lower-GPA period? Regarding the latter question, large differences do not stand out to me in the available outcomes report for CS grads. For the 2 companies you listed, the portion of Google CS hires was larger in the lower-GPA period than in the higher-GPA period. The number of Apple CS hires was quite low in both periods. I’d expect law/med schools to show a different pattern, with GPA being more likely to influence outcomes.