<p>Let’s be honest, there are two factors at work here:</p>
<p>1) Software engineering is a rapidly changing field where, like medicine or law, if you don’t keep up on your continuing education you become unemployable after a certain number of years. Whether that happens is largely a matter of your own abilities, and whether you keep them current.</p>
<p>2) Computer programmers are the whiniest bunch of drama queens on the planet. They want to do everything their way and only their way. They are highly averse to learning new things that clash with their preferences. A UNIX/Linux fan would rather walk across hot coals than learn .NET/VS/TFS/anything MS; somebody who loves Java and C# would rather die than bother with C. From the IDE to the versioning software to the compilers, OSes, languages, languages, languages, engineering methodologies (component-based vs. object-oriented, design patterns, “best practices” debates, etc.), we are stuck in our ways and do not like to change. If there is heavy market demand for C# programmers who are skilled in TFS versioning but you are a diehard MS-hater or an old-school OOP-hater, you won’t bother to learn those skills.</p>
<p>I haven’t graduated yet, but so far I’ve done programming projects that would not have been my choice, and learned languages and other technologies that I would have preferred not to. But I’m glad I did. Given my own choice, I’d only ever work in C/C++/Python on *nix systems, and yet I’ve done LabVIEW development as well as C#, XNA, TFS, and .NET in my classes or research. It taught me that it’s not a waste of time to learn a language, technology, or methodology even if I’ll never use it again after that class/position/job/etc. It also taught me that learning things outside my comfort zone (both as an MS-hater and as a person skeptical of graphical programming languages) isn’t so bad and can still include the types of programming challenges I enjoy.</p>
<p>But I had these things forced on me. Had these been potential jobs to apply for rather than classes or necessary research stints, I would have passed and only stuck with what I was comfortable with.</p>
<p>Don’t you think that when OOP became the norm in industry, a lot of old-school programmers who didn’t want to throw themselves into OOP had trouble finding work? I’ll bet the OOP revolution alone accounts for a lot of “ageism” anecdotes. Now, if so many programmers are set in the ways of their generation (I have a CS prof who still likes goto statements), I don’t imagine our generation or any other generation of programmers would be that different. It may not be OOP; it may be a language or something else.</p>
<p>But the point is that if a sufficient fraction of programmers don’t continue to educate themselves, then hiring managers will notice the trend and <em>will</em> use age as a first-order filter for getting rid of applicants.</p>
<p>This analysis ignores the actions of HR offices, which are pure evil and should be completely separated from the hiring process.</p>