<p>If you search for explanations of the dismal state of the job market, you will undoubtedly stumble across a number of reasons why recent graduates are having a tough time finding jobs. However, nearly all of them look at only one facet of the underlying problem, which is somewhat short-sighted. While each is part of the bigger picture, we need to view the issue in its entirety before we settle on a plan of action. In this way, we can develop a holistic approach to dealing with the problem. The complexity of the issue should not be underestimated, however; it is an unintended by-product of a number of social and economic issues, many of which are unrelated and will have to be dealt with separately. So here goes. </p>
<p>First, let’s look at the social and demographic side of things. Over the last few decades (truthfully the last century), the population has been growing rapidly. Since 1970, the US population has grown from roughly 200 million to over 300 million, and the world population has more than doubled. For the better part of three decades, business was able to keep up (with some relatively minor economic hiccups). Jobs were being created at a fast enough pace, and the economy was generally healthy. Even into the new millennium America was experiencing one of its most prosperous periods in history (albeit growth propped up in part by the housing bubble and by collateralized debt obligations that repackaged subprime mortgage debt into securities, many rated AAA). But I digress. As the population soared, so did technological advancement, and the information age was being born. People were exchanging information and ideas at a rate like never before, and all seemed well in paradise. One of these ideas was that people who held degrees were earning more. Salary reports suggested that college graduates could earn upwards of one million dollars MORE over a lifetime than their counterparts without degrees. And so the mad rush for education began. </p>
<p>There has also been a broad, general consensus that knowledge is a right. I am a proponent of this school of thought. If an individual chooses to be educated, they should be free to pursue that education. This belief, combined with the societal push towards college education, resulted in a huge surge in the number of colleges across the nation to accommodate the influx of students. Traditionally, colleges existed to educate students, not to place them in jobs. With the salary reports, however, the perception shifted, and people pursued degrees specifically to attain higher-paying jobs. There are a couple of notes of interest on this point:</p>
<p>1) Inflation in tuition – As tuition prices increased (because of budget cuts at the federal and state levels), there was more pressure to land higher-paying jobs. Additionally, many state-sponsored scholarship programs bottomed out, leaving students to cover the higher costs themselves. The result was a significant increase in the real cost of attaining higher education.</p>
<p>2) Flooding of the job market – As more people attained degrees, there was obviously far more competition for jobs requiring degrees. As a result, many people have gone, or are starting to go, back to school to pursue advanced degrees in order to distinguish themselves from the competition. I have seen the effect of this myself. Entry-level job postings often require a bachelor’s degree, with a master’s STRONGLY PREFERRED. This is the common language. (I have also seen cases where they state outright that they will only accept people with a master’s.) A friend of mine was beaten out for a job as a lab technician (entry level – not particularly technical work) by a guy with a master’s degree in Materials Engineering. On another note, this perpetuates the problem listed above. Grad school isn’t cheap. This simply increases the actual cost of attaining a job requiring a degree. (So at what point is it no longer economical to pursue a degree? Eventually there’s a point where the opportunity cost of a five- to six-plus-year degree, combined with lower starting pay, no longer justifies it as a viable option; a rough break-even sketch follows this list.) It should also be addressed that when the economy buckled in 2008, many people lost their jobs. This flooded the job market with experienced employees, making it harder still for recent graduates. </p>
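<p>To make that break-even question concrete, here is a minimal back-of-envelope sketch. Every figure in it (years of school, tuition, foregone wages, and the salary premium) is an assumption picked purely for illustration, not data from any report.</p>
<pre><code>
# Rough break-even sketch for the "is more school still worth it?" question.
# All figures below are placeholder assumptions, not data.

def years_to_break_even(years_in_school, annual_tuition, foregone_salary, salary_premium):
    """How many working years does it take for the salary premium to repay
    tuition plus the wages given up while in school? (Ignores interest,
    taxes, and raises to keep the arithmetic simple.)"""
    total_cost = years_in_school * (annual_tuition + foregone_salary)
    return total_cost / salary_premium

# Example: a 6-year path (bachelor's + master's), $12k/yr tuition,
# $25k/yr in wages not earned while studying, and a $20k/yr premium
# over the no-degree alternative.
print(years_to_break_even(6, 12_000, 25_000, 20_000))  # ~11.1 years
</code></pre>
<p>Swap in your own numbers; the point is simply that as the degree takes longer and the starting premium shrinks, the break-even horizon can stretch past a decade.</p>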
<p>Alright, now let’s shift focus to the business and financial side of things. (Keep in mind I am not trying to step on anyone’s toes.) The point of primary importance here is that courts have ruled, in a couple of cases, that corporations are obligated, above all else (and this includes the well-being of the community as a whole), to serve the interests of their shareholders (see (1) Dodge v. Ford and (2) eBay v. Newmark). While there is a fair amount of debate as to the validity of this idea, it still has precedent in court and is enforceable by law. If a company chooses to be philanthropic, shareholders can contest that it did not represent their interests. As a result, companies are required to seek ways to maximize profits and minimize expenditures. (This is also, perhaps, a fundamental goal of free-market capitalism. It’s based on competition and maximizing profits. Nobody’s fault, and quite simply, it is human nature to want more... not always, but mass psychology and consumerism suggest this is the case for the majority.) Since corporations are large companies, they provide substantial job markets for graduates. Therefore, this corporate responsibility carries huge implications for their employees, explained below. </p>
<p>I have heard MANY members of older generations state, on multiple occasions no less, that you simply need the degree to make yourself marketable, because it demonstrates your ability and willingness to learn (I seriously get sick of hearing this, especially now that I have a degree). This may have been the case in the past, but many companies no longer accept it as their responsibility to train individuals for the job. They want employees who can start on day one at maximum productivity. However, universities were never intended to provide “on-the-job” training. This creates a huge disconnect, where graduates cannot meet the “entry-level” requirements posed by companies. So how does this relate to corporate profits? Well, a new hire is often a fairly significant investment on the part of the company. They often contribute in a limited capacity for some amount of time (this will obviously vary). With the spread of specialized software into nearly every field, entry-level jobs often depend on an individual’s aptitude with a given technology. New hires are, more often than not, expected to be able to crunch numbers, organize data, model designs, etc. (simply put, to know how to use certain software). This is now a major portion of the “busy-work,” the entry-level grunt work. In order to reduce operating costs, many companies are choosing to outsource it. They can “buy” workers from other countries to complete these tasks at pennies on the dollar compared to their American counterparts. While this may seem attractive for short-term profit, the companies may not have considered the longer-term effects. In this situation, neither the outsourced employee nor the American reject is exposed to the workings of the business. Neither gains the experience in more advanced job responsibilities that comes with time in a workplace. Thus, fewer employees are being developed for management or more skilled positions later, and we are facing an enormous shortage of skilled workers in a number of fields. (Here, I am speaking from an engineering perspective, as my degree is in civil engineering.) Another point of interest is that multinational corporations (again, often the biggest companies with the most jobs to offer) are subject to double taxation in the United States. First, America has one of the highest statutory corporate tax rates in the developed world. Then, when profits are distributed to shareholders at year end, they are taxed a second time. It seems to me that this would encourage activity outside of the United States. (I welcome informed debate on this one as, admittedly, my knowledge base is somewhat lacking.) </p>
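<p>For what it’s worth, here is a minimal sketch of how the two layers of tax stack on distributed profits. The 35% corporate rate and 15% dividend rate below are assumptions chosen purely for illustration, not a statement of current law.</p>
<pre><code>
# Sketch of how corporate and dividend taxes stack on distributed profits.
# Both rates are placeholder assumptions, not current law.

corporate_rate = 0.35   # assumed tax on corporate profits
dividend_rate = 0.15    # assumed tax shareholders pay on distributions

profit = 100.0
after_corporate = profit * (1 - corporate_rate)          # 65.00 left to distribute
after_dividends = after_corporate * (1 - dividend_rate)  # 55.25 reaches shareholders

combined_rate = 1 - (1 - corporate_rate) * (1 - dividend_rate)
print(f"Shareholders keep {after_dividends:.2f} of every {profit:.0f} in profit")
print(f"Combined effective rate on distributed profit: {combined_rate:.2%}")  # 44.75%
</code></pre>
<p>Under these assumed rates, the combined bite is roughly 45%, which is the arithmetic behind the “double taxation” complaint.</p>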
<p>There is also a darker side to the story. It should be recognized that as companies have cut labor costs to stay profitable, some employees have been pressuring HR and management to hire in order to mitigate the increased workload. Employers then post jobs they have no intention of filling. This inflates the number of jobs that are “available.” When asked why they haven’t filled these positions, the common answer is that there were no qualified candidates. Not true.</p>
<p>**** ASIDE – I still encourage you to read it, but it’s not necessary. ****
I want to make a quick side note here. Companies often have unrealistic expectations of interns as well. In two separate cases, I was expected to learn a computer program from scratch and contribute to a project. After my freshman year, I secured an internship in robotic systems development. This was my last choice out of all the available departments, but they placed me there anyway. I was expected to know or learn C++ in order to look through some of the compiled programs. As someone with no experience in programming, this was hilariously futile, and I spent my days frustrated, poring over C++ manuals and going nowhere. I also didn’t receive any one-on-one instruction. During my second internship, I was expected to learn 3DS Max and help with some modeling and animation work. Again, I was given no instruction, and it was left to the other interns, who had spent their last four summers teaching themselves the program, to provide me with the background I needed. Maybe some of you have had better experiences with internships, but I am sure there are also some who can identify with what I’m saying. The result? Well, companies often look to internship experience to see if you developed the skills to meet entry-level requirements. So, everyone wants you to already know the software, but training for some of these programs (usually hosted by the company that writes the software) can reach into the thousands of dollars. It would take very enterprising, dedicated individuals a lot of time to teach themselves.
**** END ASIDE ****</p>
<p>And this brings up another point. Companies want to increase productivity by implementing state-of-the-art technologies engineered in the last couple of decades. However, training is lacking, and much of the time management has little appreciation for the intricacies of such software because they have never used it, or, if they have, the training courses were paid for by the companies they worked for. Talk about a disconnect! We are now seeing some universities implement courses that tackle these software applications, but often the course material is not advanced enough to come close to mimicking job requirements. </p>
<p>But the impact of technological advancement does not stop there. (Mind you, this is not an anti-technology rant. When implemented correctly, and when employees have the proper training, technology can greatly bolster productivity.) As technologies are developed to address specific challenges, jobs become increasingly specialized (e.g., how many different programming languages exist to complete different tasks?). “But wait!” you say. “This should create more jobs!” Possibly, but specialization requires young people to make decisions early, and to stick with them through their entire college career, in order to have the time to fully develop the skills necessary for an entry-level position. This is simply not a realistic expectation. As a first-hand example, there are many divisions of civil engineering (structural, water resources, transportation, land development, geotechnical, etc.). Each division has its own set of design challenges, and as a result, different software applications have been developed to aid engineers within each division. And if you think this is limited to engineers or software developers, think again. I have spoken to finance, mathematics, and economics majors who wish they had training in databases and other software (obviously aside from Microsoft Excel, which is pretty standard). And this is still not the full scope. Most of the time, as stated, training is limited both within the school curriculum (but then again, the school’s job is to provide the knowledge base, not the job skills) and on the job. Students are therefore left with the responsibility of learning the software on their own. Since software licenses often cost a substantial amount, students can typically only use the programs while they are covered by the school’s academic license. But if a student has to work while going to school in order to help cover increased education costs (or living expenses), good luck finding the time. And forget about a social life, unless you don’t mind sacrificing grades a bit. And we wonder why depression is on the rise amongst college students. </p>
<p>So what can you do to safeguard yourself against all this? Damn good question. Primarily, figure out what you want to do early in life, and pursue the hell out of it. Research entry-level job requirements and start familiarizing yourself with what you need to know. And stay current; software is constantly evolving. I think this is your best bet. Keep in mind, however, that some factors are simply out of your control, and don’t get discouraged. The importance of networking also cannot be overstated. My one regret is that I didn’t network as well as I should have. I’m not a particularly social person, and it’s always been the hardest part of the whole professional development scheme for me. But it is important all the same. And networking isn’t just a psychological thing. Of course you want to make yourself likeable and find someone in a company to vouch for you, but it’s more than that. It’s essentially a pre-screening that carries important productivity implications as well. The happier people are in the workplace, the more productive they will be, and a lot of that happiness comes from good relationships with co-workers. If the company thinks you will be liked, and you have the recommendation of a current employee, they will be more likely to hire you on the grounds that you will get along with other employees. Overall, this contributes to a better workplace environment, so stay positive and meet people in your field! </p>
<p>On a lighter note, I thought I’d include some of the worst advice I’ve seen with regard to weathering the job market and staying ahead of outsourcing:</p>
<p>1) Develop invaluable skills, such as software skills (refer to my previous arguments), bilingualism (This is simply not practical. It takes a huge amount of time.), or ability with rare equipment (if it’s rare, you probably don’t have access to it, and it’s probably not in high demand).
2) Move quickly up the ladder. (First, this assumes you have a job. Second, it offers no practical advice on how to do it. Third, your company had better be opening higher-level positions for this to happen.)
3) Try to get hired by smaller companies. (Honestly, this one is okay. But it’s far more difficult to find small companies that will hire entry level, or that are even looking for new hires, because they simply don’t have the budget – especially when business is slow.)</p>
<p>As far as what needs to be done, there are clearly some policy changes that need to take place, but I have no idea which ones, and I’m not sure many people do. We need to better clarify the roles businesses and universities should take in the professional development of employees. Universities have career resource centers as well, but with budget cuts these are often the first departments to go, despite the dramatic increase in tuition (which more than doubled between my freshman and senior years). Additionally, I have heard and agree with the following argument: we need to reshape our perception of trade jobs. Generally, they don’t get the same kind of respect that jobs requiring degrees do, but they play just as important a role in our society. And since nearly half of all college graduates are now working jobs that don’t require a degree, wouldn’t it be beneficial to encourage some of our students to pursue these kinds of jobs outright rather than pursue a four-year degree and fall into debt first? At the same time, however, a hurting economy limits the number of these jobs available as well. Not to mention that technological advancements, specifically in the field of robotics, have done away with some of these jobs, as well as jobs in unskilled labor. I read an interesting article on the morality of technology (with respect to its reducing the number of available jobs), but alas I cannot find it now. I didn’t agree with quite a bit of it, but the fact remains that workers are being displaced by robots. Look at assembly lines and developments in machining. Look at the self-service checkouts in your local stores, or the increased use of robotics in processing plants. These are just a few examples. Ultimately, the accuracy and efficiency that robots provide can displace human labor wherever human judgment isn’t strictly necessary. Robotics will never displace humans entirely, but with increasingly advanced innovation, we are still seeing the effects. It takes far fewer people to troubleshoot the robots than it would to make up the man-hours those robots work. Maybe the growth we experienced is simply unsustainable. I don’t really know, but something needs to be done. We have a lot of hard questions that need to be answered, and it’s time we find those answers.</p>