<p>Don't get me wrong, I'm wholeheartedly planning on attending the best university I possibly can, but I'm curious: how important do you really think it is in life? In any aspect whatsoever? Why, and how detrimental to our lives is it?</p>
<p>Is this a serious question?</p>
<p>Yes. </p>
<p>I was thinking about it in terms of money and the traditional view of success. But what if that isn’t the kind of person you are (I am, but this is just one of those odd thought processes), yet you’re still highly driven, just not in the way college normally rewards?</p>
<p>Our society needs lots of people to work lots of jobs, and not all of them require a college education. Plumbers, hairdressers, auto mechanics, construction workers…I could go on and on. These are all respectable occupations, and they do not require a college degree. I would not want to live in a world without plumbers and hairdressers, and I think they are vital to our society.</p>
<p>Not completely true: you need some schooling to be a hairdresser or an auto mechanic. There are even degrees now for construction.</p>
<p>If you’re the kind of person who is highly motivated and can learn on your own, then you don’t need college.</p>
<p>College is just a convenient place to learn things. If you can do this by yourself (keeping in mind that you’ll need some sort of educational agenda and resources to study with), then you’re set for life.</p>
<p>However, most employers won’t hire you for a position without a college degree, so if you were ever to take this path, you’d have to be a self-starting, incredibly lucky, private-business-minded person.</p>
<p>Actually, my dad is an auto mechanic who knows everything about cars. He didn’t go to college at all.</p>
<p>Of course, he doesn’t make that much money, but he’s been doing it for over 30 years so he can’t really pack up and start something new.</p>