What exactly will I gain from a business major?

<p>I'm not really sure what I want to do as a career, but I know for sure that I do not want to be in the corporate world. So a normal business degree, and the normal route one would take with it, are not for me.</p>

<p>But I'm still considering a double major in business and a liberal arts field (anthropology, urban studies, or something similar). I'll probably be running a business someday in whatever field I end up in, so I thought having two degrees, one of them in business, would help me run a company when I get to that point.</p>

<p>Any thoughts? Will a business degree teach me how to run a business, or just how to work for one?</p>

<p>It would be a waste of time. By the time you start your own business, you will have forgotten most of what you learned… most of which wouldn't be applicable anyway.</p>

<p>VectorWega was obviously kidding, or being sarcastic about you asking a question like that.</p>

<p>Obviously, if you major in business, then you are going to learn about business, and it will be applicable to any career you go into. If you plan to start your own company, then majoring in entrepreneurship would be the best of the available business majors for you. Entrepreneurship is the study of how to start and run one's own business.</p>

<p>It could help if you’re moving into management positions in your field, but 20 years down the road and starting your own business? VW is right.</p>

<p>If you don't know the answer to that question, don't waste your time majoring in business… good luck in economics.</p>

<p>Working on a farm is always an option.</p>

<p>If you want to learn how to run a company, then I suggest a concentration in entrepreneurship.</p>