<p>Seriously, it feels like every day my parents say something to me about college or my future that just drives me nuts.</p>
<p>They give me suggestions on what career path to pursue and tell me to start researching it. I do that, and then a few months later they say crap like "You know, maybe you should look into this instead..." and that completely throws me off, because I haven't looked into any schools for that.</p>
<p>My whole life, I've pretty much thought I would do medicine or dentistry, seeing as my dad is a dentist and basically 90% of our circle of friends are dentists or doctors.</p>
<p>My parents then start throwing out ideas like optometry. I do a little research; it doesn't interest me much.</p>
<p>I still thought I was going to major in Bio, and then a few days ago my parents are like, "You know, medicine and dentistry aren't worth it. You should go into business."</p>
<p>Do they think I have no opinion of my own? Am I just going to do what they want me to do? Chances are yes, because they're paying for my education and don't give a **** about what I say. If I ever argue with them, they just get ***ed off and we end up in a fight.</p>
<p>Business doesn't interest me. I don't care if you can make more money doing it at some point; I've always enjoyed the sciences. Just because I major in Bio doesn't mean I'm going to become a doctor; I still have other options. It's not like I'm confined to medicine.</p>
<p>Sorry about the ranting, but it just annoys me. My parents are now telling me to apply to business schools that I KNOW I won't get into. Uh, maybe you should have told me about this during the summer so I would have been prepared and studied more for the SATs?</p>
<p>okay im off too bed, just needed to get that out ;o</p>