<p>Well, I guess it depends on your goals. If you’re going to med school just because you know “doctors make big money,” then that’s great for you! However, I would prefer that my doctor became a doctor so that he could, oh I don’t know, “save lives, help people, etc.” Of course everyone thinks about future income and whatnot, but many social science degrees are designed to bring more well-rounded perspectives into the world. Despite what you might believe, the social sciences are just as important to this world as math. With a social science degree you could work for a government agency and earn a hefty salary. You could teach at the college level and make $150k easily. However, that shouldn’t be the only reason you went to college and pursued your degree. Yes, we all go to college so that we don’t have to work at Baskin-Robbins forever, but I truly feel that most people go to college because they have a genuine interest in a field. Besides, if you’re just worried about money, then it doesn’t matter what degree you get. In our capitalist nation you could be a philosophy major and head a bank one day; it’s all relative. Or, as the poster above said, many social science majors prepare students well for law school and graduate school. Stop being so elitist.</p>