<p>I know this has probably been asked many times, but my advisor told me that these days it's really necessary to major in a science. He said there was a time when med schools wanted diversity and liked applicants who majored in music, history, or other non-sciences, but that the trend has since reversed and med schools now want to see a science major again.</p>
<p>Is this ridiculous, or is there truth to it? I go to a top university, so my advisor is not some moron; he must be getting this argument from somewhere rather than pulling it out of his ass, so I'm a little confused.</p>