I just graduated from college in May and can't seem to get a job in my field, which is advertising. In the meantime, I am trying to gather some information and put some articles together. I would like to hear the voices of college students and recent grads. Anyone who has thoughts on the following questions, please feel free to elaborate. Thanks.
Do we need a degree to tell us we are educated?
Did college forget to inform us that an internship was a MUST not an OPTION?
Do women have a harder time finding a job than men?
Is college one of life's greatest investments or a mistake?