<p>Florida may be geographically farther south than the other states, but culturally it is not part of the Deep South. Since your question presumably concerns cultural influences rather than geographic ones, UF is certainly not a Deep South university for your purposes. I currently live in New York (as a Southern transplant), and while New Yorkers and Northeasterners in general consider the other states on your list to be Deep South, Florida is decidedly not.</p>
<p>I have to agree that this question is a little ridiculous. All of the big state universities on your list are well-known, reputable schools that send thousands of graduates to grad, med, and law schools every year. Yes, it’s absolutely fine to attend college in any of those states and still have a life afterwards. After all, the South has grad, law, and med schools too, along with doctors, attorneys, and professors. We’re not all peach farmers or rodeo cowboys.</p>