(June 6, 2016 at 5:01 pm)Maelstrom Wrote: Why is it that Florida is never particularly considered The South when Florida is literally farther south than every other place that considers itself Southern?
Eh, Texas is sometimes considered the West and sometimes the South, even though we're in the middle... It's just a culture thing. In Texas we have a Southern way of life, while from my very brief trip to Disney World I assume y'all might not have biscuits and gravy ;p Naw, I actually don't know what the culture is like there.