I live in New England, and everyone and their mother thinks Florida is the greatest place on planet Earth. It's humid, flat, and sticky, but it has beaches and warm weather. I understand it's a relatively short flight, but with the way prices to Florida are these days, it's often not much cheaper than going to a Caribbean island or even Spain!
I just don't get it and never will. Please enlighten me if you're a Florida lover.