I have never heard of Florida getting a bad rap outside of the OG. New Jersey, yes. Some rural states for being backward, and Alabama et al. ... well, you know.
But Florida (which I have visited a couple of times over the years) is known internationally for beaches, Seinfeld's parents and other seniors, hot women with bolt-ons, Miami Vice (yes, I'm old), and to some extent Scarface-type Cuban stuff, I guess.
What's with the Florida hate? Does it really suck?