In the Winter Months, Florida Has the Best Weather in the USA

Tourist season in South Florida runs from November through April every year, as visitors flock south to enjoy some of the best winter weather in America.