Is Health Insurance Mandatory In Florida?
Bill Loughead, President, SummitMedigap.com (CO, FL, GA, MI, NC, SC & TX): Yes, health insurance is mandatory in Florida; going without it puts you at risk of paying a fine. The Affordable Care Act requires people to have health insurance by January 1, 2014, or pay a penalty. This rule applies not just to Florida but to all states. An independent insurance agent should be able to show you plans from most of the major carriers and help you pick the plan that fits your needs. Several independent agencies (like ours) have websites where people can instantly compare health insurance plans online and then ask questions of experienced licensed agents over the phone.
Answered on November 23, 2013