When Did It Become Mandatory For Car Insurance?
Mark Bartlett CLCS, Branch Owner, TWFG Insurance Services, Fremont, California and the Greater Bay Area, Representing Dozens of Insurance Carriers

The first mandatory auto insurance law was enacted in 1927 in the state of Massachusetts. Shortly after, most other states began mandating auto insurance. A lot has changed in the laws over the years as they relate to financial responsibility, but mandatory auto insurance has been around for a long time. Insurance is state regulated, not federal. A couple of states still do not mandate auto insurance if you can demonstrate the financial means to cover that state's minimum amounts for bodily injury and property damage liability. Of course, you might as well put a sign around your neck that says "please sue me, I have money."

Answered on June 12, 2013