Why Did Employers Start Offering Health Insurance?
Jim Winkler, CEO/Owner, Winkler Financial Group, Houston, Texas (answered October 13, 2014):
That is a great question! Employers reached a point where offering a decent wage alone was often not enough to attract and keep a good workforce; employees would leave for whatever job paid more. So employers added benefits to go along with the wages, in an effort to sweeten the pot. Paid vacations and health insurance were no-brainers: if you are rested, happy, and healthy, you will work better and, hopefully, keep working longer. Good health insurance is still a major draw for many employees, especially those with kids. Thanks for asking!