Answer:
Most companies are not required by law to offer health insurance to their employees, though many do.
When an employer does offer health plans, it should explain the coverage options available (which may include health, disability, dental, and life insurance).