2nd Dec 2009 by Gary
In most states, the law requires a company that hires an employee to cover that employee with workers' compensation insurance. Having what is commonly called 'workman's comp' is definitely a big plus, as it covers the medical costs of an injury you suffer while doing your job.
You should ask your employer to confirm that you are covered by workman's comp, because some employers don't carry it and, in case of injury, won't pay the medical costs associated with that injury. If you are injured on the job and your employer doesn't have workman's comp for you, a number of states have uninsured employers' funds to help pay benefits to those workers. These cases are rare, but they can and do happen. If you are an employer, you need to check with your particular state government to learn the requirements and costs of workers' compensation, as you can receive a stiff fine if you aren't in compliance with the law.
This answer is the subjective opinion of the writer and not of FinancialAdvisory.com