If you are injured at work, notify your employer immediately. An employer who is never told about the injury cannot document it or support your treatment. Keeping your employer updated on any changes in your condition also helps them arrange your return to work.
If you stay silent, your employer may later learn of the injury from a third party, which can complicate or jeopardize your claim. Reporting the injury is not only about protecting your job; it also gives your employer the information needed to improve safety practices and stay in compliance with health and safety regulations. If you are unsure whether or when to report an injury, consult an experienced workers' compensation lawyer to learn about your rights and options.