Is Health Insurance Mandatory?
Health insurance is not mandatory at the federal level. It was once required by federal law, but that changed in 2019. Here's what happened.