November 16, 2009 6:03 PM

Should employers be required to offer health insurance?

Requiring employers to offer most workers health insurance has long been seen as a crucial piece of Democratic efforts to overhaul the nation's health care system, but legislation that the Senate is expected to consider soon is unlikely to include any such mandate.
