Wal-Mart backs health-care mandate

In a major break with most other large companies, Wal-Mart Stores Inc. on Tuesday told the White House that it supports requiring employers to provide health insurance to workers, a centerpiece of President Barack Obama's push for near-universal coverage for Americans.