
New York regulators are calling on Minnetonka-based UnitedHealth Group to either stop using a company-made algorithm or demonstrate that it is free of the significant racial bias researchers documented in a recent study.

This month in the journal Science, researchers reported that an algorithm UnitedHealth Group sells to hospitals for assessing patients' health risks assigned comparable risk scores to white and black patients even when the black patients were considerably sicker.

That's a problem, researchers reported, because the risk scores made white patients more likely to receive care management services intended to control costs while improving patient outcomes. They said the problem likely exists in other data analytics programs, too.

"We call on you to immediately investigate these reports and demonstrate that this algorithm is not racially discriminatory or to cease using Impact Pro (or any other data analytics program) if you cannot demonstrate that it does not rely on racial biases or perpetuate racially disparate impacts," wrote Linda Lacewell, superintendent of the New York State Department of Financial Services, and Dr. Howard Zucker, the state's health commissioner, in an Oct. 25 letter to UnitedHealth Group chief executive David Wichmann.

Researchers said UnitedHealth Group's product, Impact Pro, is one of the largest commercial risk-prediction tools used by health care providers. In a statement, UnitedHealth Group said researchers validated that "the cost model within Impact Pro was highly predictive of cost, which is what it was designed to do."

The program uses analytics from more than 600 clinical measures, the company says, to identify gaps in care.

"These gaps, often caused by social determinants of care and other socio-economic factors, can then be addressed by the health systems and doctors to ensure people, especially in underserved populations, get effective, individualized care," the company said.

UnitedHealth Group is Minnesota's largest company. It operates UnitedHealthcare, which is the nation's largest health insurer, and Optum, a fast-growing division for health care services. Impact Pro is sold by Optum, based in Eden Prairie.

In study materials, researchers wrote that hospitals in recent years have started using programs like Impact Pro to help them handle new contracts with health insurers. Those contracts pass financial risk for the quality and cost of care to doctors and hospitals in order to "align the incentives of hospitals with the incentives of society, around reducing costs," researchers wrote.

Impact Pro helps hospitals target patients who are most likely to benefit from high-risk case management services, researchers wrote. The assumption is that the cost of a patient's past health care use will help predict future health care needs.

But researchers say there's no adjustment for well-known factors in the health care system that help explain why black patients often get less treatment for health problems.

"We show that a widely used algorithm, typical of this industrywide approach and affecting millions of patients, exhibits significant racial bias: At a given risk score, black patients are considerably sicker than white patients," researchers wrote.

"The bias arises because the algorithm predicts health care costs rather than illness, but unequal access to care means that we spend less money caring for black patients than for white patients," they added. "Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise."

Remedying the disparity would mean more black patients might be directed toward help through the care management programs, researchers wrote. One solution, they say, is for algorithms to look at more than just future cost when assessing which patients might benefit most from high-risk case management services. Doing so in one example, researchers wrote, generated "an 84% reduction in bias."
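The remedy researchers describe amounts to changing what the algorithm is trained to predict, not what data it sees. The sketch below extends the same made-up setup; the feature set, the 90th-percentile outreach cutoff and the illness label are illustrative assumptions, not details taken from the study or from Impact Pro. It compares which patients would be flagged for outreach when the training label is future cost versus a direct measure of illness.

```python
# A minimal sketch, under the same made-up assumptions as above, of
# swapping the training label from future cost to a health measure.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 20_000
race = rng.choice(["black", "white"], size=n)
conditions = rng.poisson(2.0, size=n)                 # illness, same in both groups
access = np.where(race == "white", 1.0, 0.6)          # hypothetical unequal access
prior_cost = 1_000 * conditions * access + rng.normal(0, 500, size=n)
future_cost = 1_000 * conditions * access + rng.normal(0, 500, size=n)

# Same inputs either way: past spending plus recorded chronic conditions.
# Only the label the model is trained to predict changes.
X = np.column_stack([prior_cost, conditions])

def share_flagged(label):
    """Fit a model to the given label and see who lands in the top risk decile."""
    score = LinearRegression().fit(X, label).predict(X)
    flagged = score >= np.percentile(score, 90)
    return {g: round(float(flagged[race == g].mean()), 3) for g in ("black", "white")}

print("label = future cost:    ", share_flagged(future_cost))  # white patients over-selected
print("label = chronic illness:", share_flagged(conditions))   # roughly even selection
```

In this toy setup, switching the label flags black and white patients at roughly equal rates, mirroring the direction, though not the magnitude, of the reduction in bias the researchers report.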

"It should be emphasized that this algorithm is not unique," researchers wrote. "Rather, it is emblematic of a generalized approach to risk prediction in the health sector."

In their letter, New York regulators said the research showed how the Optum data analytics program "significantly underestimates health needs for black patients."

"By relying on historic spending to triage and diagnose current patients, your algorithm appears to inherently prioritize white patients who have had greater access to healthcare than black patients," the regulators wrote. "This compounds the already-unacceptable racial biases that black patients experience, and reliance on such algorithms appears to effectively codify racial discrimination as health providers' and insurers' policy."

Christopher Snowbeck • 612-673-4744 Twitter: @chrissnowbeck