A recent study published in Science found that a widely used algorithm for recommending sick patients to care programs was significantly less likely to refer black patients than white patients.

The algorithm referred patients to care programs based on their annual healthcare costs. Although high annual healthcare costs often indicate a need for extra care, black patients on average have less access to healthcare than white patients. As a result, annual healthcare costs for black patients were generally lower than those for white patients, so the algorithm referred white patients for extra care more often.
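
To illustrate the mechanism in rough terms, here is a hypothetical Python sketch, not the actual model, whose details are proprietary. It ranks patients for extra care by annual cost rather than by underlying health need; when one group incurs lower costs at the same level of need, a cost-based cutoff under-refers that group. All names and numbers below are invented for illustration.

```python
import random

random.seed(0)

# Hypothetical population: two groups with the SAME distribution of health need,
# but group B incurs lower healthcare costs at the same need level
# (e.g. because of reduced access to care). Figures are illustrative only.
def simulate_patient(group):
    need = random.gauss(5.0, 2.0)            # latent "sickness" score
    access = 1.0 if group == "A" else 0.7    # assumed cost-per-unit-need gap
    cost = max(0.0, need * access * 1000 + random.gauss(0, 500))
    return {"group": group, "need": need, "cost": cost}

patients = [simulate_patient("A") for _ in range(5000)] + \
           [simulate_patient("B") for _ in range(5000)]

# A cost-based screen: refer the top 10% of patients by annual cost.
cutoff = sorted(p["cost"] for p in patients)[int(0.9 * len(patients))]
referred = [p for p in patients if p["cost"] >= cutoff]

share_b = sum(p["group"] == "B" for p in referred) / len(referred)
print(f"Group B share of referrals under a cost-based screen: {share_b:.1%}")
# Group B is half the population and equally sick on average,
# yet receives well under half of the referrals.
```

Screening on a more direct measure of health need, such as the number of active chronic conditions, is the kind of alternative metric the researchers and Optum are reported to be exploring.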

Alarmingly, the data analyzed in the study also indicated that black patients had a greater prevalence of conditions such as diabetes, kidney failure, and anemia, and were in this respect sicker, on average, than white patients. Despite this, the researchers found that only 17.7% of the patients the algorithm recommended for extra care were black. They calculated that black patients would make up 46.5% of all extra care recommendations in a racially unbiased system.

To address this issue, Nature reported that lead researcher Ziad Obermeyer is working, unpaid, with Optum, the algorithm's developer, to find an alternative metric on which to base extra care recommendations. They report that their efforts so far have reduced the algorithm's racial bias by 84%.

Optum spokesman Tyler Mason, however, called the research findings “misleading” and told the Associated Press that “the cost model is just one of many data elements intended to be used to select patients for clinical engagement programs, including, most importantly, the doctor’s expertise.”

Though the bias of the algorithm in question has reportedly been significantly reduced, Obermeyer says the issue is systemic. “This is not a problem with one algorithm, or one company – it’s a problem with how our entire system approaches this problem,” he says. “How do you work around the bias and injustice that is inherent in […] society?”