Algorithms Can Be Racist, But At Least They Can Be Fixed



The Washington Post reports today on a piece of research that detected a racial bias in a widely used algorithm that predicts which patients need extra medical attention:

The algorithm wasn’t intentionally racist — in fact, it specifically excluded race. Instead, to identify patients who would benefit from more medical support, the algorithm used a seemingly race-blind metric: how much patients would cost the health-care system in the future. But cost isn’t a race-neutral measure of health-care need. Black patients incurred about $1,800 less in medical costs per year than white patients with the same number of chronic conditions; thus the algorithm scored white patients as equally at risk of future health problems as black patients who had many more diseases.

There’s both good news and bad news here. The bad news is obvious: it’s hard to know which seemingly race-neutral metrics might, in fact, rely indirectly on race. In this case, by using dollar amounts, the algorithm favored white patients who are generally more affluent and spend more on health care to begin with.
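To see how this works mechanically, here’s a rough sketch in Python. Nothing in it comes from the actual algorithm or the study’s data; the model, the features, and the dollar figures are all invented for illustration, except for the roughly $1,800 spending gap quoted above. The point is just that a model trained to predict cost can end up scoring patients differently by race even though race never appears as an input:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 20_000

# Underlying health need: number of chronic conditions (0 through 8),
# distributed the same way in both groups in this made-up population.
chronic = rng.integers(0, 9, size=n)
black = rng.random(n) < 0.5

# Annual spending rises with need, but Black patients incur roughly $1,800
# less per year at the same number of chronic conditions (the gap quoted above).
def annual_cost():
    return 2_000 + 1_500 * chronic - 1_800 * black + rng.normal(0, 500, n)

prior_cost, future_cost = annual_cost(), annual_cost()

# The "race-blind" algorithm: predict next year's cost from condition count
# and prior spending. Race never appears, but prior spending carries it in.
X = np.column_stack([chronic, prior_cost])
cost_score = LinearRegression().fit(X, future_cost).predict(X)

# Equally sick patients (5 chronic conditions here) do not get equal scores.
sick = chronic == 5
print("Cost-based score, white patients, 5 conditions:",
      round(cost_score[sick & ~black].mean()))
print("Cost-based score, Black patients, 5 conditions:",
      round(cost_score[sick & black].mean()))

# And if the top 20 percent of scores get extra care, the Black patients who
# make the cut are sicker on average than the white patients who make it.
flagged = cost_score >= np.quantile(cost_score, 0.8)
print("Avg conditions, flagged white patients:",
      round(chronic[flagged & ~black].mean(), 2))
print("Avg conditions, flagged Black patients:",
      round(chronic[flagged & black].mean(), 2))
```

With these invented numbers, the white patients with five conditions get a noticeably higher score than the Black patients with five conditions, and the Black patients who make the top-20-percent cut are sicker, on average, than the white patients who do. That’s the quoted finding, reproduced from nothing but the cost gap.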

The good news is a little more subtle, but still genuinely good. The alternative to algorithms, of course, is human judgment. But human judgment also tends to be racist, even among those who have only the best intentions. The difference is that it’s really hard—close to impossible in many cases—to change human behavior in the short or medium term. So the racism continues even if we know it’s there.

With a computer algorithm, however, careful study can often identify biases—and once those biases are uncovered, they can be fixed. In this case, developers are already at work on a better algorithm. Compare that to the years and years it would take to fight human racism with bias training and diversity programs and so forth, with no guarantee even then of success.
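For what it’s worth, here is how a fix looks in the toy sketch above. This continues that code and reuses its variables, and it is emphatically not the developers’ actual revision; the `future_need` label is just an invented stand-in for a direct measure of health. The move is simply to retrain the same model to predict health rather than dollars:

```python
# Continuing the sketch above (reuses X, chronic, black, rng, n): retrain the
# same model to predict a measure of health rather than a dollar amount.
# Here that's next year's condition count, which in this made-up data grows
# the same way for everyone regardless of race.
future_need = chronic + rng.integers(0, 3, size=n)
need_score = LinearRegression().fit(X, future_need).predict(X)

sick = chronic == 5
print("Need-based score, white patients, 5 conditions:",
      round(need_score[sick & ~black].mean(), 2))
print("Need-based score, Black patients, 5 conditions:",
      round(need_score[sick & black].mean(), 2))
```

With the label changed, equally sick patients come out with essentially the same score regardless of race. Real data is messier than this, and “health need” has to be measured somehow too, but the basic move is the same: once you can see what the algorithm is actually optimizing, you can change it.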

So two cheers for the algorithm revolution. Digital algorithms aren’t perfect, but they’re a damn sight better than most wetware algorithms.
