Who’s More Racist: You or an Algorithm?


For some reason there’s a sudden interest in what seems like a minor topic: whether machine learning is biased. The basic argument is that since computer programs are written by humans and trained on human data, they inevitably inherit human biases along the way. Therefore, we need to…

…do something. But I’m not sure what. Nobody ever seems to suggest much of anything.

I suspect there’s a reason for that: the problem of AI bias is far less pervasive than critics suggest. It happens, of course. But the thing to keep in mind about AI and machine learning is that they don’t have to be perfect to be useful. They just have to be better than humans. As long as an algorithm is no more biased than the average person—and I’ve never heard of an example where one is—then it’s a useful thing.

But there’s more than that to say in favor of automation: even if an algorithm is biased, its bias is far easier to correct than a human’s. For example, if you’re concerned about why an algorithm is making its decisions, you can program it to tell you. It will be 100 percent honest and 100 percent non-defensive about this. Humans, by contrast, frequently don’t even know why they make particular decisions, and if they do they’ll often lie about it.
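To make the “just program it to tell you” point concrete, here’s a minimal sketch of what that might look like: a toy linear scoring model that reports exactly how much each input contributed to its decision. The feature names, weights, and threshold are all invented for illustration, not anything from a real lending system.

```python
def explain_decision(features, weights, threshold=0.5):
    """Score an applicant and return both the decision and a
    per-feature breakdown of why it was made."""
    contributions = {name: features[name] * weights[name] for name in weights}
    score = sum(contributions.values())
    return {
        "approved": score >= threshold,
        "score": round(score, 3),
        "contributions": {k: round(v, 3) for k, v in contributions.items()},
    }

# Hypothetical model and applicant, with all inputs normalized to 0-1.
weights = {"income": 0.4, "credit_history": 0.5, "debt_ratio": -0.3}
applicant = {"income": 0.8, "credit_history": 0.9, "debt_ratio": 0.6}
report = explain_decision(applicant, weights)
# Every factor in the decision is laid out, with nothing hidden
# and nothing rationalized after the fact.
```

Real models are more complicated than a weighted sum, but the principle scales: the machine will surface its actual reasons on demand, which is more than you can say for a loan officer.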

Likewise, if you’re concerned that a training set has introduced bias, you can retrain an algorithm. Once you have a goal in mind, this is fairly quick and painless. Humans, by contrast, are all but impossible to retrain once they become adults.
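One common version of that retraining fix is sometimes called “reweighing”: if one group is underrepresented among positive outcomes in the historical training data, its examples get extra weight so the retrained model stops inheriting that skew. Here’s a minimal sketch on invented data; the groups and labels are hypothetical.

```python
from collections import Counter

def reweigh(examples):
    """Assign each (group, label) pair a weight so that group and label
    become statistically independent in the weighted training set."""
    n = len(examples)
    group_counts = Counter(g for g, _ in examples)
    label_counts = Counter(y for _, y in examples)
    pair_counts = Counter(examples)
    weights = {}
    for (g, y), count in pair_counts.items():
        expected = group_counts[g] * label_counts[y] / n  # count if independent
        weights[(g, y)] = expected / count
    return weights

# Toy data: group "a" rarely got positive labels in the historical record.
data = [("a", 0)] * 8 + [("a", 1)] * 2 + [("b", 0)] * 5 + [("b", 1)] * 5
w = reweigh(data)
# Positive examples from the disadvantaged group now get weight > 1,
# so a model retrained on the weighted data no longer learns the skew.
```

That whole correction is a few lines of arithmetic and one retraining run. Try doing the equivalent to a 45-year-old hiring manager.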

It’s useful to be aware of these things, and to insist that algorithm designers build in safeguards against bias from the start. Algorithms of any complexity shouldn’t be black boxes. They shouldn’t be hard to retrain. They should be written to hunt for possible biases and report them. This isn’t trivial, but it’s hardly the biggest programming challenge in the world. Given all this, the odds that machine intelligence will end up more biased than human beings are, to anyone who knows how biased human beings are, pretty laughable.
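A sketch of what “hunt for possible biases and report them” might mean in practice: after the model makes its decisions, compare approval rates across groups and flag any group that falls below the four-fifths rule of thumb used in U.S. disparate-impact analysis. The decisions below are invented, and the 0.8 threshold is that regulatory convention, not anything sacred.

```python
def bias_report(decisions, threshold=0.8):
    """decisions: list of (group, approved) pairs. Flags any group whose
    approval rate is below `threshold` times the best group's rate."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    rates = {g: approved[g] / totals[g] for g in totals}
    best = max(rates.values())
    flagged = {g: round(r / best, 2)
               for g, r in rates.items() if r < threshold * best}
    return {"rates": rates, "flagged": flagged}

# Hypothetical outcomes: group "a" approved 4 of 10, group "b" 8 of 10.
decisions = ([("a", True)] * 4 + [("a", False)] * 6 +
             [("b", True)] * 8 + [("b", False)] * 2)
report = bias_report(decisions)
# Group "a" is approved at half the rate of group "b" — well under
# the 0.8 line, so the report flags it automatically.
```

A dozen lines of bookkeeping, run on every batch of decisions, and the algorithm audits itself. No human decision-maker comes with that feature.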
