
Can A Machine Be Racist? AI Shows Troubling Signs Of Bias, But There Are Reasons For Optimism

POSTED ON March 07, 2023 3:19 PM

One day in mid-2013, four people, including two police officers and a social worker, arrived unannounced at the home of Chicago resident Robert McDaniel.

McDaniel had only ever had minor run-ins with the law – street gambling, marijuana possession, nothing even remotely violent. But his visitors informed him that a computer program had determined that the person living at his address was unusually likely to be involved in a future shooting.

Perhaps he would be the perpetrator, perhaps the victim. The computer wasn’t sure. But due to something called “predictive policing”, the social worker and the police would be visiting him on a regular basis.

McDaniel was not enthusiastic about either prospect, but the computer had made its decision, so they were offers he could not refuse.

The social worker returned frequently with referrals to mental health programs, violence prevention programs, job training programs, and so forth. The police also returned frequently – to remind him that he was being watched.

The official attention did not go unnoticed in McDaniel’s neighbourhood. Rumours spread that he was a police informant. In 2017, those rumours led to him being shot. In 2020, it happened again.

Thus, in a bizarre sense, the computer’s prediction could be said to have caused the tragedy it claimed to predict. Indeed, it could be said to have caused it twice.

Racist Machines?

We would not be wrong to interpret McDaniel’s story as a Kafkaesque nightmare about a man caught in an inexorable bureaucratic machine, or a Faustian parable about what happens when technology escapes the bounds of human control.

But according to Meredith Broussard, a professor of data journalism and an accomplished computer scientist, it is also, and perhaps more importantly, a story about racism.

For when the police arrived at his door in 2013, Robert McDaniel was not just any man. He was a young Black man living in a neighbourhood that had been shaped by a shameful history of racist redlining. The neighbourhood was, as a result, home to a disproportionate level of both criminal violence and police surveillance.

McDaniel was thus all but destined to become the target of the kind of technologically driven predictive policing that led to his being shot.

And, Broussard maintains, what happened to Robert McDaniel is but one example of the many ways that AI is augmenting and exacerbating the inequalities that characterise modern social life.

Do not worry that machines will rise up, take power, and create a completely new world, Broussard argues. Worry that they will silently reproduce and reinforce the world that already exists.