Predictive AI is too flawed — both technically and ethically — to prevent another El Paso or Dayton. BKC faculty associate Desmond Patton emphasizes that current AI tools tend to identify the language of African American and Latinx people as gang-involved or otherwise threatening, but consistently miss the posts of white mass murderers.
“Until we confront and adequately address bias in these systems, I don’t feel comfortable with them being used as a tool for prevention,” Patton says.