Why using AI in policing decisions risks race and class bias


AI is rocking the world of policing — and the consequences are still unclear. 

British police are poised to go live with a predictive artificial intelligence system that will help officers assess the risk of suspects re-offending. 

It's not Minority Report (yet), but it certainly sounds scary. Just like the evil AIs in the movies, this tool has an acronym: HART, which stands for Harm Assessment Risk Tool, and it's going live in Durham after a long trial.

The system, which classifies suspects as low, medium, or high risk of committing a future offence, was tested in 2013 using data that Durham police gathered from 2008 to 2012.
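The article doesn't detail HART's internals, but the core idea is a model that maps custody-record features to one of three risk bands. The bias worry is that seemingly neutral inputs, such as a suspect's postcode, can act as proxies for race and class. Here is a deliberately crude Python sketch of that idea; the feature names, weights, and thresholds are entirely hypothetical and are not HART's actual model:

```python
# Toy sketch (NOT HART's actual model): a three-band risk classifier
# over hypothetical custody-record features.

def classify_risk(prior_offences: int, age: int, postcode_area: str) -> str:
    """Return 'low', 'medium', or 'high' re-offending risk.

    Including postcode_area is exactly the kind of design choice critics
    flag: it can stand in for class and race even though the model never
    mentions either directly.
    """
    score = 0
    score += min(prior_offences, 5)                # capped count of prior offences
    score += 2 if age < 25 else 0                  # youth weighted upward
    score += 2 if postcode_area == "DH1" else 0    # hypothetical "high-risk" area

    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"
```

Note how two suspects with identical histories can land in different bands purely because of where they live, which is the proxy-discrimination problem in miniature.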


Contributor: Mashable http://ift.tt/2r9RjGe

Reviewed by mimisabreena on Friday, May 12, 2017
