Why using AI in policing decisions risks race and class bias
AI is rocking the world of policing — and the consequences are still unclear.
British police are poised to go live with a predictive artificial intelligence system that will help officers assess the risk of suspects re-offending.
It's not Minority Report (yet), but it certainly sounds scary. Just like the evil AIs in the movies, this tool has an acronym: HART, which stands for Harm Assessment Risk Tool, and it's going live in Durham after a long trial.
The system, which classifies suspects as low, medium, or high risk of committing a future offence, was tested in 2013 using data that Durham police gathered from 2008 to 2012.
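The excerpt doesn't describe HART's internals, but tools of this kind are typically supervised classifiers trained on historical custody records and scored against outcomes observed over a follow-up period. Below is a minimal sketch of that idea, assuming a random-forest classifier and entirely invented feature names and records; none of this is taken from the actual HART system. The bias concern in the headline follows directly from this setup: whatever the model learns comes from past policing data, so features correlated with race or class can end up driving the predicted risk band.

```python
# Illustrative sketch only: the real HART model, features, and data are not
# described in the article. All feature names and records here are invented.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical historical custody records: [age, prior_offences, years_since_last_offence]
X_train = [
    [19, 4, 0.5],
    [45, 0, 10.0],
    [31, 2, 3.0],
    [23, 7, 0.2],
    [52, 1, 8.0],
    [28, 3, 1.0],
]
# Risk label assigned from outcomes observed during the trial period
y_train = ["high", "low", "medium", "high", "low", "medium"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Score a new suspect; the output is one of the three risk bands
new_suspect = [[26, 5, 0.4]]
print(model.predict(new_suspect))        # e.g. ['high']
print(model.predict_proba(new_suspect))  # per-class probabilities
```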
More about Artificial Intelligence, AI, Custody, Durham Police, and Tech
Source: Mashable. Credit to the original site; to continue reading, click the link or copy and paste it into your browser: http://ift.tt/2pGxu8i