Posted by Brij Bhushan, Thursday 11 February 2021


Artificial intelligence often doesn’t work the same for Black people as it does for white people. Sometimes it’s a matter of vastly different user experiences, as when voice assistants struggle to understand Black voices. Other times, such as when cancer detection systems don’t account for race, it’s a matter of life and death. So whose fault is it? Setting aside intentionally malicious uses of AI software, such as facial recognition and crime prediction systems for law enforcement, we can assume the problem is bias. When we think about bias in AI, we’re usually reminded of incidents such as…

This story continues at The Next Web
