Everything that's wrong with that study which used AI to 'identify sexual orientation'
Advances in artificial intelligence can be deeply worrying, especially when serious questions of intimacy and privacy are at stake.
A study from Stanford University, first reported in the Economist, has sparked controversy after claiming that AI can deduce whether people are gay or straight by analysing facial images of a gay person and a straight person side by side.
LGBTQ advocacy groups and privacy organisations have slammed the report as "junk science" and called it "dangerous and flawed", citing its lack of representation, racial bias, and reduction of the sexuality spectrum to a binary.
Contributor: Mashable http://ift.tt/2xrvMiJ
Reviewed by mimisabreena on Tuesday, September 12, 2017