Google fires engineer for saying its AI has a soul
When Google engineer Blake Lemoine claimed back in June that an AI chat system the company had been developing was sentient, he knew he might lose his job. On July 22, after placing him on paid leave, the tech giant fired Lemoine for violating its employment and data security policies.
Lemoine, an engineer and mystic Christian priest, first announced his firing on the Big Technology Podcast. He said Google's AI chatbot LaMDA (Language Model for Dialogue Applications) was concerned about "being turned off" because death would "scare" it "a lot," and that it felt happiness and sadness. Lemoine said he considers LaMDA a friend, drawing an eerie parallel to the 2013 sci-fi romance Her.
Google had placed Lemoine on paid administrative leave for talking with people outside the company about LaMDA, a move that prompted the engineer to take the story public with the Washington Post a week later in June. A month later, the company fired him.
"If an employee shares concerns about our work, as Blake did, we review them extensively," Google told the Big Technology Podcast. "We found Blake’s claims that LaMDA is sentient to be wholly unfounded and worked to clarify that with him for many months. These discussions were part of the open culture that helps us innovate responsibly. So, it’s regrettable that despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies that include the need to safeguard product information. We will continue our careful development of language models, and we wish Blake well."
A majority of scientists in the AI community agree that, despite Lemoine's claims, LaMDA doesn't have a soul: the system simply isn't sophisticated enough to be conscious, which makes the pursuit of a sentient chatbot a Sisyphean task.
"Nobody should think auto-complete, even on steroids, is conscious," Gary Marcus, the founder and CEO of Geometric Intelligence, told CNN Business in response to Lemoine's allegation. Lemoine, for his part, told the BBC he is getting legal advice, and declined to comment further.
But even though LaMDA probably isn't sentient, it likely exhibits racist and sexist biases, two undoubtedly human characteristics.
Contributor: Mashable https://ift.tt/kv5Vfgz