AI experts doubt Amazon's new Halo wearable can accurately judge the emotion in your voice, and worry about the privacy risks

Amazon's wearable "Halo" wristband and accompanying app.
  • Amazon launched its new wearable, Halo, on Thursday. It comes with a feature called "Amazon Tone," which analyzes the emotion in your voice to help users "better understand how they may sound to others."
  • Amazon says Tone uses machine learning to analyze the user's voice and tell them how they sound throughout the day.
  • AI experts told Business Insider it's bold of Amazon to claim an algorithm is capable of accurately interpreting the emotion in someone's voice.
  • They also said the feature raises some troubling privacy problems.

Amazon on Thursday launched Amazon Halo, a wearable band to compete with Fitbit and the Apple Watch. Like its competitors, Halo can track heart rate and sleep patterns, but it's also looking to differentiate itself with a peculiar feature: judging your emotional state from your tone of voice.

"Amazon Tone" claims to tell you how you sound to other people. It uses "machine learning to analyze energy and positivity in a customer's voice so they can better understand how they may sound to others, helping improve their communication and relationships," Amazon's press release for Halo reads.

To give an example, Amazon's chief medical officer Maulik Majmudar said Tone might give you feedback such as: "In the morning you sounded calm, delighted, and warm." According to Majmudar, Tone analyzes vocal qualities like your "pitch, intensity, tempo, and rhythm" to tell you how it thinks you sound to other people.
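As described, Tone's input is a handful of prosodic signals rather than the words being spoken. As a rough illustration of what such an analysis could involve, here is a minimal Python sketch that extracts pitch, intensity, and a speaking-rate proxy from a speech snippet and maps them to a toy label. It is an assumption about the general approach, not Amazon's model; the file name, thresholds, and labels are hypothetical.

    # Illustrative sketch only: a rough approximation of tone analysis from
    # prosodic features (pitch, intensity, speaking rate). Not Amazon's model;
    # the thresholds, labels, and file path are hypothetical.
    import numpy as np
    import librosa

    def rough_tone_estimate(path):
        y, sr = librosa.load(path, sr=16000)            # load a short speech snippet

        # Pitch: fundamental-frequency track via the YIN estimator
        f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)

        # Intensity: root-mean-square energy per frame
        rms = librosa.feature.rms(y=y)[0]

        # Rhythm/tempo proxy: speech onsets per second
        onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
        rate = len(onsets) / (len(y) / sr)

        features = {
            "mean_pitch_hz": float(np.mean(f0)),
            "pitch_variability": float(np.std(f0)),
            "mean_energy": float(np.mean(rms)),
            "onsets_per_sec": float(rate),
        }

        # Toy rule just to show the shape of the output; a real system would
        # use a trained classifier over far richer features.
        if features["mean_energy"] > 0.05 and features["onsets_per_sec"] > 3:
            features["label"] = "energetic"
        else:
            features["label"] = "calm"
        return features

    print(rough_tone_estimate("sample_snippet.wav"))    # hypothetical file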

Experts that Business Insider spoke to are dubious that an algorithm could accurately analyze something as complex as human emotion — and they are also worried that Tone data could end up with third parties.

Experts doubt an algorithm can capture the nuance of human speech

"I have my doubts that current technology is able to decipher the very complex human code of communication and the inner workings of emotion," said Dr Sandra Wachter, associate professor in AI ethics at the University of Oxford.

"How we use our voice and language is greatly impacted by social expectation, culture and customs. Expecting an algorithm to be able to read and understand all of those subtleties seems more like an aspirational endeavour," she said.

Wachter added that claiming the algorithm can tell you how other people are judging your voice further muddies the waters.

"Here the machine has to understand how someone speaks (and what they say) AND infer how someone else understands and interprets these words. This is an even more complex task because you have to read two minds. An algorithm as a mediator or interpreter seems very odd, I doubt that a system (at least at this point) is able to crack this complex social code," she said.

Mozilla fellow Frederike Kaltheuner agreed that voice analysis has inherent limitations. Voice recognition systems have also historically struggled with different kinds of voices, she said. "Accuracy is typically lower for people who speak with an accent or who are speaking in a second language."

Amazon says Tone isn't a privacy risk. Experts aren't so sure.

Amazon says it has made the Tone feature opt-in for Halo owners. Once you switch it on, it runs in the background, recording short snippets of your voice throughout the day for analysis. There's also an option to turn it on for specific conversations, up to 30 minutes in length.

Amazon says all this data is kept safe and secure, with all the processing done locally on your phone, which then deletes the data. "Tone speech samples are never sent to the cloud, which means nobody ever hears them, and you have full control of your voice data," Majmudar wrote.
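Whether or not one trusts that claim, the architecture Amazon describes follows a familiar pattern: compute a small summary on the device and throw away the raw audio. The Python sketch below shows one way that pattern could be structured; it is an assumption for illustration, not Amazon's implementation, and all names and thresholds are hypothetical.

    # Sketch of a "process locally, keep only derived data" flow.
    # Assumption for illustration only; not Amazon's implementation.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class ToneSummary:
        start_time: float   # when the snippet began, in seconds
        label: str          # coarse descriptor, e.g. "calm"
        energy: float       # derived number; no raw audio is retained

    def summarize_and_discard(raw_audio, start_time):
        """Reduce a raw speech snippet to a small summary, then drop the audio."""
        energy = float(np.sqrt(np.mean(raw_audio ** 2)))    # RMS energy
        label = "energetic" if energy > 0.05 else "calm"    # toy placeholder rule
        summary = ToneSummary(start_time=start_time, label=label, energy=energy)
        del raw_audio   # drop this reference; a real pipeline would free the
                        # capture buffer here and never write it to disk or cloud
        return summary

    # Only the small ToneSummary objects would ever be stored or shown in the app.
    snippet = np.random.randn(16000).astype("float32") * 0.01   # fake 1 s of audio
    print(summarize_and_discard(snippet, start_time=0.0))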

Amazon's insistence that human employees won't listen to any of Tone's recordings appears to be a nod to the scandal in which Amazon, along with other major tech companies, was found to be sending sensitive Alexa recordings to human contractors for review.

Amazon Tone will tell you in the Halo app how it thinks you sound.

But experts say that even without human beings listening to the audio Tone records, there are significant privacy implications.

Privacy policy expert Nakeema Stefflbauer told Business Insider that Halo could be a preamble to Amazon getting into insurance tech. "My first impression is that it's almost as if Amazon is moving as fast as possible to get ahead of public disclosures about its own forays into the insurtech space," said Stefflbauer.

"I am alarmed when I hear about this type of assessment being recorded, because, while I see zero benefit from it, employers definitely might. Insurers definitely might. Public administrators overseeing the issue of benefits (such as for unemployment) definitely might," she added. 

"The ultimate sign to me that you as the customer aren't the ultimate target of the data collected is that Amazon already has partnerships with insurers like John Hancock and medical records companies like Cerner," Stefflbauer added.

John Hancock announced Thursday it would be the first life insurer to integrate with Amazon Halo. "Starting this fall, all John Hancock Vitality customers will be able to link the Amazon Halo Band to the program to earn Vitality Points for the small, everyday steps they take to try to live a longer, healthier life," the insurance firm said in a press statement.

Kaltheuner said it's good that the Tone feature is opt-in, but anonymized data from Halo could still be shared in bulk with third parties. "Even if it's in aggregate and anonymous, it might not be something you want your watch to do," she said.

"Our emotions are one of the most intimate and personal aspects of our personality"

Chris Gilliard, an expert on surveillance and privacy at the Digital Pedagogy Lab, told Business Insider he found Amazon's privacy claims unconvincing.

"Amazon felt the heat when it was revealed that actual humans were listening to Alexa recordings, so this is their effort to short circuit that particular critique, but to say that these systems will be 'private' stretches the meaning of that word beyond recognition," he said.

Wachter said that if, as Amazon claims, an algorithm were capable of accurately analyzing the emotion in people's voices, it could pose a human rights problem.

"Our thoughts and emotions are protected under human rights law for example the freedom of expression and the right to privacy," said Wachter.

"Our emotions and thoughts are one of the most intimate and personal aspects of our personality. In addition, we are often not able to control our emotions. Our inner thoughts and emotions are at the same time very important to form opinions and express those. This is one of the reasons why human rights law does not allow any intrusion on them.

"Therefore, it is very important that this barrier is not intruded, and that this frontier is respected," she added.

Read the original article on Business Insider

