Amazon’s Halo wearable is supposed to be able to judge people’s emotions by their tone of voice; experts argue that human emotions are too complex and subtle for an algorithm to decipher. Photo: Shutterstock

Can Amazon Halo tell your emotions by your tone of voice? Experts doubt it, and others worry about privacy

  • Amazon’s new wearable includes a Tone feature that is supposed to tell your emotional state by the tone of your voice
  • Critics believe human emotions and communications are too complex for an algorithm to decipher; other AI experts wonder what Amazon will do with the information

The recently announced Amazon Halo is a wearable band that tracks heart rate and sleep patterns, but it also aims to differentiate itself with an unusual feature: judging your emotional state from your tone of voice.

“Amazon Tone” claims to tell you how you sound to other people. It uses “machine learning to analyse energy and positivity in a customer’s voice so they can better understand how they may sound to others, helping improve their communication and relationships”, according to Amazon.

To give an example, Amazon’s chief medical officer, Maulik Majmudar, said Tone might give you feedback such as: “In the morning you sounded calm, delighted, and warm”. According to Majmudar, Tone analyses vocal qualities like your “pitch, intensity, tempo, and rhythm” to tell you how it thinks you sound to other people.
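
Amazon has not published Tone’s internals, but the raw ingredients Majmudar names – pitch, intensity, tempo and rhythm – are standard acoustic features in speech analysis. As a purely illustrative sketch of what extracting those low-level features could look like (using the open-source librosa library; none of this reflects Amazon’s actual implementation), consider:

```python
# Illustrative only: computes rough "pitch, intensity, tempo"-style
# features from a speech snippet with librosa. This is NOT Amazon's
# method; it merely shows the kind of raw signal a system like Tone
# would have to start from.
import numpy as np
import librosa

def basic_vocal_features(path: str) -> dict:
    y, sr = librosa.load(path, sr=None)        # load the audio snippet
    duration = librosa.get_duration(y=y, sr=sr)

    # Pitch: fundamental frequency per frame, via the pYIN estimator
    f0, _, _ = librosa.pyin(
        y,
        fmin=librosa.note_to_hz("C2"),
        fmax=librosa.note_to_hz("C7"),
        sr=sr,
    )
    mean_pitch_hz = float(np.nanmean(f0))      # NaN marks unvoiced frames

    # Intensity: mean root-mean-square energy across frames
    mean_intensity = float(np.mean(librosa.feature.rms(y=y)))

    # Tempo/rhythm proxy: acoustic onsets per second (rough speaking rate)
    onsets = librosa.onset.onset_detect(y=y, sr=sr)
    onsets_per_second = len(onsets) / duration

    return {
        "mean_pitch_hz": mean_pitch_hz,
        "mean_intensity_rms": mean_intensity,
        "onsets_per_second": onsets_per_second,
    }
```

Numbers like these are easy to compute; the contested step is the unnamed model that would map them onto labels such as “calm” or “delighted” – which is exactly what the experts below question.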

But some experts are dubious that an algorithm could accurately analyse something as complex as human emotion – and they are also worried that Tone data could end up with third parties.

Dr Sandra Wachter, associate professor in AI ethics at the University of Oxford. Photo: University of Oxford

“I have my doubts that current technology is able to decipher the very complex human code of communication and the inner workings of emotion,” says Dr Sandra Wachter, associate professor in AI ethics at the University of Oxford.

“How we use our voice and language is greatly impacted by social expectation, culture and customs. Expecting an algorithm to be able to read and understand all of those subtleties seems more like an aspirational endeavour,” she says.


Wachter adds that claiming the algorithm can tell you how other people judge your voice muddies the waters further.

“Here the machine has to understand how someone speaks [and what they say] and infer how someone else understands and interprets these words. This is an even more complex task because you have to read two minds,” she explains.

“An algorithm as a mediator or interpreter seems very odd. I doubt that a system [at least at this point] is able to crack this complex social code.”

Amazon’s chief medical officer, Maulik Majmudar. Photo: Twitter/@mdmajmudar

Frederike Kaltheuner of the software non-profit Mozilla agrees that voice analysis has inherent limitations. Voice recognition systems have historically struggled with different kinds of voices, she says. “Accuracy is typically lower for people who speak with an accent or who are speaking in a second language.”

Amazon has made the Tone feature optional for Halo owners. Once you switch it on, it runs in the background, recording short snippets of your voice throughout the day for analysis. There’s also an option to turn it on for specific conversations up to 30 minutes in length.

Amazon says all this data is kept safe and secure: processing happens locally on your phone, and the recordings are deleted afterwards.


“Tone speech samples are never sent to the cloud, which means nobody ever hears them, and you have full control of your voice data,” Majmudar wrote.

Amazon’s insistence that human employees won’t listen to any of Tone’s recordings seems to allude to the scandal in which Amazon, along with other major tech companies, was found to be sending sensitive Alexa recordings to human contractors for review.

But experts say that even without human beings listening to the audio Tone records, there are significant privacy implications. Privacy policy expert Nakeema Stefflbauer says Halo could be a prelude to Amazon getting into insurance tech.

Amazon Halo can track the intensity of your workout, but can it read your emotions? Photo: Amazon

“My first impression is that it’s almost as if Amazon is moving as fast as possible to get ahead of public disclosures about its own forays into the ‘insurtech’ space,” says Stefflbauer.

“I am alarmed when I hear about this type of assessment being recorded, because, while I see zero benefit from it, employers definitely might. Insurers definitely might. Public administrators overseeing the issue of benefits [such as for unemployment] definitely might,” she adds. 

Kaltheuner says it’s good that the Tone feature is opt-in, but anonymised data from Halo could still be shared in bulk with third parties. “Even if it’s in aggregate and anonymous, it might not be something you want your watch to do,” she says.

Chris Gilliard is an expert on surveillance and privacy at the Digital Pedagogy Lab. Photo: Digital Pedagogy Lab

Chris Gilliard, an expert on surveillance and privacy at the Digital Pedagogy Lab, finds Amazon’s privacy claims unconvincing.

“Amazon felt the heat when it was revealed that actual humans were listening to Alexa recordings, so this is their effort to short circuit that particular critique, but to say that these systems will be ‘private’ stretches the meaning of that word beyond recognition,” he says.

Wachter says that if, as Amazon claims, an algorithm is capable of accurately analysing the emotion in people’s voices, it could pose a human rights problem.

“Our thoughts and emotions are protected under human rights law, for example the freedom of expression and the right to privacy,” says Wachter.

“Our emotions and thoughts are one of the most intimate and personal aspects of our personality. In addition, we are often not able to control our emotions. Our inner thoughts and emotions are at the same time very important to form opinions and express those. This is one of the reasons why human rights law does not allow any intrusion on them.

“Therefore, it is very important that this barrier is not intruded, and that this frontier is respected,” she adds.
