Can Amazon Halo tell your emotions by your tone of voice? Experts doubt it, and others worry about privacy

  • Amazon’s new wearable includes a Tone feature that is supposed to tell your emotional state by the tone of your voice
  • Critics believe human emotions and communications are too complex for an algorithm to decipher; other AI experts wonder what Amazon will do with the information

Amazon’s Halo wearable is supposed to be able to judge people’s emotions by their tone of voice; experts argue that human emotions are too complex and subtle for an algorithm to decipher. Photo: Shutterstock
Business Insider

The recently announced Amazon Halo is a wearable band that tracks heart rate and sleep patterns, but it also aims to differentiate itself with an unusual feature: judging your emotional state from your tone of voice.

“Amazon Tone” claims to tell you how you sound to other people. It uses “machine learning to analyse energy and positivity in a customer’s voice so they can better understand how they may sound to others, helping improve their communication and relationships”, according to Amazon.

To give an example, Amazon’s chief medical officer, Maulik Majmudar, said Tone might give you feedback such as: “In the morning you sounded calm, delighted, and warm”. According to Majmudar, Tone analyses vocal qualities like your “pitch, intensity, tempo, and rhythm” to tell you how it thinks you sound to other people.

But some experts are dubious that an algorithm could accurately analyse something as complex as human emotion – and they are also worried that Tone data could end up with third parties.

Dr Sandra Wachter, associate professor in AI ethics at the University of Oxford. Photo: University of Oxford

“I have my doubts that current technology is able to decipher the very complex human code of communication and the inner workings of emotion,” says Dr Sandra Wachter, associate professor in AI ethics at the University of Oxford.
