Machines equipped to monitor our emotions will soon be everywhere
Smartphones are beginning to collect and use emotional data on a grand scale. That may not be a bad thing, as long as we humans stay in charge of the process
The age of machines is often painted as cold and calculating, with harsh efficiency the only principle. Don't tell that to Pepper, a humanoid robot announced in June that's claimed to be emotionally intelligent. A companion-bot able to hold conversations and recognise the emotions of whoever it's communicating with, Pepper can see and interpret frowns and laughter, and even analyse your body language.
Pepper is an effort to promote the concept of a human operating system by collecting the most precious data of all - our emotions. Although, for now, Pepper is little more than a gimmick awaiting shoppers in Japanese stores, it's fitting that it has been designed for SoftBank Mobile, one of Japan's largest phone networks, because it is smartphones - not robots - that are beginning to collect and use emotional data on a grand scale.
It started with face-recognition technology, which is now a standard way of unlocking a phone, but the next generation of devices could use sensory technology to gauge and interpret our reaction to everything. There are now vending machines that use face recognition to recommend purchases - and brands would love to know your reaction to watching their advertisements.
"As with face-recognition vending machines, phones can collate data from a wide range of sources … things like who you called, and what photos you've taken - and arguably summarise it for you," says Marcus Mustafa, global head of user experience at marketing and technology agency DigitasLBi.
So what does emotional data do?
"It aims to track a user's emotional reactions - such as joy, delight, surprise, excitement, fear and sadness - to particular external events," says Diana Marian, marketing strategist at London-based Ampersand Mobile, who studied the links between emotions and rationality at New York University.
The emotional data headline-grabber so far this year hasn't been Pepper, but the Apple Watch, which measures the heartbeat of its owner. There's nothing unusual about that - dozens of sports watches have done it for years - but Apple Watch allows you to send your own heartbeat in a message to someone. Frivolous, perhaps, but it's the first step on a journey that could see us all swapping emotional data every day to give our digital interactions some human context.
Could Apple Watch create apps that monitor emotions? "It will certainly add a touch-point for emotions to be monitored," says Mustafa. "Everything that can be measured with the heart rate sensor will add to people's awareness of themselves, and when used with connected headbands and all the other data points, then yes, it will have an impact."
Headbands take us into the complex world of cognitive behaviour and neurotechnology, and there are already headsets aimed at virtual-reality gaming that are all about reading and using brain signals.
"The notion of focusing your thoughts in the present moment, or mindfulness as some call it, is very much in vogue, so there's definitely more to come," says Mustafa. "Most wearables have the capacity to measure some types of body data and, if shared, will add to the bigger picture."
That picture remains elusive; collecting emotional data doesn't mean we can correctly determine someone's mood from gadgets alone. All we have is hints.
"Our phones could, in principle, collect data about expressed emotions," says Marian. "This would not be data about actual emotional states, but about emotional expressions."
You might think that the expression on someone's face is a dead giveaway to what mood they're in, but even that makes assumptions.
The problem is that there is no such thing as pure emotion. Instead, emotions are a mix of physiological and mental factors. Physiological tell-tales include an increased heart rate, changes in brain state, or a change in hormone production. Such things can be measured with sensors, such as an electroencephalography (EEG) machine or headset. The mental side is more difficult; this is what a person thinks they experience, and how they interpret their physiological symptoms.
Cameras and phones can already observe facial expressions, but if someone is smiling, are we sure they're happy? Or are they just trying to look happy?
"There is no unambiguous way to map physiological states onto mental states, or onto emotional expressions for reasons that have to do, among other things, with the complexity of individual emotions," says Marian. "No amount of technology will ever be able to get this bit right all the time."
Apple Watch can take one measurement of a person's physiological reaction to a particular event. If that data were correlated with geographical position, photos taken, or messages sent at the time, it's possible that software could make an educated guess at someone's mood. If the technology behind wearable cameras such as the Google Glass headset or the Narrative Clip - a clip-on device that continuously captures spontaneous images - becomes standard, there could be some context for emotional data.
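The kind of correlation described here can be sketched in a few lines. The sample readings, event labels, and threshold below are all hypothetical, but they illustrate both the idea and its limit: the software finds a physiological spike and some nearby context, and the rest is guesswork.

```python
from datetime import datetime, timedelta

# Hypothetical heart-rate samples from a wearable: (timestamp, beats per minute).
heart_rate = [
    (datetime(2015, 6, 1, 9, 0), 62),
    (datetime(2015, 6, 1, 9, 5), 64),
    (datetime(2015, 6, 1, 9, 10), 96),  # a spike
    (datetime(2015, 6, 1, 9, 15), 70),
]

# Hypothetical context events from the same phone.
events = [
    (datetime(2015, 6, 1, 9, 9), "photo taken at concert venue"),
    (datetime(2015, 6, 1, 9, 40), "message sent"),
]

def spikes(samples, threshold=90):
    """Timestamps where heart rate exceeds a crude threshold."""
    return [t for t, bpm in samples if bpm >= threshold]

def nearby_events(t, events, window=timedelta(minutes=5)):
    """Context events recorded within a few minutes of a spike."""
    return [label for et, label in events if abs(et - t) <= window]

for t in spikes(heart_rate):
    context = nearby_events(t, events)
    # At best an educated guess: arousal plus context, not an emotion.
    print(t, "elevated heart rate near:", context or "no known event")
```

Even in this toy form, the output is only "elevated heart rate near a photo at a concert venue" - it cannot say whether the wearer was thrilled or terrified, which is exactly the ambiguity Marian describes below.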
A raised pulse alone means nothing. "Any wearables that collect data via skin contact only measure a particular physiological reaction, like an elevated pulse," says Marian. "What we don't know, and undoubtedly we won't ever reliably know, is what emotional state they correspond to. Assigning actual emotions is more a matter of guesswork than proper measurement."
For now, brands and governments use focus groups to gauge people's likely reactions to products and laws; it's the only way to get good data. What if they could get raw, reliable data instead?
"Commercially, I suspect emotional data will primarily be used to collect people's subconscious behaviour for personalised marketing," says Mustafa. "You can see it happening - it's the missing link between our conscious self and our subconscious behaviour, and therefore highly desirable."
If advertisers and politicians want to know what we really think, they're going to have to get our permission. "It's still about trust and when the customer wants to give their data away," says Mustafa, who thinks that emotional data comes with potential threats to privacy and civil liberties.
"Insurance, lending, employment, or any targeting of vulnerable people are all potentially dangerous areas, but there could also be massive benefits if people can stay in control of what they share and with whom."