Soon your smartphone will know what you're doing and predict your next move
Devices that know what you're doing and can predict your next move are coming our way, writes Jamie Carter
Smartphones know where we are and even what direction we're facing, but what if your smartphone knew your context? Not just where you are, but what you're doing, who you're with - and what you're (probably) going to do next. Doesn't that sound like a better personal assistant than your phone has now?
The future of phones is all about contextual computing, which should arrive in full force by 2017, embedded in phones, tablets and all kinds of wearable devices. It's a revolution built around computers and devices that can both sense and react to their environment; the goal is "people-centred design".
One early effort at contextual computing comes from an Android app called Agent. It utilises the sensors already in your smartphone - GPS, gyroscope, accelerometer, Bluetooth, temperature and Wi-Fi - to assess your context. Then it takes actions on your behalf.
Agent can detect when you're low on battery and start preserving power. It also knows when you're sleeping and automatically silences your phone, and can tell when you're driving, too. In the latter scenario, it automatically reads aloud your text messages.
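Agent's internals aren't public, but the behaviour described above amounts to mapping sensor readings to actions through simple rules. A minimal sketch of that idea - all function names and thresholds are illustrative assumptions, not Agent's actual implementation:

```python
# Hypothetical sketch of sensor-driven context rules, in the spirit of
# apps like Agent. Thresholds and signal names are invented for
# illustration; they are not Agent's real logic.

def choose_actions(battery_pct, hour, is_moving_fast, unread_texts):
    """Map raw context signals to a list of actions to take."""
    actions = []
    if battery_pct < 20:
        # Low battery -> start preserving power
        actions.append("enable_power_saving")
    if hour >= 23 or hour < 7:
        # Late night -> user is probably asleep, so silence the phone
        actions.append("silence_ringer")
    if is_moving_fast and unread_texts:
        # Sustained high speed -> probably driving, so read texts aloud
        actions.append("read_texts_aloud")
    return actions
```

A phone at 2am with 15% battery, moving at motorway speed with a waiting text, would trigger all three rules; the same phone on a desk at midday would trigger none.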
Why are contextually aware apps like Agent now possible? "It's a combination of Moore's Law, which is shrinking the size of computing devices, along with the mass-market penetration of mobile devices, which means that it is now feasible to attempt to learn context," says Kulveer Taggar, CEO at Agent.
"We have abundant computing power, along with lots of sensors, which means that smartphones can look at many inputs and start learning what we are doing, where we are and who we're with."
Taggar thinks that next-generation smartphone apps will soon learn how you live your life and perform highly personalised services, such as alerting you when someone really important to you is suddenly nearby.
"Expect something crazy," says the Silicon Valley-based Taggar of the next generation of apps that will use contextual computing. "The phone will know if you're at a party based on the ambient noise it can hear, and will vibrate rather than ring in your pocket if there's an emergency."
Agent isn't the only app that is context-aware; Cover Lock Screen lets you customise what apps you see on your smartphone's screen, and when.
It's just been bought by Twitter to make mobile micro-messaging more context-aware. An app like TripIt, which searches your e-mails and collects and curates anything it finds about travel - tickets, hotel bookings, boarding passes - is impressive, but merely the first step in creating convenience from context. Next up will be apps that show you your electronic boarding pass when you check your phone at the airport. Ditto for hotel reservations.
All of these apps are slowly being built into smartphones themselves, but contextual computing at its zenith is about far more than merely gathering data from existing sensors.
"Adaptation is an essential element of a context-aware system as the applications need to adapt to their surroundings and to the users," says computer science expert Kevin Curran, a senior member at the New York-based Institute of Electrical and Electronics Engineers.
He defines context as being based on the user's location, environment and orientation, as well as the emotional state, focus of attention, date and time, and people and objects in the user's environment. That's a lot of variables, although it's clear that context-aware devices will go deeper than apps like Agent and Cover Lock Screen do now.
"Contextual computing is different from the simple sensor-based applications seen on smartphones today," says Curran. "Instead of you having to go and search for hotels, the device would already know what kind of hotel you are looking for by using the information gathered on hotels you have picked in the past and suggest hotels nearby based on those preferences," he says.
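Curran's hotel example boils down to learning a preference profile from past choices and ranking nearby candidates against it. A toy sketch of that approach - the feature names, hotels and scoring scheme are all invented for illustration:

```python
from collections import Counter

# Toy sketch of preference-based suggestion, as in Curran's hotel
# example. The features and scoring are illustrative assumptions.

def learn_profile(past_hotels):
    """Count how often each feature appears in hotels the user picked."""
    profile = Counter()
    for hotel in past_hotels:
        profile.update(hotel["features"])
    return profile

def suggest(nearby_hotels, profile):
    """Rank nearby hotels by how well they match the learned profile."""
    return sorted(nearby_hotels,
                  key=lambda h: sum(profile[f] for f in h["features"]),
                  reverse=True)
```

If the user has repeatedly chosen quiet boutique hotels, a nearby hotel sharing those features rises to the top without the user ever searching for one.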
Convenience is at the core of all context-aware concepts, but a user's location and previous preferences are a good place to start if you want to second-guess their intentions.
A device that knows where you are and remembers what you like could track your activity to give you tips on health and fitness (handy for lifestyle-dependent diseases such as diabetes or heart disease), or find you a restaurant nearby that serves your favourite food.
Nor does it have to be a smartphone; the TV remote could identify the person holding it and automatically switch the channel to their favourite television show.
But context is about much more than two variables. "Context relates to both human factors and physical environment factors," says Curran, who thinks it's about defining the user - their habits, interests, current emotional state, the tasks they're engaged in, general goals and spontaneous activity.
Add information about who's nearby, some group dynamics, social interactions and the location of others in a room and you've got quite a calculation going on. The physical factors - such as measurements of the user's absolute position, their relative position to others, light levels, air pressure and ambient noise - are the easy bit.
Our smartphones aren't suddenly going to become kings of context overnight, but they're headed that way. The Siri and Google Now personal assistants in Apple and Android devices, respectively, are about accessing context in a hands-free way. Google Now already anticipates follow-up questions.
Google has deeper plans in context-awareness; its Moto X handset, launched last year, has Google Now more closely integrated into the core of the device, but it also includes a voice-recognition chipset built into the processor that so far has no use.
That's a big clue that experimentation is going on behind closed doors. Google also has its Project Tango, a prototype phone that can map your surroundings in 3-D.
Microsoft and Apple aren't far behind. The former has a new Bing-powered, voice-activated personal assistant called Cortana (named after the artificially intelligent character in the Halo video games) on its forthcoming Windows Phone 8.1 operating system.
As well as promising natural conversation, Cortana is designed to remember what you've asked about in the past and make suggestions.
But to get some real context, Cortana will ask you to input your personal interests, then cross-reference every subsequent request with your calendar, the content of your e-mails, and your search habits on Bing.com.
The next step is activity recognition, although that will depend on more than mere smartphone development.
"A context-aware phone may know that it's in the meeting room, and that the user has sat down, and therefore reject any unimportant calls," says Curran. "The goal is to enable computers to have similar capabilities as humans for recognising people's activities."
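Curran's meeting-room example combines several signals - location, posture, caller importance - into a single decision. A hedged sketch, with signal names that are assumptions rather than any real phone API:

```python
# Hypothetical activity-recognition rule, after Curran's meeting-room
# example. Location and posture signals are assumed inputs, not a
# real smartphone API.

def should_reject_call(location, user_seated, caller_priority):
    """Reject unimportant calls when context suggests a meeting."""
    in_meeting = (location == "meeting_room") and user_seated
    return in_meeting and caller_priority != "important"
```

The point is that no single sensor decides anything: only the combination of signals (in the meeting room, and seated, and an unimportant caller) triggers the action.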
The first wave of contextual computing is likely to be about relaxing and switching off, and letting smartphones take the strain. "Phones will save you brain-cycles throughout the day by automating repetitive tasks, and become really smart at not disturbing you when you're busy," says Taggar.
Like all great technology, it gets out of the way.