Elliptic Labs’ technology recognises your hand gestures and uses them to move objects on a screen, similarly to the way bats use echolocation to navigate. Photo: Elliptic Labs

And now for the auto-selfie: how ultrasound is the new frontier for phones, and more

With merely a hand gesture, a new ultrasonic application is set to change how we control electronics, from speakers, VR headsets and in-car systems to smartphone voice assistants

It is said that sound is half the experience when watching a movie, but that could also soon apply to using a smartphone.

A new application of ultrasound technology – which Xiaomi adopted last year for its Mi MIX handset – enabled the company to build a 6.4-inch Android phone with almost no bezel on its sides, top or bottom. The tech looks set to change how electronics are conceived, designed and used; not only phones but speakers, VR headsets and in-car systems could all benefit.

“The trick to making a phone bezel-less is to remove the speaker, move the camera down, and replace the proximity sensor in the top of the device,” says Guenael Strutt, vice-president of product development at Elliptic Labs.

Guenael Strutt, vice-president of product development at Elliptic Labs.
That last component – in the phone essentially to allow the battery-hungry screen to automatically switch off when you bring it up to your ear to make a call – is where Elliptic Labs comes in. Its ultrasonic innovation – nicknamed “Inner Beauty” – is what allows the Mi MIX to have that iconic (almost) edge-to-edge display.

“In the Mi MIX we emit ultrasonic waves, which radiate from behind the screen and echo from the face of the user and into a microphone,” he says. “The data is then processed by the phone.”

The microphone is already there; it’s the one on every phone that’s used for noise cancellation, essentially to remove background noise during a phone call.
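
In principle, that echo path is a simple time-of-flight measurement: the phone times how long a pulse takes to reach the user's face and bounce back, then converts the delay into a distance using the speed of sound. The sketch below is purely illustrative – the correlation method, pulse shape and sample rate are assumptions for the sake of example, not Elliptic Labs' implementation:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second in air at room temperature

def estimate_distance(emitted: np.ndarray, received: np.ndarray,
                      sample_rate: float) -> float:
    """Estimate the distance to a reflector (e.g. the user's face) from
    one emitted ultrasonic pulse and the echo the microphone picks up."""
    # Cross-correlate the echo against the emitted pulse to find the
    # round-trip delay in samples.
    correlation = np.correlate(received, emitted, mode="full")
    lag = np.argmax(correlation) - (len(emitted) - 1)
    round_trip_time = lag / sample_rate
    # The pulse travels out and back, so halve the path length.
    return SPEED_OF_SOUND * round_trip_time / 2.0
```

When that distance falls below a few centimetres, the phone can behave exactly as the old infrared proximity sensor did and switch the screen off for the call.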

That Inner Beauty uses hardware and processing power already in phones – and actually cuts down on components – is why it’s becoming a hit with manufacturers. Strutt confirms that we should expect to see more bezel-less phones in 2017 that use Elliptic Labs’ technology.

The future may be edge-to-edge, but beyond mere aesthetics, ultrasound technology is capable of changing how we control electronics.

“Once you’ve made the switch from a narrow infrared beam wave to something that radiates sound all the way around the device, it opens up a whole new world of gestures,” says Strutt.

Initial iterations, like the Xiaomi Mi MIX, only have one speaker and one microphone, but even with such basic hardware some clever innovations are possible.

The Xiaomi Mi MIX.
For example, how many times have you had a notification on your phone, but not been able to read it because the display has gone dark too quickly? Ultrasound could be used to detect your hand near the phone, and keep the screen switched on.

“Another gesture we can easily detect with ultrasound is the taking of a selfie,” says Strutt, explaining that extending an arm out to take a selfie is an easy gesture for ultrasound to recognise. So why not let the phone launch the camera app and automatically press the shutter button?
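
Both behaviours could, in principle, fall out of the same stream of distance readings. A toy sketch of how they might be distinguished – the window logic and every threshold here are illustrative guesses, not Elliptic Labs' values:

```python
def classify_gesture(distances_m: list[float]) -> str:
    """Toy classifier over a short window of echo-distance readings."""
    mean = sum(distances_m) / len(distances_m)
    spread = max(distances_m) - min(distances_m)
    if mean < 0.10:
        # A hand hovering just above the screen: keep the display awake.
        return "keep_screen_on"
    if 0.4 < mean < 0.8 and spread < 0.05:
        # A steady reading at roughly arm's length: likely a selfie pose,
        # so launch the camera and fire the shutter.
        return "auto_selfie"
    return "none"
```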

With more hardware, it’s possible to detect far subtler movements.

A UK-based company called Ultrahaptics is doing just that with a more sophisticated ultrasound system that creates what it calls “mid-air touch” using ultrasonic air pressure.

“If you bring back touch, it’s much more like operating a real user interface, but it’s flexible and invisible,” says Tom Carter of Ultrahaptics, who believes the technology will change how we interact with electronics.

This is about not just making gestures but feeling them – essentially touching things that aren’t really there.

“Ultrasound is like normal sound, but at a much higher frequency, so you can’t actually hear it,” he says, explaining that where sound waves overlap they create high pressure points.

“Sound waves are just pressure waves moving in the air, and they can generate enough force to gently push on skin, so your hand can feel something,” he says. “We can sculpt the acoustic field using sound waves to create invisible 3D shapes you can feel.”

Initially, the technology is being used for hands-free control in cars. Using an array of speakers and microphones it’s possible to make sound waves overlap in specific places, and so create the feel of a volume dial, a lever, and even a texture.
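
The trick behind those overlapping waves is phased-array focusing: each speaker is fired with just enough delay that all the wavefronts arrive at the chosen point in phase, stacking into a spot of pressure strong enough to feel. A back-of-the-envelope sketch of that delay arithmetic – the grid geometry and numbers are assumptions for illustration, not Ultrahaptics' design:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second

def focusing_delays(emitters: np.ndarray, focal_point: np.ndarray) -> np.ndarray:
    """Per-emitter firing delays (seconds) so that every wavefront
    arrives at the focal point at the same instant.

    emitters: (N, 3) array of transducer positions, in metres.
    focal_point: (3,) point where the 'touch' should be felt.
    """
    distances = np.linalg.norm(emitters - focal_point, axis=1)
    # The farthest transducer fires first; nearer ones wait, so all
    # waves coincide in phase at the target.
    return (distances.max() - distances) / SPEED_OF_SOUND

# Example: a 4x4 grid of emitters, 6cm across, focusing 20cm above its centre.
xs, ys = np.meshgrid(np.linspace(-0.03, 0.03, 4), np.linspace(-0.03, 0.03, 4))
grid = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])
delays = focusing_delays(grid, np.array([0.0, 0.0, 0.20]))
```

Sweeping the focal point along a path is what turns a single pressure spot into the feel of a dial, a lever or a texture.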


Tom Carter, of Ultrahaptics.

A prototype developed for Bosch, shown off earlier this year, allowed the driver to turn a volume dial on an in-car infotainment system.

“We can even create a click sensation, just like the click of a button, so you know you’ve successfully switched something on or off,” says Carter.

“As you drive, you can hold your hand out and get a projection of the volume dial of the music system on your hand, without ever taking your eyes from the road.”

Ultrahaptics’ technology is integrated into a concept car.
The dial even locates the driver’s hand when it is held in the general area of the music system – thanks to proximity sensors – and then moves subtly to stay ‘stuck’ to that hand until it is pulled away.
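
One plausible way to achieve that stickiness is to nudge the haptic focal point a little way towards the latest tracked hand position on every frame, which also smooths out sensor jitter. The smoothing factor below is an illustrative assumption, not a product value:

```python
def follow_hand(focal: list[float], hand: list[float],
                smoothing: float = 0.2) -> list[float]:
    """Move the haptic focal point a fraction of the way towards the
    tracked hand, so the projected dial stays 'stuck' to the palm
    without jumping on every noisy sensor reading."""
    return [f + smoothing * (h - f) for f, h in zip(focal, hand)]
```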

The tech from Ultrahaptics and Elliptic Labs is almost identical, relying on the same physics and even operating in the same frequency range.

However, the scope is different. Ultrahaptics’ method uses hundreds of transmitters and receivers to create objects in 3D. Elliptic Labs uses only one transmitter and one receiver, so it can detect only the distance to a single object – typically the hand of the phone’s owner.

The future is probably somewhere between the two methods. “In future, tablets and phones will have more microphones and speakers to allow people to speak to them from longer distances,” says Strutt, who confirms that Elliptic Labs is now working with the smart voice or digital assistant industry.

“They already have the hardware we need,” he says of the Amazon Echo-style devices.

Phones and cars may be the initial target of the tech, but both proponents are convinced that ultrasound will become a key technology for voice assistants like Alexa, Siri and Google Home.

“Voice will be one of the main interaction methods of the future, but for now it doesn’t make it easy to choose between things,” says Carter.

Nor is it appropriate to loudly ask Alexa to turn the oven off, dim the lights or adjust the air con at, say, a dinner party, or late at night.

The sound of the future we can’t hear.
Cue gesture-driven “haptic feedback interfaces” for home appliances that use ultrasound and other futuristic tech. “I can imagine a future where a 3D voice assistant can pop up a holographic menu system above the speaker, and you can reach out and touch it for finer interactions,” he says.

Ultrasound-powered proximity sensors could also be used as protection against mistakes made by voice assistants. They may be smart, but if they mishear what you say, they can carry out actions you never intended.

“Instead of shouting at your voice assistant about how it’s misunderstood what you just said, you could make a gesture at it, before repeating the command,” says Strutt, who adds that a similar system could be used to prevent a TV from accidentally triggering a voice assistant.

“Maybe you leave your TV on when you go out, or pranksters shout through a window, and when you get home the voice assistant has put the air con on freezing cold and switched on all of the lights,” says Strutt, who wants voice assistants to listen only when their owner is nearby.

“We could use ultrasound-powered proximity sensors to stop your voice assistant from wreaking havoc when you’re not at home.”
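
The safeguard Strutt describes boils down to a single gate: only let audio through to the assistant’s wake-word detector when the ultrasonic reading says a person is actually nearby. A minimal sketch, with an assumed presence range rather than any vendor-specified figure:

```python
def should_listen(proximity_m: float | None,
                  presence_range_m: float = 2.0) -> bool:
    """Gate the wake-word detector on ultrasonic presence.

    proximity_m: latest echo distance in metres, or None when no
    reflector (i.e. no person) was detected.
    """
    return proximity_m is not None and proximity_m <= presence_range_m
```

A TV left playing to an empty room, or a shout through the window, would then simply be ignored.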

This article appeared in the South China Morning Post print edition as: waves of the future