MANAGEMENT

Artificial intelligence

How to plan ahead for the inevitable voice-activated future

For example, before bringing your voice-controlled service to consumers, try it out with your employees and develop a staff enhancement version

PUBLISHED : Friday, 28 July, 2017, 4:48pm
UPDATED : Friday, 28 July, 2017, 10:30pm

More and more companies are turning to voice-controlled services as a way to “humanise” their automated offerings.

But as we all know, early forays into this technology have not always led to happy customers.

When the device does not understand customers’ accents or fails to meet expectations of simulating a conversation, people are left desperate to reach a human being, and often tell tales of wanting to throw their computer or phone across the room. That is never good for brand loyalty.

Companies need to rethink their mission – it is not about replacing human conversations, but simplifying interactions with products and services.

Rather than imitating human conversations, they should be focused on relevant scenarios that make people’s lives easier and improve human-to-machine interactions.

Our research studies and experience designing voice services at Fjord have uncovered a key stumbling block: the way people speak to voice-enabled devices is part of the problem.

The truth is, we do not speak to them in the same way they speak back to us. A critical factor in designing for voice is understanding more about the intent of the person interacting and the limitations of the device itself.

Adhering to a few key principles will help companies improve human-to-machine interaction with voice-controlled devices:

Ask the right questions:

Who is the right person (or voice) for your audience? Is it male or female? In Hong Kong, Putonghua, Cantonese and English all need to be available – but can the system understand accents in each of these languages?

Which questions can the device answer and how do we guide users to those questions?

Given the desire to create two-way dialogue, how do you design for both the input and output of the device? How does the device know when to listen and when to speak? Are there certain spoken commands that will signal this shift, or can it be conveyed through things like timing and tone?

Consider what voice your audience might expect to hear. An unexplored area in voice design is using familiar expert voices of real people answering questions within their area of expertise.


To help make sense of the current and future state of voice interfaces and navigate what is destined to be an increasingly “Open Sesame!” age, we’ve compiled some of the most important things we’ve learned about voice.

The whole point of using AI-voice systems in a business setting is to streamline processes – to make customers’ lives easier and employees more efficient. That means designing for a human hand-off – the voice service needs to connect easily back to a member of staff in your organisation. It also has to make it easier for the customer to get a fast solution and/or reach a real person.

All of this means you need to practise, practise, practise before unleashing it on the outside world.

Before bringing your voice-controlled service to consumers, try it out with your employees and develop a staff enhancement version.

This will give you an opportunity to see how the service assists your employees in their day-to-day activities and pinpoint any miscommunication that needs to be addressed.

By trying it out internally, it also demonstrates to employees which kind of questions the service can answer, which helps establish more realistic expectations.

This can also be fun for your staff. Let them ask the silly questions they get asked. Let them come up with the wildest and wackiest questions they can imagine.

Let them then devise the answers. Let them blow off some steam – but also ensure they feel invested in the system that is designed to help them.

Voice-activated AI systems are the wave of the future, but we need to think now about how people want to experience them.

Inaki Amate is managing director of Fjord Hong Kong; John Jones is design strategy lead at Fjord, the design and innovation arm of Accenture Interactive
