The rise of voice in chatbots
If you regularly interact with brands, companies or messenger apps – whether as a customer or a developer – then you’ll have encountered a chatbot. Over the past few years, they’ve become an increasingly sophisticated means of customer service and interaction, whether you’re booking tickets, shopping for clothes or trying to make a complaint.

Chatting to a bot is still more limited than talking to a human, but the integration of voice into chatbots is set to be the next great leap forward. Are we on the verge of a revolution in how we interact and get information online and, if so, what should developers be considering in this brave new world? How do we make chatbots genuinely ‘chatty’? Let’s have a look.

A brief history of chatbots
In one form or another, chatbots have been around for decades. Early efforts to get computers to mimic human conversation date back to ELIZA in 1966. While later programs got better at imitating the ebb and flow of real conversation, you could still see the joins, and the limitations were obvious.

Automated phone systems were the next step, although you may have been frustrated by calling your bank and having it fail to understand you spelling out a name or card number. Now we live in the age of Alexa, Siri and Google Now, and bots are integrated into many of the websites and apps we use. When they’re done well, we can barely tell we’re not talking to a real person.

Chatbot development
People aren’t developing chatbots just because they can, but for a spectrum of reasons. For starters, brand marketers have realised you don’t want to be spammed with emails, or to download their app just because you occasionally shop with them. A good chatbot will memorise your buying preferences, making repeat purchases easier, and recommend similar items in a simple chat interface – no bulky app, no endless browsing. The result is a more personalised shopping service with far less friction.
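To make that idea concrete, here’s a minimal Python sketch of the ‘remembers your preferences’ behaviour: a per-user profile built from past orders, used to answer a single reorder request. The order history, user ID and chat command are invented for illustration; a real bot would sit behind a messaging platform and a proper product database rather than an in-memory dictionary.

```python
from collections import Counter

# Illustrative only: a tiny in-memory order history keyed by user ID.
ORDER_HISTORY = {
    "user_42": ["black jeans (32)", "white t-shirt (M)", "white t-shirt (M)"],
}

def build_profile(user_id: str) -> Counter:
    """Count past purchases so the bot knows what this user usually buys."""
    return Counter(ORDER_HISTORY.get(user_id, []))

def handle_message(user_id: str, message: str) -> str:
    """Answer a single canned chat command using the stored profile."""
    profile = build_profile(user_id)
    if "reorder" in message.lower():
        if not profile:
            return "I can't see any previous orders for you yet."
        favourite, count = profile.most_common(1)[0]
        return f"Shall I reorder your usual {favourite}? You've bought it {count} times."
    return "Sorry, I only understand 'reorder' in this sketch."

print(handle_message("user_42", "Can you reorder my usual?"))
# -> Shall I reorder your usual white t-shirt (M)? You've bought it 2 times.
```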

Of course, it’s also cheaper for brands to develop a chatbot than to have dozens of people hovering around the online shopfront waiting for you to log on. In effect, they’re 24/7 shop assistants. At their best, they’re a seamless part of shopping or researching, remembering what you like, your size and your payment method. At their worst, they’re like trying to type commands into a 1980s ZX Spectrum text adventure that doesn’t understand what you’re saying and won’t let you move forwards or backwards.

That mixture of personalisation and ‘always on’ functionality means you can also get the service you want out of hours, rendering time zones unimportant and physical barriers obsolete. This allows companies and developers to potentially reach a much wider audience, made even easier when the chatbot is ‘clever’ enough to be aware of cultural and geographical differences.

Overcoming the obstacles
So, chatbots are everywhere, but their text-based, scripted nature comes with built-in limitations. And as we get used to speaking to our devices – whether it’s Siri on our smartphone or Alexa from the armchair – we’ll come to expect the same functionality from brands. Throw Artificial Intelligence into the mix and there’s the potential for voice-centric search to really move things forward.

There are different routes for developers to consider, spanning both typed and voice interfaces, and Natural Language Processing – or NLP – is one of the major divides between them. Talking to a voice chatbot should mimic talking to a human as closely as possible to make for a frictionless experience. We’re much more likely to use natural language when speaking than when typing to a messenger chatbot, so a voice bot must be tuned to decipher different accents, volumes and idiosyncrasies of speech, while a typed bot shouldn’t be sent into a tailspin by a common typo or abbreviation.
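To make the typed side of that concrete, here’s a minimal Python sketch of typo- and abbreviation-tolerant intent matching. The intents, example phrasings and abbreviation table are invented for illustration, and a production bot would lean on a proper NLP library or hosted service rather than simple fuzzy string matching.

```python
import difflib
from typing import Optional

# Hypothetical intents and example phrasings -- invented for illustration.
INTENTS = {
    "track_order": ["where is my order", "track my parcel", "order status"],
    "returns": ["i want to return an item", "how do i send this back"],
    "opening_hours": ["when are you open", "what are your opening hours"],
}

# A few abbreviations a typed message might contain.
ABBREVIATIONS = {"pls": "please", "thx": "thanks", "u": "you", "wheres": "where is"}

def normalise(text: str) -> str:
    """Lower-case the message and expand known abbreviations."""
    words = text.lower().split()
    return " ".join(ABBREVIATIONS.get(word, word) for word in words)

def match_intent(text: str, cutoff: float = 0.6) -> Optional[str]:
    """Fuzzy-match a message to the closest known phrasing.

    difflib's similarity ratio tolerates small typos, so 'wheres my ordre'
    still resolves to the track_order intent rather than derailing the chat.
    """
    cleaned = normalise(text)
    best_intent, best_score = None, 0.0
    for intent, phrases in INTENTS.items():
        for phrase in phrases:
            score = difflib.SequenceMatcher(None, cleaned, phrase).ratio()
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent if best_score >= cutoff else None

print(match_intent("wheres my ordre pls"))  # -> track_order
```

The voice route would sit in front of something like this: a speech-to-text step produces the text, and the same matching then applies, which is where handling accents, volume and the idiosyncrasies of speech comes in.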

When deciding which way to go in developing a chatbot, you should be conscious not just of the remaining limitations, but also of what the user wants. Are they likely to have a voice-capable device already, or will they need to invest in more tech to speak to your chatbot – a barrier to use? Then again, what if the user has a disability, or if the chatbot will need to be used while the customer is driving, cooking or juggling? Basically, as with all good marketing, you should think about the end user and when they might want to use your chatbot. By adapting to their preferences – the device they’re on, chat or voice, usability, and how they speak or type – you’ll provide something of far more value to them.

At present, text-based chatbots are cheaper and quicker to develop, and their user base is larger. However, as people grow more used to voice interaction with their devices, and advances in AI make it easier and more rewarding, you should keep an eye on opportunities in that area. Ultimately, if you want to speak to your customers, think about how they want to speak to you. For more on the future of chatbots, keep your eyes on our blog for updates.