
Why doesn’t my voice assistant understand my accent?

Ever had a conversation with a voice assistant and found it couldn’t understand your accent? If so, you’re not alone. In the UK, for instance, people with regional dialects find it hard to talk to voice assistants like Siri and Alexa. According to one study, 79% of people with regional accents alter their voice to make sure their digital assistants understand them.

Voice assistants have risen in popularity in recent years, with two in five adults now using voice search daily. Users in other countries struggle to hold a conversation with voice technology too, which suggests the way voice devices comprehend different languages and dialects needs to improve.

From a B2B marketing and sales perspective, language comprehension is important for targeting customers around the world. For example, if a brand is going to optimise its products for voice search and target consumers in China, it needs to account for how local voice assistants will hear the dialect.

Training machines to understand

Artificial intelligence software is trained to recognise speech using audio samples. Engineers collect hundreds of voice recordings covering a variety of subjects. These clips are then transcribed, and machines learn to recognise the patterns that link sounds to words. For voice technology to improve, researchers need to introduce audio samples that deviate from standard English dialects.
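To make that idea concrete, here is a minimal, purely illustrative Python sketch. The file names, accent labels and the simple classifier are assumptions for demonstration, not how Siri or Alexa are actually built. It shows the principle described above: each clip is paired with its label, acoustic features are extracted, and a model learns the mapping, so a pronunciation is only recognised if examples of it appear in the training data.

```python
# Toy sketch: learn sound-to-word patterns from accent-diverse clips.
# Assumes a folder of short WAV files, each labelled with the word spoken
# and the speaker's accent group (file names and paths are hypothetical).

import numpy as np
import librosa
from sklearn.neighbors import KNeighborsClassifier

# (path, word, accent) triples -- the accent column is what makes the
# training set cover more than one way of saying each word.
samples = [
    ("clips/play_rp.wav",       "play", "received_pronunciation"),
    ("clips/play_scottish.wav", "play", "scottish"),
    ("clips/stop_rp.wav",       "stop", "received_pronunciation"),
    ("clips/stop_scottish.wav", "stop", "scottish"),
]

def clip_features(path, sr=16000, n_mfcc=13):
    """Load a clip and summarise it as a fixed-length acoustic feature vector."""
    audio, rate = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=rate, n_mfcc=n_mfcc)
    # Average over time so every clip yields the same number of features.
    return mfcc.mean(axis=1)

X = np.array([clip_features(path) for path, _, _ in samples])
y = [word for _, word, _ in samples]

# A deliberately simple model: real assistants use deep networks trained on
# huge transcribed corpora, but the principle is the same -- the system can
# only recognise pronunciations it has seen examples of.
model = KNeighborsClassifier(n_neighbors=1).fit(X, y)

print(model.predict([clip_features("clips/play_geordie.wav")]))
```

The sketch also illustrates the accent gap itself: if the training list only contained received-pronunciation clips, the Scottish or Geordie recordings would be matched to whatever they happen to sound closest to, which is exactly the behaviour users with regional accents complain about.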

Some companies are looking into ways to enhance speech recognition. UK-based Trint, an automated speech-to-text platform, offers quick transcription in over a dozen languages, including all English variations, Polish, Swedish, French and Dutch. Baidu, the Chinese search engine company, is developing a ‘deep speech’ algorithm designed to recognise different accents, which could be adopted by voice technology in the future.

Bridging the accent gap

Despite issues with speech recognition, voice assistants like Siri have expanded to include other accents. Siri offers Australian, British, Irish, South African and American dialects, while Google Assistant launched an Australian option in 2018. Alexa also recognises languages such as English, German, French and Japanese.

However, there is still a long way to go. Voice assistants misinterpret all kinds of accents. There’s a well-known video of a woman with a thick Scottish brogue asking Alexa to play a song, only to be repeatedly ignored. For voice devices to improve their language capabilities, they need to be trained on larger data sets adapted to different regions.

Future interactions between humans and voice assistants are likely to morph into everyday conversation. As voice recognition software improves, accents should become less of an issue. After all, we already change the way we talk to suit certain situations, such as enunciating during an interview or speaking slowly over the phone.

Rather than feeling put off when voice assistants fail to understand our accents, we should embrace the challenge of teaching them, so that the barriers of miscommunication are broken down. Doing so will only help a technology that continues to go from strength to strength.

At BDB, we are dedicated to keeping businesses up to date on the latest B2B marketing trends. Be sure to download the BDB guide to voice search and discover more about this rapidly growing technology.
