Say What? Amazon Alexa And Google Assistant Have Trouble With Non-American Accents

New research into the effectiveness and accuracy of Amazon's Echo and Google's Home smart speakers, powered by Alexa and Google Assistant, respectively, shows that the digital assistants have difficulty understanding and responding to users with accents. That includes people with regional dialects, and especially people who speak English as a second language.

The disparity was brought up by The Washington Post, which shared an anecdote involving a Vancouver resident who was raised in Colombia and has a light Spanish accent. When she would ask Alexa to perform menial tasks, the digital assistant would trip over itself. Sometimes it would even do the opposite of what was asked, like turning the music up instead of turning it off.

The publication didn't rely solely on anecdotal accounts; it also teamed up with two research groups to study the accent imbalance across both voice-controlled ecosystems. The tests consisted of "thousands of voice commands dictated by more than 100 people across 20 cities." That was sufficient to uncover "notable disparities" in how the digital assistants understand people living in different parts of the US.

People with regional dialects experienced some trouble with Alexa and Google Assistant, though not to a huge degree. Users with Southern accents, for example, were 3 percent less likely to receive accurate responses from Google Assistant than those with Western accents. Alexa, meanwhile, was 2 percent less likely to understand commands from someone with a Midwestern accent.

The disparity was much bigger when the digital assistants listened to people with non-native accents. In one of the studies, Alexa recorded 30 percent more inaccuracies from a test group with heavier accents. And overall, people with Spanish accents were understood 6 percent less often than people raised in California or Washington, where Google and Amazon are headquartered.

"These systems are going to work best for white, highly educated, upper-middle-class Americans, probably from the West Coast, because that’s the group that’s had access to the technology from the very beginning," Rachael Tatman, a data scientist who has studied speech recognition and was not involved in the research, told The Washington Post.

The struggle with accents is not for lack of effort. Officials from both Amazon and Google acknowledged that understanding accents is one of the key challenges, and said they are pouring resources into improving it.

"The more we hear voices that follow certain speech patterns or have certain accents, the easier we find it to understand them. For Alexa, this is no different," Amazon said in a statement. "As more people speak to Alexa, and with various accents, Alexa’s understanding will improve."

This is something we expect to see improve sooner rather than later. Around 100 million smart speakers are expected to be in circulation by the end of the year, with digital assistants that speak dozens of languages.