Why voicebots aren’t ready for normal people

The tech elite love voicebots. Google Home is a game-changer, a way to search by voice for recipes when you’re in the kitchen. Amazon Echo and the Alexa voicebot help you order products and check the weather. I’m eager to see how Apple will make the HomePod a must-own product.

And yet there is a long way to go.

Over the past few days, I let several "everyday users" try all of the voicebots in my office. These are folks who do not care that much about technology. They use the same small subset of apps on their phones — pretty much Facebook, Snapchat, and Instagram. They don't read tech news and don't care. If there comes a day when a car drives itself, they will barely notice.

It’s an interesting experience listening to non-techies interact with voicebots.


One of my biggest discoveries is that they don't seem to realize the bot cannot hear them from across the room. Most of us know you have to be in the same room, or at least close by. I keep a Google Home, an Amazon Echo, and my iPhone 7 Plus on my desk at all times. From the hallway or the kitchen in my house, I know the bots (and Siri on my phone) can't really hear me.

But normal users don’t think about that. They think a “talking robot” can hear you just as easily as a human being from the next room, especially if you talk louder. Interference from ambient noise, blockage from a wall — whatever it is, these users don’t care. Bot makers might need to figure this out soon, or I need to buy a few more of these devices to make sure they’re listening in every room.

As you might guess, new voicebot users also tend to experiment. A lot. They ask about trivia, they ask about celebrities. While playing a game called Hot Dice, everyone in the group (none of whom had ever used a voicebot) kept asking about the game rules. Google Home and the Assistant didn't have a clue. Alexa shrugged. Voicebots don't have an endless supply of knowledge; they didn't even know Beyoncé had twins. In fact, most voicebots' knowledge base is really a subset of Wikipedia, and you have to ask the right question. For now, a Google search works better.

Music playback is also interesting. I have Google Music enabled on the Google Home, and the Amazon Echo uses Prime Music. For some reason, this group expected the bot to know their own musical tastes. "Play my favorite song by The Head and the Heart" didn't work, though the Assistant did play one of the band's hits. The users felt the bot should recognize their voices and know their Google accounts and music preferences. Well, not quite yet.

One user kept talking too fast. The basic structure of saying "OK, Google," then waiting a beat for the bot to flash its colored lights and start listening, wasn't intuitive to her. She spoke most phrases too quickly and didn't wait for the bot to start listening, which only made things more frustrating. And when there is initial frustration, a consumer will often just move on.

How should voicebots adapt? They will have to learn how people actually talk and what they really want, and they will have to provide more information. They need to stay current with the news. I'm hoping they can do some proactive teaching: a gentle "you're talking a little too fast" would help. The Assistant could have told the user to pause after saying "OK, Google," and it could have asked her for her Gmail login. In short, bots will have to become more human if they are going to win over the true non-techies in the world.