
Conversation design is the next big UX challenge for Capital One


Steph Hay, head of conversation design at Capital One, knows that her job — creating conversational interfaces for customers to access account information and complete financial tasks — is counterintuitive. Most people don’t like talking about money with other humans, let alone with a voice-enabled, anthropomorphized computer.

Last year, however, Capital One launched an Alexa skill designed to allow customers to do just that: talk to Amazon’s Alexa in order to track spending, check account balances, and pay bills. I sat down with Hay to discuss the design process behind the Alexa skill and how her team thinks about how people think about money.

More so than most apps, conversational financial products — banking products that users interact with through text or voice — must fight to earn customer trust. “The pure design challenge here is to find the natural language that enables people to trust that we’ve got their back. Because we do,” says Hay.

To achieve that level of trust, Hay’s team stays focused on a few core principles:


Be likeable, not sarcastic

“Money relates fundamentally to someone’s upbringing, to feelings of pain and emotion and shame. It can also bring joy and satisfaction and fulfillment,” observes Hay. “Who are we to mess around in a sarcastic way with that array of emotions? It’s important to us to become the trusted, likeable personality you would feel safe talking about your money with.”

To create that likeable personality, Hay’s team starts by drafting statements around a user intent, then immediately listens to how Alexa says them.

Alexa’s inflection, notes Hay, can force the team to “tweak our statements so she doesn’t come across as rude or too succinct.” For example, “when designing the welcome message — trying to decide how she should greet you — we settled on ‘Hi there.’ When we tested just the word ‘Hello,’ she sounded really judge-y.”
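
To make that concrete, here is a minimal sketch of what a welcome message like the one Hay describes might look like in an Alexa skill, using Amazon's ask-sdk-core for Node.js. The wording and structure are illustrative only, not Capital One's actual code.

```typescript
import * as Alexa from 'ask-sdk-core';

// Greets the customer when the skill launches. The speech string is the part
// a design team iterates on by listening to how Alexa actually says it aloud.
const LaunchRequestHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'LaunchRequest';
  },
  handle(handlerInput) {
    return handlerInput.responseBuilder
      .speak('Hi there. What would you like to do today?') // tested to sound friendly, not judge-y
      .reprompt('You can ask about your balance, recent charges, or pay a bill.')
      .getResponse();
  },
};

export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(LaunchRequestHandler)
  .lambda();
```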

Design at the atomic level

In conversational user interfaces (CUIs), there’s less emphasis on designing user flows as most of us understand them, largely because you can’t know in advance what those flows will be. Instead, Hay’s team “constructs the breadth of the potential conversation on an atomic level: Someone’s going to ask about this, and they might ask about this and this. Those potential responses just exist around each other, floating, ready to be served up if the customer asks for them.”

For CUI design, the natural flow of conversation — unique to everyone and learned organically — requires creating a constellation of minuscule building blocks rather than complete flows. “If we took an approach that mimicked standard information architecture — top-down hierarchy, this is the way action should flow logically — we would be wrong so often,” says Hay.
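
One way to picture this atomic approach, offered as a hypothetical sketch rather than a description of Capital One's system: each response is a self-contained unit keyed to an intent, and the "flow" is simply whatever order the customer asks in.

```typescript
// Hypothetical names for illustration only. Each "atom" is a self-contained
// response that can be served whenever its intent comes up, in any order.
type ResponseAtom = {
  intent: string;
  respond: () => string;
};

const atoms: ResponseAtom[] = [
  { intent: 'CheckBalance',       respond: () => 'Your current balance is 412 dollars and 16 cents.' },
  { intent: 'RecentTransactions', respond: () => 'Your most recent charge was 8 dollars at a coffee shop.' },
  { intent: 'PayBill',            respond: () => 'Sure, which card would you like to pay?' },
];

// No top-down hierarchy: the conversation is whatever sequence of intents the
// customer happens to voice, and each atom floats until it is asked for.
function serve(intent: string): string {
  const atom = atoms.find((a) => a.intent === intent);
  return atom ? atom.respond() : "I'm sorry, I don't know how to answer that question yet.";
}
```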

Answer customer questions first

It probably doesn’t sound that difficult to build a better experience than a sprawling voice menu — press 1 if you detest voice menus, press 2 if you simply dislike them, press 0 to wait on hold forever. But figuring out quickly what action a user wants to take poses a major challenge.

Hay describes that challenge this way: “We can know the universe of possibilities around something like ‘Pay Bill.’ That’s a user flow in a web app or mobile app or call center. But it gets complicated quickly on voice. When you say ‘bill’ to Alexa, do you mean credit card bill or a bill from your utility company to your checking account? If you’re trying to pay your credit card bill, do you have two cards?”

She adds, “We can’t even be in a conversation until we know what you want.” The goal is to design conversations that “answer customers’ questions first and ask clarifying questions second. We’re not building another robotic choose-your-own-adventure voice system that requires many inputs from you before we can give you an answer. Nobody wants to be in that kind of conversation.”
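
Here is a sketch of how that answer-first, clarify-second logic might look for a pay-bill request, again with hypothetical names and account shapes rather than Capital One's real data model.

```typescript
// Hypothetical account shape for illustration.
interface CardAccount {
  label: string;            // e.g. 'Quicksilver card'
  statementBalance: number; // in dollars
}

// Answer first, ask clarifying questions second: if there is only one card,
// give the answer immediately; ask only when the request is genuinely ambiguous.
function handlePayBill(cards: CardAccount[]): string {
  if (cards.length === 1) {
    const [card] = cards;
    return `Your ${card.label} has a statement balance of ${card.statementBalance.toFixed(2)} dollars. ` +
      'Would you like to pay that now?';
  }
  const labels = cards.map((c) => c.label).join(' or your ');
  return `Happy to help. Did you mean your ${labels}?`;
}
```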

Lead with humanity

“A massive differentiator for companies like ours that interact with millions of customers daily,” says Hay, “is the ability to react in a way that emphasizes their situation: Are they standing in line at the grocery store, are they attempting to maintain a budget for holiday shopping season, are they trying to pay off student loans? All of those life contexts affect how you’re feeling emotionally when you ask questions about money.”

But it’s still early days for conversation design. Acknowledging the learning curve for the humans behind the bots is nearly as crucial as accurately predicting a user’s situation. “If we say, ‘I’m really sorry, I don’t know how to answer that question, but the more you talk to me, the more I learn,’ people respond with empathy towards our product. They help us get it to where it needs to be. If we just say, ‘I’m sorry, I can’t help you with that’ and leave them because we’re not functional enough yet, we’ve missed the humanity.”
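
In skill code, that kind of graceful miss typically lands in a fallback handler. A sketch, using the same Node.js SDK as above, with wording in the spirit of Hay's example rather than Capital One's actual copy:

```typescript
import * as Alexa from 'ask-sdk-core';

// Catches requests the skill doesn't understand and responds the way Hay
// describes: admit the gap and invite the customer to keep teaching the product.
const FallbackIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
      && Alexa.getIntentName(handlerInput.requestEnvelope) === 'AMAZON.FallbackIntent';
  },
  handle(handlerInput) {
    return handlerInput.responseBuilder
      .speak("I'm really sorry, I don't know how to answer that question yet, "
        + 'but the more you talk to me, the more I learn.')
      .reprompt('Is there something else I can help you with?')
      .getResponse();
  },
};
```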

Above all, says Hay, to solve these problems and connect emotionally with users, “we have to lead with humanity.”

Capital One’s emphasis on humanity in conversation design inspired the upcoming Humanity.ai, a February 28 people-and-bots event in San Francisco sponsored by Capital One Design.

Jessica Collier is design partner at All Turtles, an AI startup studio, where she works on conversational products.