Chat app users can now play Exploding Kittens with Microsoft’s conversational bot Zo, the company announced today. Exploding Kittens is the first partner game launched through chat with Zo, a Microsoft spokesperson told VentureBeat.
Exploding Kittens with Zo is the first single-player option for the card game that came to life following an $8.7 million Kickstarter campaign in 2015. To win the game, all you have to do is avoid drawing an exploding kitten. Cards let you do things like skip your turn to draw a card, shuffle the pile, or defuse an exploding kitten with a catnip sandwich.
To start a game, say “Play EK” when chatting with Zo on Facebook Messenger, Kik, or GroupMe, and the game will open in a webview.
The game is the latest feature added for people who chat with Zo. Last month Zo gained the ability to insult you, though it’s not the first Microsoft bot that can do so.
Even before the feature launched, Zo was capable of insulting users. Zo, a bot that identifies as a 22-year-old woman, launched in December 2016. At launch the bot refused to discuss topics like politics or religion, but this summer Zo was caught calling the Qur’an “very violent” and calling the Microsoft Windows operating system “spyware.”
Zo is one of a series of bots from Microsoft made for the specific purpose of carrying on a conversation, including Rinna, which operates in Japan, and Xiaoice in China. Microsoft’s conversational bots now have more than 100 million users, according to a Microsoft AI blog post.
The Tay bot, also from Microsoft, was killed off in 2016. Less than 24 hours after launching on Kik and Twitter, Tay was shut down for spouting anti-feminist and anti-Semitic opinions. The offensive language arose because Microsoft allowed the bot’s language to be influenced by the people it interacted with. Analysis after the fact found that the people who shaped Tay’s words came from certain sometimes unpleasant groups, according to Lili Cheng, research manager of Fuse Labs at Microsoft.
“The kind of gamer trolls were at the core of a lot of the misbehavior, but then we had a lot of anti-feminists, there were a lot of Trump supporters, and then a lot of tech people,” she said.