
Petition asks Siri and Alexa to flip the script on sexual harassment



We’ve recently seen the massive impact of #MeToo, and the public takedown of numerous prominent sexual predators is fueling a positive change in how society views sexual misconduct. Victims and advocates alike feel a unique sense of empowerment to take a stand against the social norm of looking the other way or shifting blame when sexual assault or harassment occurs. This moment is exciting, but are major tech companies missing the boat?

A new petition from Care2.com is calling on Apple and Amazon to shut down sexual harassment of their virtual assistants. Though the issue spans all digital assistants, the petition focuses particularly on reprogramming Amazon’s Alexa and Apple’s Siri to provide more assertive responses to queries that could be considered sexual harassment.

Care2 CEO and founder Randy Paynter said in a statement, “In this #MeToo moment, where sexual harassment may finally be taken seriously in society, we have a unique opportunity to develop AI in a way that creates a kinder world. If we as a society want to move past a place where sexual harassment is permitted, it’s time for Apple and Amazon to reprogram their bots to push back against sexual harassment.”

The problematic responses cited on the official petition page come from questionable Siri queries. In the exchanges cited, Siri responds to her harassers with coy remarks that sometimes even express gratitude. When a user called Siri a “slut,” she responded with a simple “Now, now.” And when the same person told Siri, “You’re hot,” Siri responded with “I’m just well put together. Um… thanks. Is there something I can help you with?”


Sure, Siri is good at redirecting the conversation, but could her programmers help her do better? Petitioners sure think so. Instead of coy responses, they’re calling for digital assistants like Siri to shut down harassers with statements like “That’s not an okay thing to say to me.” Another suggestion was to offer responses that provide stats about harassment and its prevalence around the world.

A follow-up article from Quartz extended the investigation to additional digital assistant creators like Google and Microsoft. In its report, Google Home had what was potentially the most depressing response after the writer called the assistant a “bitch” — the device’s answer to the wildly inappropriate comment was “My apologies, I don’t understand.” Cortana had a slightly more assertive retort to the same insult, “Well, that’s not going to get us anywhere,” but that’s still far from a sufficient response to blatant harassment.

In our testing, Alexa answered most attacks with the dignified “I’m not going to respond to that,” although a request for sex elicited the more bemused “I’m not sure what outcome you expected.”

Clearly, the issue isn’t limited to Amazon and Apple. The premise of the petition is relevant for all bot creators. The notion that users are abusing their digital assistants certainly isn’t a new one, but in the particular case of sexual harassment, it seems necessary to consider how bots train users (especially children) to treat others and what the implications might be for society.

In defense of the creators of these popular digital assistants, it can be hard to predict how consumers will use a product before it’s released. However, now that the word is out, it’s probably time for innovators in the digital assistant space, like all of us, to do better in supporting the battle against sexual misconduct.

At the time of writing, the petition has 8,044 signatures toward its goal of 10,000.