Montreal-based Botler.ai today announced that it will launch a new service to help victims of sexual harassment determine whether they have been violated and whether what happened to them falls under the U.S. or Canadian criminal code.
The bot was created following harassment experienced by Botler.ai cofounder Ritika Dutt.
As people navigate the ramifications of the Harvey Weinstein abuse allegations and the #MeToo movement — with women sharing tales of sexual harassment and worse — a recurring theme is that what’s happening to celebrity actresses is also happening to everyday people across society.
Though Dutt second-guessed herself following the harassment and felt unsure at times whether what had happened to her constituted a crime, she said her doubts lifted when she checked the relevant legal code.
“In my case, it wasn’t just me making things up in my head. There was a legal basis for the things I was feeling, and I was justified in feeling uncomfortable,” she said.
According to the U.S. Equal Employment Opportunity Commission, 75 percent of sexual harassment incidents in the workplace go unreported, and the majority of individuals who report incidents experience retaliation of some kind.
“One of the most important things about sexual harassment is that the majority of cases go unreported,” Dutt said.
The bot uses natural language processing to determine whether an incident could be classified as sexual harassment or another form of sexual crime. The NLP was created using court documents from 300,000 court cases in Canada and the United States, Dutt said. It drew largely on testimonies from court filings, since testimonies align most closely with a conversational tone.
While many forms of AI focus on predictions or interpreting intent, Botler’s new conversational AI focuses on giving its users an understanding of whether what they experienced was a violation of U.S. criminal code or Canadian law. The goal isn’t to tell people whether they could win a case in court, but rather to empower women with confidence grounded in legal doctrine. If a user of the bot wants to report an incident to an appropriate authority, be it police or an HR person at work, the bot can generate an incident report.
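Botler has not published its model, but the kind of text classification described above can be illustrated with a minimal Naive Bayes sketch. Everything here is a hypothetical stand-in: the toy training sentences, the labels, and the tokenizer are assumptions for illustration, not the company's actual court-testimony corpus or method.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for court-testimony excerpts;
# labels mark whether a description resembles a harassment-related claim.
TRAIN = [
    ("he followed me after work and sent repeated unwanted messages", "harassment"),
    ("my manager made repeated sexual comments despite my objections", "harassment"),
    ("we disagreed about the project deadline in a meeting", "not_harassment"),
    ("the contract dispute concerned late payment for services", "not_harassment"),
]

def tokenize(text):
    # Crude whitespace tokenizer; a real system would normalize far more.
    return text.lower().split()

def train(samples):
    """Count words per label for a multinomial Naive Bayes model."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    for text, label in samples:
        label_counts[label] += 1
        word_counts[label].update(tokenize(text))
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Return the label with the highest log-posterior score."""
    vocab = {w for counts in word_counts.values() for w in counts}
    scores = {}
    for label, prior in label_counts.items():
        total = sum(word_counts[label].values())
        score = math.log(prior / sum(label_counts.values()))
        for word in tokenize(text):
            # Laplace smoothing so unseen words don't zero out a label.
            score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

word_counts, label_counts = train(TRAIN)
print(classify("he sent repeated unwanted sexual messages", word_counts, label_counts))
# → harassment
```

A production system like Botler's would of course use far richer NLP over hundreds of thousands of documents; this sketch only shows the basic shape of mapping a free-text account to a category.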
The sexual harassment detection bot has not been tested on a wide audience, and though law students participated in constructing the bot, no one who had passed the bar exam in either the U.S. or Canada helped compile Botler.ai’s new offering, Dutt told VentureBeat in an email.
This is Botler’s second creation, following a bot made last year to help people navigate the Canadian immigration system. Ideas shared during conversations with the bot will guide Botler’s next steps, Botler cofounder Amir Moravej said.
In its mission to deliver reasonably priced legal services through a bot, Botler.ai has a lot in common with startups like Visabot and DoNotPay, which offer easy-to-use services for filing for a green card and disputing parking tickets, respectively.
In July, DoNotPay expanded to provide more than 1,000 legal services to residents of the U.S. and U.K.
But Botler and DoNotPay are different, Moravej said, because Botler doesn’t hate lawyers.
“We don’t think necessarily that lawyers are the problem, we think that the system is complicated, and we think that even for the best lawyers, the system is still very complicated. We want to make it efficient and easy and comprehensible to anyone,” Moravej said.
Botler has six employees and is based in Montreal, Canada. Yoshua Bengio, director of the Montreal Institute for Learning Algorithms, acts as an advisor to the startup.