Botler.ai, the artificial intelligence startup that launched an immigration chatbot in the wake of Trump’s travel ban, has turned its attention to helping victims of sexual harassment in the wake of the Harvey Weinstein revelations.
Since the New York Times exposé that brought to light years of hidden sexual harassment by Mr. Weinstein, victims in nearly every sector and industry have gathered the courage to come forward and report their own experiences. The tech world was rocked by its own scandals, including the revelations about Dave McClure that led to the shuttering of 500 Startups Canada.
Today, Botler released a system that aims to empower victims of sexual harassment by identifying possible legal violations and improving their chances of successfully taking legal action against their abusers.
Since its immigration bot launched, the startup has changed tack, working on a more general system that can learn many different areas of the law to help with a wide variety of legal problems.
“We’ve been working on this [sexual harassment iteration] since a few weeks ago, after the new wave of allegations started coming out,” Amir Moravej, the founder of Botler, told MTLinTECH.
When a user logs on, they are presented with two conversational directions: do they wish to discuss a sexual harassment issue, or a different legal issue? If they choose the first option, the system walks them through a series of structured questions to better understand the situation and set up the scenario. After that, it asks the user to provide a detailed report of the incident or incidents that took place.
“What Botler does after is it takes the entire set of information the user provided, and it cross-references it against all the different relevant sexual harassment laws, either in the U.S. or Canada depending on the location of the user,” Co-founder Ritika Dutt told MTLinTECH. “It tracks the information against the system to determine the possible violations, then sends an email to the user afterwards which has a couple different documents attached as PDFs.”
One attachment is a list of the violations that may have taken place, with an explanation of the specific laws. The second is an incident report. This is based on the report the user filled out earlier, but it’s organized and formatted so that they can pursue their case further with the relevant authorities, whether that’s the police, an HR department, or a university administration.
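The workflow Dutt describes — taking a free-text incident report, cross-referencing it against relevant statutes, and returning a list of possible violations — can be sketched in miniature as follows. Everything here is an illustrative assumption, not Botler’s actual code: the statute entries are invented, and a production system like Botler’s would rely on a model trained on court documents rather than simple keyword matching.

```python
# Illustrative sketch only -- not Botler's implementation. A real system
# would use a trained model over court documents, not keyword matching.

# Hypothetical mini-corpus: statute ID -> (summary, trigger keywords)
STATUTES = {
    "CC 264(2)(b)": ("Criminal harassment: repeated communication",
                     {"repeatedly", "messages", "calls"}),
    "CC 264.1(1)": ("Uttering threats",
                    {"threat", "threatened", "harm"}),
}

def match_statutes(report: str) -> list[str]:
    """Return IDs of statutes whose trigger keywords appear in the report."""
    words = set(report.lower().split())
    return [sid for sid, (_, keywords) in STATUTES.items()
            if words & keywords]

incident = "He threatened me and then sent messages repeatedly at night"
print(match_statutes(incident))  # both hypothetical statutes match
```

The matched IDs and their summaries would then feed the kind of PDF attachments the article describes: a violations list with explanations, plus the formatted incident report.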
“We trained the system both on the criminal codes and all the publicly available cases we could find,” said Moravej.
The AI used deep learning to read through more than 300,000 US and Canadian criminal court documents, including complaints related to sexual harassment.
Those not looking for help specifically related to sexual harassment are still encouraged to visit the site. If a user chooses the second option at the start of the conversation, they can sign up and say what they do want information on, whether it’s tax, custody, or incorporation law.
“This is just the first step for us. We would love to see what other legal problems users would like us to focus on. We’re really open to capture as many problems as possible.”
This is also just the first step to help victims of sexual harassment. Botler plans to release a tool that helps users compile their cases and, should they want legal representation, will connect them with a trusted lawyer.
Connecting with a real flesh-and-blood lawyer is still key, as the AI’s judgment should not be considered legal advice.
“The regulations are very clear on this. If you haven’t passed the bar, you cannot give any sort of legal advice,” stresses Moravej. “Though, we’re researching if there is a way for Botler AI to register and write the bar exam.”
“AI can help provide affordable or even free legal services to those in need,” said Dr. Yoshua Bengio, the scientific director of the Montreal Institute for Learning Algorithms (MILA). “Today, Botler’s team took another step toward this. With the right investment in their research, we can see the future of justice improve for the better while demonstrating the capabilities of AI for positive social impact, something that is aligned with MILA’s mission.”
Photos by Eva Blue and Meng Jia