The online tools are encrypted, available 24/7 and programmed to be non-judgmental, giving survivors control over how, when and where they disclose

Ritika Dutt is a McGill graduate who created Botler AI, an app that helps victims report harassment and cross-reference their stories with approximately 300,000 criminal court cases to determine which law has been violated. Photo: Sarah Mongeau-Birkett

Every day, the man would show up at the open community space where Ritika Dutt worked in Montreal. He would stare, unnerving her. Somehow, the stranger got her phone number and started texting and calling incessantly. He crept her Facebook page, attending a concert she’d posted about. Once, Dutt peered out her window and saw him standing in the street, looking at her apartment.

Fear bled into her life, but a few friends told her she was overreacting. Dutt struggled to call it what it was: stalking, or what Canadian law terms criminal harassment.

“I kept making excuses: ‘It’s all in my head,’ or, ‘It wasn’t that bad,’” Dutt, 27, said. “But every morning, I would wake up with anxiety and tension that I have to go and do this again today and every day for the foreseeable future.”

The terrifying, months-long ordeal ended last spring, when Dutt confronted her stalker, blocked him on social media and left her job, traumatized. It wasn’t until the #MeToo movement took hold that something shifted. Dutt felt emboldened. This past December, she launched a free online tool for victims of harassment and sexual assault with the company she co-founded, Botler AI.

The encrypted chat bot helps survivors fill out an anonymous incident report. It then cross-references the information with approximately 300,000 criminal court cases to determine which law has been violated. If they choose to, victims can forward the report to authorities.
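Botler AI has not published how its classifier works; what follows is only a toy sketch of the general technique the company describes, matching a free-text report against labelled case summaries by text similarity. The case texts, labels and scikit-learn pipeline here are illustrative stand-ins, not the company's actual model.

```python
# Toy sketch only: Botler AI's real model and training data are not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Three invented case summaries standing in for ~300,000 real ones.
cases = [
    ("repeatedly followed the complainant, watched her apartment and called her phone at all hours",
     "s. 264 criminal harassment"),
    ("sent threatening messages by phone over several weeks",
     "s. 264.1 uttering threats"),
    ("touched the complainant sexually without consent",
     "s. 271 sexual assault"),
]

vectorizer = TfidfVectorizer(stop_words="english")
case_matrix = vectorizer.fit_transform([text for text, _ in cases])

def likely_offence(report: str) -> str:
    """Return the offence label of the most similar case summary."""
    scores = cosine_similarity(vectorizer.transform([report]), case_matrix)[0]
    return cases[scores.argmax()][1]

print(likely_offence("a stranger followed me, watched my apartment and called me constantly"))
# prints "s. 264 criminal harassment" on this toy corpus
```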

The service is part of a burgeoning field of technology emerging after #MeToo: neutral, third-party tools that aim to give victims of sexual violence a more positive experience of disclosing.

“The idea is to empower users with confidence grounded in legal doctrine so that they can take control of the situation and pursue it how they see fit,” said Dutt, an economics and political science graduate from McGill University. “I really wish I’d had this kind of information so that I could have taken action at the time.”

The new, online tools are encrypted, available 24/7 and programmed to be non-judgmental. Unlike humans, these bots never ask, “What were you wearing?” or “How many drinks did you have?” The technologies give survivors control over how, when and where they report.

Launched last month, another platform called Spot assists victims of workplace sexual harassment and those who witness it. Using a messaging-style chat, Spot asks users questions about the incident, recording the material as a time-stamped, securely signed PDF. Victims can keep or submit their encrypted reports, anonymously or not, to HR or employers. Alternatively, Spot can submit it on their behalf. Banks, correctional services, NGOs and pharmaceutical companies have approached Spot's creators about offering the tool in-house to their employees.
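Spot has not published its signing scheme, but a minimal sketch can show what "time-stamped and securely signed" can mean in practice. This example uses the Python cryptography package and an Ed25519 key; the key handling and record format are illustrative assumptions, not Spot's design.

```python
from datetime import datetime, timezone

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Illustrative only: Spot's actual signing and storage scheme is not public.
signing_key = Ed25519PrivateKey.generate()

report_text = "Description of the incident, as entered in the chat."
timestamp = datetime.now(timezone.utc).isoformat()
record = f"{timestamp}\n{report_text}".encode()

signature = signing_key.sign(record)  # 64-byte Ed25519 signature over time + text

# Later, anyone holding the public key can prove the record was not altered.
try:
    signing_key.public_key().verify(signature, record)
    print("record intact")
except InvalidSignature:
    print("record was tampered with")
```

Because the timestamp is folded into the signed bytes, neither the date nor the description can be changed afterward without the signature check failing.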

Spot was designed based on the “cognitive interview” technique, a method that investigators and psychologists use to extract the highest quality memories from victims who have suffered through emotional events.

Spot assists victims of workplace sexual harassment and those who witness it. Photo: Handout

“There are things that AI is better at than people,” said Julia Shaw, a memory scientist at University College London who co-founded Spot. “This is possibly something that HR departments were never good at. It’s not necessarily their fault. It’s just that people are scared of talking to human beings about this. Now, with AI, we can find a way to start those conversations without the intimidation.”

Unlike people, these online tools don't victim-blame. “Humans are so biased, implicitly and explicitly,” Shaw said. “[Spot] is supposed to be an interviewer who you can talk to without the baggage that comes with talking to a human being.”

When users talk to Botler AI’s chat bot, it responds by countering rape myths. “An existing relationship by itself does not mean that you consented to the encounter,” reads one message. “The way in which you are dressed NEVER constitutes consent.”

Beyond helping victims make sense of the legal landscape, Botler AI is also programmed to appear sensitive. “My only priority is to make sure that you will be OK,” assures the chat bot, whose avatar is styled as a retro Jetsons robot with a bow tie. Dutt says victims have told her they find its tone comforting. “Obviously, it’s a bot and they know it’s a bot, but the language and the framing of the questions were non-threatening. It seemed almost friendly.”

The bots’ automated words sound remarkably empathetic, certainly. But is it a sorry state of affairs that sexual-assault victims would now rather turn to AI than face a human interlocutor? Sherry Turkle, author of the 2011 book Alone Together: Why We Expect More from Technology and Less from Each Other, raises concern on this front.

“It is always easier to talk to a non-human thing because we can hide everything emotional and personal and meaningful,” said Turkle, who is founding director of the MIT Initiative on Technology and Self. “Let’s use technology in a sensible way but remember that our human goal is to get people who have been hurt into caring relationships with other people.”

Victims’ advocates also have reservations about the new platforms. Anuradha Dugal, director of community initiatives and policy at the Canadian Women’s Foundation, would like to see the websites equipped with up-to-date resources for women who might want to speak with a flesh-and-blood counsellor after chatting with a bot.

Spot co-founder Julia Shaw is a memory scientist at University College London.

Dugal is also concerned about data leaks and about where the data will end up. “We put our banking information out there, we put our personal lives on Facebook. This is even more personal,” Dugal said. “What’s happening to the data? Is it going to be aggregated? Will it be used for the benefit of other women? Will it improve services? That would be my hope.” (According to Shaw, Spot does not keep reports and will not make money from them. As for Botler AI, Dutt says all communication is protected by bank-level encryption.)
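Neither company has published its encryption design; "bank-level encryption" is a marketing phrase that typically means TLS in transit plus authenticated encryption at rest. A minimal sketch of the at-rest half, using AES-256-GCM from the Python cryptography package, with key handling simplified for illustration:

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Simplified sketch; real systems keep the key in a managed key store,
# never alongside the data it protects.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # 96-bit nonce, unique per stored report
ciphertext = aesgcm.encrypt(nonce, b"incident report text", None)

# Decryption fails loudly if the ciphertext was modified in storage.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"incident report text"
```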

Other critics have questioned whether the bots can tell if someone is lying. Shaw says that Spot can’t, but adds that neither can human beings.

In the end, the technology is not the judge, jury and executioner, but a starting point for investigations led by humans. It will still be up to people, not AI, to adjudicate harassment cases and build more equitable workplaces, according to Lisa Amin, a Toronto labour and human-rights lawyer. “Much depends on execution and follow-through,” she says. “If management doesn’t understand what harassment actually is and what it does to the harassed, there is often no significant improvement.”

Ultimately, the platforms’ creators see their efforts as one more option for victims who have a dearth of them.

The forerunner of tech for victims of sexualized violence is Callisto. Piloted in 2015, the online tool lets postsecondary students document incidents in secure, time-stamped records and report them electronically to school authorities when they are ready. Victims have a second option as well: notifying school officials only if another student identifies the same assailant, which alerts schools to repeat offenders on campus. A simplified sketch of that matching idea appears below.
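Callisto's real matching protocol protects identifiers far more carefully than anything that fits here, but the core idea, holding each report in escrow until two distinct students name the same assailant, can be sketched simply. The salt, identifier format and trigger below are simplified stand-ins, not Callisto's protocol.

```python
import hashlib
from collections import defaultdict

# Simplified stand-in for Callisto's escrow; the real design does not rely
# on a single salted hash to protect identifiers.
SALT = b"per-deployment secret"  # hypothetical value
escrow: defaultdict = defaultdict(set)

def file_match_entry(reporter: str, assailant_id: str) -> set:
    """Hold the entry; return the set of reporters only once two or more
    distinct students have named the same assailant."""
    digest = hashlib.sha256(SALT + assailant_id.strip().lower().encode()).hexdigest()
    escrow[digest].add(reporter)
    return escrow[digest] if len(escrow[digest]) >= 2 else set()

file_match_entry("student_a", "facebook.com/assailant")            # held, no match yet
matched = file_match_entry("student_b", "facebook.com/assailant")  # second report matches
print(matched)  # {'student_a', 'student_b'}: school officials would be notified
```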

In the 13 U.S. schools where Callisto is now available, survivors are five times more likely to report, and they do so three times faster than the U.S. national average.

Callisto staff consulted with experts who specialize in “forensic experiential trauma interview” techniques, which are designed to elicit the most information from victims without harming them in the process. The language on the website is empathetic, the design soft and friendly. Users are encouraged to take breaks.

“Reporting a sexual assault is one of the most vulnerable moments in a survivor’s life,” said Anjana Rajan, Callisto’s chief technology officer. “The reporting process should bring back a sense of humanity to a survivor, not strip it away.”

As with Botler AI, Callisto was born out of personal pain. Jess Ladd was sexually assaulted in university and created Callisto as a trauma-informed alternative to the criminal justice system, in which rape victims are still routinely shamed and convictions remain rare.

“We can choose to invest in avenues … for survivors to come forward in a way that minimizes their risk,” Ladd told Seth Meyers last November, “or we can lose momentum and do what we’ve done before, which is ask them to martyr themselves in a broken system.”
