New chatbot goes online to fight image-based abuse

Experts in abusive online behavior say their world-first AI chatbot will help people report incidents of image-based abuse and find support.

Image-based abuse – when someone takes, shares or threatens to share nude, semi-nude or sexual images or video without consent – has become a growing issue, experienced by 1 in 3 Australians surveyed in 2019. 

Professor Nicola Henry from RMIT's Social and Global Studies Centre, the lead researcher behind ‘Umibot’, said image-based abuse also includes ‘deepfake’ content (fake videos or images generated using AI), incidents where people are pressured into creating sexual content, and the sending of unsolicited sexual images or videos.

“It's a huge violation of trust that’s designed to shame, punish or humiliate. It’s often a way for perpetrators to exert power and control over others,” said Henry, who is an Australian Research Council Future Fellow. 

“A lot of victim-survivors we talked to just want the issue to go away and the content to be taken down or removed but often they don’t know where to go for help.” 

That is the gap this pilot chatbot aims to address.

The idea came to Henry after conducting interviews with victim-survivors about their experiences of image-based abuse. 

While the people she spoke to had diverse experiences, Henry said they often did not know where to go for help and some did not know that what had happened to them was a crime.  

“The victim-survivors we interviewed said they were often blamed by friends, family members and others and made to feel ashamed, which made them even more reluctant to seek help,” Henry said.

Dr Alice Witt, an RMIT Research Fellow working on the project with Henry, said Umibot is not a replacement for human support. Rather, it is designed to help people navigate complex pathways, giving them options for reporting as well as tips on collecting evidence and staying safe online.

“It is not just for victim-survivors,” Witt said. 

“Umibot is designed to also help bystanders and even perpetrators as a potential tool to prevent this abuse from happening.”

Image: A person texting on their phone by the water. Credit: Adobe Stock.

How does Umibot work?

Users can type questions for Umibot, or they can select answers from a set of options. 

Umibot also asks users whether they are over or under 18, and whether they need help for themselves, need help for someone else, or are concerned about something they have done. These answers shape the support and information Umibot provides to suit their experiences.
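The intake step described above can be sketched as a simple routing function. This is an illustrative sketch only; the category labels and pathway descriptions below are assumptions for demonstration, not Umibot's actual implementation.

```python
# Illustrative sketch of the triage routing described above.
# The audience labels and pathway descriptions are assumptions,
# not Umibot's real logic or resource names.

def route(over_18: bool, seeking: str) -> str:
    """Map a user's intake answers to a support pathway.

    seeking: 'self' (help for themselves), 'other' (help for someone else),
             or 'own_behaviour' (concerned about something they have done).
    """
    audience = "adult" if over_18 else "minor"
    pathways = {
        ("adult", "self"): "victim-survivor support and reporting options",
        ("adult", "other"): "bystander guidance",
        ("adult", "own_behaviour"): "prevention and accountability resources",
        ("minor", "self"): "youth-specific support services",
        ("minor", "other"): "youth bystander guidance",
        ("minor", "own_behaviour"): "youth prevention resources",
    }
    return pathways[(audience, seeking)]

print(route(True, "self"))  # victim-survivor support and reporting options
```

A triage step like this lets the same chatbot serve victim-survivors, bystanders and people worried about their own behaviour without asking everyone the same questions.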

Henry said Umibot is the first chatbot of its kind dedicated to victim-survivors of image-based abuse.

“There are other chatbots out there that more broadly help people who’ve experienced different online harms, but they are not focused on image-based abuse and they don’t have the same hybrid functionality that allows users to type questions to the chatbot,” Henry said.

Image: A screenshot of Umibot's interface.

A new approach to chatbot design

Umibot was created with the support of an Australian Research Council Future Fellowship grant. Henry and Witt worked with Melbourne-based digital agency Tundra to build the chatbot using Amazon Lex, an artificial intelligence service for building conversational interfaces.

“We know victim-survivors of image-based abuse face a spectrum of experiences over and above image-based abuse, so we developed Umibot as a fully inclusive and trauma-informed empowerment tool to support people who have diverse experiences and come from different backgrounds,” Henry said.

The team also worked with a diverse range of consultants and did an independent accessibility audit to make sure Umibot was as compliant as possible with global accessibility standards for people with disabilities.

“Our main ethical challenge was to make sure Umibot didn’t cause any harm or trauma, or make the user feel burdened,” Witt said.

“A lot of victim-survivors are not ready to talk to a person about their experiences, so teaching Umibot how to be empathetic and helpful is a way for them to seek support without any pressure.”

Next steps for Umibot

With Umibot now available to use, the researchers hope to develop a second version for victim-survivors, bystanders and perpetrators of image-based abuse in the next few years.

“We hope that Umibot will not only empower victim-survivors to find support, but also help us create 'best practice' guidelines for designing, developing and deploying digital tools and interventions for addressing online harms more broadly,” Witt said. 

You can access Umibot here.

If this article has raised issues for you, or if you’re concerned about someone you know, call 1800RESPECT on 1800 737 732. If you are in immediate danger, call 000.

Story: Shu Shu Zheng

Acknowledgement of Country

RMIT University acknowledges the people of the Woi wurrung and Boon wurrung language groups of the eastern Kulin Nation on whose unceded lands we conduct the business of the University. RMIT University respectfully acknowledges their Ancestors and Elders, past and present. RMIT also acknowledges the Traditional Custodians and their Ancestors of the lands and waters across Australia where we conduct our business - Artwork 'Sentient' by Hollie Johnson, Gunaikurnai and Monero Ngarigo.