For eight years, Trent* has worked for Australia’s internet regulator, eSafety, tackling image-based abuse and purging child sexual abuse material from the web.
“I’m not afraid to say that I’ve burst into tears at the screen,” he told The Feed.
"I think everybody who finds themselves working with child sexual abuse material has a moment at their computer where the reality of what is being experienced by kids who are victims of sexual abuse becomes all too difficult.”
Revenge porn has skyrocketed during the pandemic, with a recent survey showing one in three people had experienced at least one form of image-based abuse.
Reports to eSafety concerning image-based abuse more than doubled during the pandemic, increasing by 114 per cent in 2020 when compared with 2019, according to Trent.
Trent said that for those dealing directly with victims, the stories they hear can be “really harrowing”.
But he claims eSafety provides comprehensive training, counselling and support to its staff to deal with the graphic and upsetting content they are exposed to on a daily basis.
“Every quarter, our investigators all speak to a trained clinical psychologist and any issues that have been surfaced through those consultations are funnelled back to me as a manager so that I can take remedial action,” Trent said.
"We really encourage time away from the screen, for them to do activities during the day, like, go for a run, go to the gym, to make sure that they are taking care of themselves, which is a really important part of it.”

A survey found 1 in 3 Australians have had intimate images of themselves shared without their permission. Source: AAP
Jason* told The Feed he works as an investigator at Internet Removals, a company he says helps wipe content from the web and social media, including image-based abuse.
He told The Feed that due to his line of work, he’s no stranger to verbal abuse and threats.
This task can be a risky one for those in the trenches. For their own protection, The Feed has chosen not to include Trent or Jason’s real names in this story.
“We try and retain as much anonymity as possible, just to protect ourselves from attacks [from perpetrators of image-based abuse],” Jason said.
"We distance ourselves from leaving traces of our involvement for our safety and to protect our clients’ identities,” he added.

Reports to eSafety concerning image-based abuse more than doubled during the pandemic. Source: Westend61/Getty Images
Jason said there is an abyss of image-based abuse online and fighting it can be an “uphill battle”.
The problem is so endemic that image-based abuse is now shared not just on shadowy online forums but on messaging platforms, according to Jason.
The Feed has previously reported on an international forum that hosts thousands of nude images of Australian women and underage girls. These images have been uploaded without victims’ consent or knowledge.
The forum reappeared last year after being shut down by the FBI and Dutch authorities in 2018.
Trent told The Feed he dealt with a considerable number of complaints from girls under the age of 18 who spotted themselves on the forum.
Jason claims his company also played a part in removing images from this site. He said “all it takes is one image” for it to spread like wildfire and be shared all over the web.
Trent said about one in three reports made to eSafety about image-based abuse involve victims under the age of 18.
Child sexual abuse material and image-based abuse can be reported anonymously to eSafety. Trent said the regulator has a success rate of over 80 per cent when it comes to getting image-based abuse taken down.
“We make it really clear to people as well that we don't encourage anybody to seek out child sexual abuse material because of the severe legal consequences that follow from that,” he said.
eSafety uses a number of different methods to crack down on image-based abuse.
When it comes to child sexual abuse material, the regulator consults with police to issue takedown notices. However, this only applies when the content is hosted in Australia, which very little of it is, according to Trent.
“Almost all of the content that we encounter is hosted overseas, and more than 99 per cent of it concerns child sexual abuse material,” he said.
While eSafety doesn’t have any direct takedown power in relation to overseas hosted content, it plays a leading role in an international network of hotlines called ‘INHOPE’.
“It's a really effective way of achieving rapid takedown, no matter where it's hosted,” he said.
eSafety can also issue a compulsory removal notice to the person or website that shared the material without consent.

The Feed has spoken to dozens of women who are victims of image-based abuse and have had images uploaded onto an international forum. Source: Getty Images
This “means that there are various penalties that could flow from non-compliance that could result in a civil penalty order or an infringement,” Trent said.
“If we were in a situation of particular seriousness and we were concerned that that person was going to continue sharing content, we can apply to the Federal Court for an injunction restraining that person from doing certain things.”
Trent told The Feed helping victims is the most important aspect of his job.
“When you are removing the ability of offenders to share that material, gain gratification from that material, we know that you're having a direct effect on the wellbeing of victims, even if you've never met them before,” Trent said.
He said eSafety has helped thousands of people who’ve often contacted the regulator as a “last resort”. “Knowing that we’ve been able to say to them that’s something we can actually assist with. [It’s] really, really rewarding work.”
You can report image-based abuse to eSafety at esafety.gov.au.
If you or someone you know is impacted by sexual assault or family violence, call 1800RESPECT on 1800 737 732 or visit www.1800RESPECT.org.au. In an emergency, call 000.