By Andrew Humphries
Injustice, in its many forms, sits very uneasily with Kate Fitzpatrick, particularly when it’s children who suffer.
As a Human Exploitation Community Officer (HECO) with the Australian Federal Police, Kate is at the coalface in the fight against online child sexual exploitation, a crime becoming more and more prevalent.
Within the Uniting Church Synod of Victoria and Tasmania, Senior Social Justice Advocate Mark Zirnsak is also working hard to raise awareness around the rise in this type of crime.
Building on a connection formed earlier this year, Kate and Mark teamed up recently on a road trip around Tasmania, promoting the message that greater community awareness, and action, is needed around online exploitation.
Other areas of exploitation, including human trafficking and forced labour, also formed part of discussions held with Uniting Church Ministers, members and school students.
It was a trip that aligned perfectly with Kate’s AFP role around raising awareness about, and building knowledge on, child protection and human exploitation.
And, bit by bit, her work is having an impact.
“I have a very strong sense of justice and fairness, and I get quite hot under the collar when I see injustice, so I’m drawn to this role because there is real ability to help people,” Kate says.
“A couple of years ago I joined the Joint Anti-Child Exploitation Team because I knew that ultimately I wanted to work in prevention.
“There are a lot of really amazing organisations and people working to combat human exploitation and you see the passion that they have to support and help vulnerable people, and that’s probably what I find really fulfilling about the role.
“I’m HECO for both Victoria and Tasmania, but am based in Victoria, so meeting face to face with people in Tasmania really helped to build relationships and understand the issues that are affecting them.
“Mark was able to introduce me to a number of his contacts and that broadened my understanding and ability to link up with people.”
Kate and Mark’s strong connection was formed after they crossed paths at an Australian Institute of Criminology forum earlier this year.
“Mark was one of the speakers at a webinar on responsible recruitment and, knowing he was in Melbourne, I reached out to him,” Kate says.
“We then met up and talked about our different roles, and I certainly picked up on the fact that he has such extensive experience in social justice advocacy, particularly around the issue of online child sex exploitation, and also human exploitation around forced labour and deceptive recruitment.”
Kate says the unpleasant fact is that officers like her are seeing more and more examples of online child sexual exploitation.
“In Australia we have seen an increase in the number of people affected by this type of crime involving the production, seeking and transmission of child abuse material,” Kate says.
“One of the more recent issues is the sexual extortion of young people, commonly known as sextortion, which has had a huge impact on victims and their families.
“In the past we’ve seen online sexual exploitation having a sexual motivation, but with the sexual extortion offence, generally there is a financial motivation, and the impact is devastating.”
As Senior Social Justice Advocate, Mark says he has been given a mandate by the Synod to work, in collaboration with the Synod Culture of Safety Unit, on raising awareness around online child sexual exploitation.
“We are now hearing more alarming things around the online space, and while it’s our young people who are most tech-savvy, they probably aren’t as in tune with the risks involved,” Mark says.
“The Synod has given us a clear indication from membership that child safety in the online world should be a priority, and that the minimum disruption to privacy should take place to ensure that safety.
“So, safety has to come first but privacy is still a priority … it’s not about giving power to police or regulators to do whatever they like and have unfettered power, but if there is to be a trade-off it should be safety that trumps privacy.”
Kate stresses that only an all-encompassing approach from all parts of the community will see progress made in tackling sexual extortion and online child sexual exploitation.
“The most important thing is education and having an awareness and understanding of the potential challenges of the internet,” she says.
Knowing how to keep the channels of communication open with children is vitally important, says Kate.
“For teenagers in particular, it’s often a very difficult period in their lives where the communication drops out a bit, so as a parent it’s about keeping those lines of communication open and ensuring that young people understand that no matter what happens they are not to blame, and that they can come to you for help,” she says.
“It’s also important that if they don’t feel comfortable coming to their parents, because they are ashamed of something that has happened, that they have other avenues to go to, and that might be a trusted adult, someone at their school, or one of the support services in the community.
“Teenagers who are victims of sexual extortion need to understand that nothing is so bad that they can’t get help.”
Playing a vital role in the education process, says Kate, is the website ThinkUKnow, which contains resources and advice for parents, carers and educators, children and young people to prevent online child sexual exploitation.
“The really important aspect of ThinkUKnow is that it’s evidence based, with access to real-life intelligence and an understanding of trends and patterns, meaning the information on the website is always up to date and relevant to what’s going on,” Kate says.
“The website is delivered to students Australia-wide from kindergarten through to Year 12, and those sessions fit within the Australian curriculum and contain age-appropriate messages to guide young people in how to safely navigate the internet and what to do when things go wrong in that space.
“It’s a really good website for families to discover together to start the discussion around cyber safety and some of those conversations needed around sexual extortion.
“These are the conversations that need to be had about what is happening online, so that everyone has a better understanding of what the concerns are.
“Once that conversation takes place, children will be better equipped when something does happen.”
Mark says while educational tools like the website are important, there is also a responsibility on the big tech corporations to lift their game.
“I hadn’t realised how useful ThinkUKnow was and what opened my eyes is the fact that police are continually updating it based on intelligence they are gathering around the harm that is being done online, meaning training is continually updated,” he says.
“But there are limits to the role that education can play in this space, which is why we need the tech companies to provide a safer environment in the first place and not just throw the responsibility back on the users.
“What we are seeing is that governments, including our own, are increasingly prepared to regulate this … the Morrison government and now the Albanese government have been very active in strengthening the ability to hold tech companies to account.
“So, government is saying we’ll give these tech companies a chance to do this responsibly on their own, but if they choose not to they can expect intervention at a government level to ensure the safety of users.”
Kate says she hopes the Tasmanian trip is the first step in what will be a long and strong social justice relationship between the AFP and Uniting Church.
“Where there is an intersection in the areas that we are interested in I see potential for us to collaborate again,” she says.
“Mark has been very valuable in expanding my understanding of human exploitation and social justice, while I have been able to talk to him about some of the law enforcement aspects that are important for him to understand as part of his advocacy.”
Mark, too, believes the Uniting Church will only benefit from the relationship established with the AFP.
“I think it can benefit both of us in terms of the Uniting Church knowing what resources the AFP is making available to the community, and the ability to work with them around protecting people from what are very harmful crimes, and the ability to know where to go when seeing things which are suspicious,” he says.
“The AFP also gathers intelligence and it benefits them to have that link with the community through us, so that people can have the confidence to approach them.
“In terms of online safety, I hope to continue working with them in that space to help inform congregations and the wider church about our policies, particularly around the emergence of AI and how we need to carefully consider whether we should be posting photos of children on our websites.
“The other benefit from our co-operation is what we can learn from the AFP around criminological understanding based on their latest research.
“It’s a helpful dialogue to have.”
When AI is not AOK
By Mark Zirnsak
Like most technologies, the development of artificial intelligence (AI) has beneficial uses but, in the wrong hands, it can also be used to inflict great harm.
AI is developing rapidly and it is now nearly impossible to tell the difference between a real image and one manufactured by AI.
AI-generated video technology is not far behind.
The relevance to congregations, communities of faith and church-based agencies is that any publicly accessible image of a child or small group of children can now be used to produce AI-generated child sexual abuse material with the child’s face and body shape.
Congregations, communities of faith and church-based agencies would be wise to not post publicly accessible images of a child or small group of children online where the faces of the children are clearly shown.
To do so exposes the children to a small but growing risk the images may be misused to produce AI-generated child sexual abuse material.
This material can also be used in the sexual extortion of children, and a child threatened with AI-generated sexual images of them can be coerced to provide real sexual abuse images of themselves.
The Australian Centre to Counter Child Exploitation has reported that the sexual extortion of children where the abuse is then captured by camera generates 60-70 per cent of the referrals to its Victim Identification Unit.
Recent investigations have uncovered the existence of organised sexual extortion groups.
These groups operate across borders and use call centre-like operations to communicate with hundreds of potential victims at once.
In December 2022, the US FBI issued a public safety alert about an “explosion” of financial sexual extortion schemes targeting children and teenagers.
The number of reports the US National Center for Missing and Exploited Children received on that form of cybercrime increased by 7200 per cent between 2021 and 2022.
The negative psychological impacts of sexual extortion include feelings of low self-esteem, withdrawal, worthlessness, anger and guilt.
In some cases, victims have engaged in self-harm or killed themselves.
In Victoria alone, 13 young people have ended their own lives as a result of online sexual extortion or image-based abuse in the last decade.
There are apps accessible online to anyone that can take a photo of any child or adult and produce a naked version of the person.
The Australian eSafety Commissioner stated of such apps:
It is difficult to conceive of a purpose for these apps outside of the nefarious. Some might wonder why apps like this are allowed to exist at all, given their primary purpose is to sexualise, humiliate, demoralise, denigrate or create child sexual abuse material of girls according to the predator’s personal predilection. A Bellingcat investigation found that many such apps are part of a complex network of nudifying apps owned by the same holding company that effectively disguises detection of the primary purpose of these apps in order to evade enforcement action.
In March 2024, it was reported that the ClothOff app had more than four million monthly visits and invited users to “undress anyone using AI”.
It charged $16.50 for 25 credits and did not provide meaningful age verification for users.
Many AI apps will provide nude images without the consent of the person being portrayed. Many allow for free trials before charging for ongoing use.
The UK-based Internet Watch Foundation (IWF) reported that one guide for child sexual abuse perpetrators contained a section explicitly encouraging perpetrators to use ‘nudifying’ tools to generate material to subject children to sexual extortion.
The author of the guide claimed to have successfully blackmailed 13-year-old girls into sending ‘intimate’ images.
The IWF has also reported child sexual abuse perpetrators on the dark web, seeking advice and tutorials from others about how to generate AI child sexual abuse material.
AI-generated child sexual abuse images can also lead police to waste valuable resources trying to rescue a child who is not being subjected to real-world sexual abuse, or who does not exist at all.
There are also online games in which participants can sexually abuse AI-generated children bearing the faces of real children whose images have been obtained.