How just 20 images of a child can be used to make a deepfake

Mick Moran, CEO of hotline.ie, speaks to Claire Byrne about how AI tools can use photos of children to create child sex abuse content. Listen back above.

Parents who share photos of their minor children online could be accidentally giving free rein to people who want to use them in the creation of CSAM, or child sexual abuse material.

This sobering view was aired on Today with Claire Byrne by former Garda and Interpol investigator Mick Moran. Mick is CEO of www.hotline.ie, which, according to its website, is "Ireland's foremost service dedicated to combatting illegal online content, including child sexual abuse material".

Mick explains some of the technology behind the latest developments and the pitfalls parents should try to avoid. He also suggests ways in which families can share photos more safely online.

Recent advances in the power of AI tools have opened up new fronts in the battle against the sexual exploitation of children, Mick explains.

Reacting to recent research from Prof. Carsten Maple at the University of Warwick, which shows that a small number of images of children can be used to generate CSAM, Mick says the situation is only going to get worse:

"He says 20 images is all that’s needed. That’s eventually going to get down to one, as AI gets stronger."

Mick explains how AI tools have become "multimodal". This means that programmes like GPT-4 or Gemini can process photos, video, audio and text simultaneously. Mick says these ultra-powerful tools can "learn" from photos of children, including images innocently shared on social media, to create new content. He says once it’s out there, what happens to it is out of your control:

"Fundamentally, any information you share online can be used in ways you never intended."

Mick says that when you share images of minors publicly, that choice is interpreted by some online platforms as "implicit consent":

"You would imagine that data protection rules would mean that those images couldn’t be used, but if you’re sharing them publicly, that expectation of privacy is gone. The tricky thing that the companies do, not all of them but some of them, is that they see it as implicit consent."

This means that the action of sharing publicly is seen as equivalent to the parent signing a consent form on behalf of a minor, as they might for a medical procedure. Mick says that different data rules apply to images of children online, but public sharing by parents is interpreted as overriding them:

"If a picture of a child is there, they have to be careful of it; it attracts different rules under data protection. However, if you’re a parent and you share a picture of your child, or another child, it is deemed to be implicit consent from the parent that transfers to the child, and therefore they can use the image."

The really big tech companies like Google and Meta are "well-regulated", Mick says, and they are not his primary concern. He’s more worried about the so-called "shadow brokers".

These are the thousands upon thousands of companies that quietly collect your personal information every time you do almost anything online, and sell it for profit: "It’s not twee to say that ‘data is the new gold’. It’s a commodity."

Mick says there are safer ways to share family photos – in private groups, for example, where viewing the images is limited to people you personally know. He says the dangers of child sex abuse material are real and need to be discussed, but it’s also possible to share pictures safely:

"It’s important that’s not used to panic parents and to say, ‘Don’t be sharing your pictures of your children online!’ Which I’ve heard some commentators saying. That’s not true. Absolutely share your pictures."

Sharing photos with family abroad is part of life and Mick says he regularly shares pictures with his daughter in Australia. It’s fine as long as you make good use of the privacy settings, he says:

"We all have people abroad. There’s absolutely zero problem with sharing on Facebook or Instagram or anything else; as long as you are limiting who can see it. And you have settings there to do it."

Unfortunately, many people are unaware that the default settings on social platforms are often set to public, and will stay that way unless you take the trouble to go in and change them. Mick says he would prefer it if the default setting were private:

"All of that should be turned on by default. But it’s not. You have to go in and do it yourself. And there’s the trap that people are falling into."

Mick talks about the kinds of legal changes he says need to be brought in to safeguard images of children from being used in nefarious ways to generate CSAM content. He also talks about his interactions with parents' groups on the subject, which you can learn more about by listening back above.

If you’ve been affected by anything in this interview and you’d like to talk to someone, contact details for helplines can be found here.
