(OSV News) -- With Pope Leo XIV prioritizing the issue of artificial intelligence, one expert warned OSV News that generative AI poses serious -- and largely unaddressed -- threats to child safety by accelerating the creation and distribution of child sexual abuse material, or CSAM.
"I've described it (CSAM) as a worldwide pandemic, and generative AI is the next version of it," said Greg Schiller, CEO of the Child Rescue Coalition, a nonprofit that develops technology for law enforcement for prosecution of child predators.
The term AI broadly encompasses various forms of technology by which machines can mimic human learning, problem-solving and creativity.
Among those are "plenty of AIs that will search the internet for pictures and then take that information and morph it into whatever you want," said Schiller, a longtime prosecutor specializing in cases of sexual exploitation and human trafficking.
While such usage may represent "a legitimate thing," he said, generative AI, or GAI, gives predators dark new capabilities.
"You could have a text prompt and somebody would input, 'How can I find a 5-year-old little girl for sex? Tell me step by step,'" Schiller said. "AI is going to search every aspect of the internet until it finds something."
And, he said, GAI -- described by IBM on its website as "deep-learning models that can generate high-quality text, images and other content based on the data they were trained on" -- is used by predators to mine images of children posted online by parents, parishes and schools.
The technology even allows predators to target certain images for manipulation, said Schiller.
"You can go to any number of generative AI websites and point (them) specifically at a website just by typing," he said. "All you're doing is typing instructions to it, as if it was the smartest human being on planet earth and could move at light speed, and you (can) say, 'Hey, let's look at ABC Church and find photographs of young kids at that church and return those to me.'"
Schiller said the "public-facing website" of any organization "is completely searchable."
And, he said, "once the AI finds those images, the generative aspect of AI can be told, 'OK, now taking those images, I want you to cause them, with those children's faces, to do x, y and z sexually.'"
The U.K.-based Internet Watch Foundation -- a nonprofit that has worked for the past three decades to protect children from online harm -- highlighted the accelerating use of GAI in the creation of CSAM in two key reports the organization released in October 2023 and July 2024.
The latter publication found an increase in "more severe images … indicating that perpetrators are more able to generate complex 'hardcore' scenarios."
"With GAI, the offender creates … the most sadistic form of CSAM," said Schiller.
The IWF also said that "AI-generated child sexual abuse videos, primarily deepfakes" -- false images, audio and videos created with AI to manipulate subjects' appearances and actions -- "have started circulating, highlighting rapid technological advancements in AI models/generators."
The organization said, "Increasingly, deepfake videos shared in dark web forums take adult pornographic videos and add a child's face using AI tools."
In addition, IWF detected a "noticeable increase" in AI-generated CSAM on the "clear web," also known as the surface web or clearnet -- the publicly searchable portion of the internet, in contrast to its darknet counterpart, which consists of decentralized networks that conceal anonymous and often illicit activity.
IWF said its data showed "perpetrators increasingly use fine-tuned AI models to generate new imagery of known victims of child sexual abuse or famous children."
According to the National Center for Missing and Exploited Children, CSAM perpetrators can also use GAI to create fake social media accounts, through which the predators lure children online.
Such material can be deployed by predators to "normalize" sexual abuse while enticing new victims, added Schiller.
NCMEC notes that offenders can use GAI to create explicit images of a child, and then blackmail the child into producing more sexual content and forwarding money, a form of coercion known as "sextortion."
The organization said on its website that it has seen "cases in which the child refuses to send a nude image to the offender, and the offender then creates an explicit GAI image of that child to blackmail them for more explicit images."
"And there's nothing Johnny can do, because it looks just like him," Schiller said. "He's not going to tell Mom and Dad, because he's embarrassed, and kids try to fight these things off on their own. Sometimes this extortionist will say, 'Pay me $100 a week, and until and as long as you're doing that, I won't release it."
In other situations, GAI is used by a child's peers to bully victims, said NCMEC.
GAI-generated CSAM can also tax resources dedicated to finding exploited children, Schiller said, since it stands to become "a rabbit hole with no end" if law enforcement has to try to "identify a child that doesn't even exist, when they could be rescuing another child."
Predators looking to sexually exploit children "are often early adopters of new technology and use technological developments to exploit and endanger children in ways that current laws may not anticipate," said John Shehan, senior vice president of NCMEC's exploited children division, in his March 12, 2024, testimony before the U.S. House Subcommittee on Cybersecurity, Information Technology, and Government Innovation.
Greater legal and policy protections can mitigate some of these risks, experts say.
On July 22, Bishop William D. Byrne of Springfield, Massachusetts, and Bishop Robert E. Barron of Winona-Rochester, Minnesota, sent a letter of support to lawmakers who have reintroduced the Kids Online Safety Act.
The bill "takes meaningful and effective steps" toward reducing internet dangers, said Bishop Byrne and Bishop Barron, who respectively chair the U.S. Conference of Catholic Bishops' committees on communications and on laity, marriage, family life and youth.
Schiller noted that more work needs to be done at the state legislative level, since "some states don't have that language" with "enough teeth to prosecute" GAI-produced CSAM, and as a result, "it might be a misdemeanor in some places."
Parental awareness and intervention are crucial to protecting children from the harms of GAI-produced CSAM, said Schiller, who regularly gives internet safety presentations at schools and churches, based on guidelines from NCMEC.