
Sick AI-Generated Child Abuse Images Are So 'Astoundingly Realistic' They're Distracting Investigators from Real Cases

Police agencies and investigators the world over are finding that AI-generated images of child pornography are so incredibly realistic that they have lost time, effort, and money investigating images that aren’t even real. And it is a growing problem.

The Internet Watch Foundation (IWF), an advocacy group based in Cambridge, England, is warning that artificial intelligence, or AI, is a perfect tool to create "unprecedented quantities" of realistic child sexual abuse images and videos.

The IWF, which is the group that many countries rely on to seek out and remove images of child sexual abuse on the internet, says that AI programs are making its job even harder.

Some of these images feature children as young as three years of age.

With the problem growing, IWF chief executive Susie Hargreaves has called on U.K. Prime Minister Rishi Sunak to devote more resources to the AI issue.

"We are not currently seeing these images in huge numbers, but it is clear to us the potential exists for criminals to produce unprecedented quantities of life-like child sexual abuse imagery," Hargreaves said, according to Sky News.

“This would be potentially devastating for internet safety and for the safety of children online,” she added.

On its own website, the IWF added that AI-generated images of child sexual abuse are a growing threat and are "so realistic they would be indistinguishable from real imagery to most people."

The AI images, IWF adds, are “astoundingly realistic.”

The National Crime Agency (NCA) agreed that the problem is “increasing” and must be taken “extremely seriously.”

“There is a very real possibility that if the volume of AI-generated material increases, this could greatly impact on law enforcement resources, increasing the time it takes for us to identify real children in need of protection,” said NCA’s director of threat leadership Chris Farrimond.

AI-produced child porn is already illegal in many places — including the U.K. — but the capabilities of AI are increasing so rapidly that they threaten to overtake policing efforts.

The U.K., though, is working on expanding its criminal laws and is looking to put more of the onus on internet providers to shut down sources of AI child porn.

“The Online Safety Bill will require companies to take proactive action in tackling all forms of online child sexual abuse including grooming, live-streaming, child sexual abuse material and prohibited images of children — or face huge fines,” a government representative told Sky News.

But certain AI and other software programs are also being used by child abusers to skirt detection systems developed by police and governments to root out the dangerous porn.

“The continued abuse of this technology could have profoundly dark consequences – and could see more and more people exposed to this harmful content,” Hargreaves said.

The Washington Post reported that these new AI programs have created a “predatory arms race” for pedophiles on their dark web forums.

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” said Rebecca Portnoff, the director of data science at the child-safety group Thorn.

“Victim identification is already a needle-in-a-haystack problem, where law enforcement is trying to find a child in harm’s way,” she continued. “The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.”

The Post added that many of these AI programs, such as Stability AI, can be copied and used without the company’s safeguards in place, so it is hard for the AI programmers to stop the misuse of their products.

“Private companies don’t want to be a party to creating the worst type of content on the internet,” said Kate Klonick, an associate law professor at St. John’s University, according to the Post.

“But what scares me the most is the open release of these tools, where you can have individuals or fly-by-night organizations who use them and can just disappear. There’s no simple, coordinated way to take down decentralized bad actors like that.”

Artificial intelligence presents many other problems for law enforcement as well.

“For law enforcement, what do they prioritize?” said Yiota Souras, the chief legal officer of the National Center for Missing and Exploited Children. “What do they investigate? Where exactly do these go in the legal system?”

The FBI is clearly rattled by the increase in AI porn. In an alert posted in June, the bureau said it had seen an increase in reports regarding children whose photos were altered into "sexually-themed images that appear true-to-life," the Post added.

“Technology advancements are continuously improving the quality, customizability, and accessibility of artificial intelligence (AI)-enabled content creation,” the FBI said in its announcement.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content. The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”

Advocates warn of the "horrible societal harm" these images represent. Because the AI draws on bits and pieces of real children to generate incredibly realistic fake images, pedophiles can use the resulting material to inure kids to sexual content and groom them for abuse.

“You’re not taking an ear from one child. The system has looked at 10 million children’s ears and now knows how to create one,” Souras explained. “The fact that someone could make 100 images in an afternoon and use those to lure a child into that behavior is incredibly damaging.”


This article appeared originally on The Western Journal.