How AI is being misused for child pornography


Exclusive

As of: January 5, 2024 6:08 am

Artificial intelligence is increasingly being misused to create child pornography, research by the SWR investigative format Vollbild shows. Some images are entirely artificial, while others are based on real images of children or on actual child abuse.

Fabian Sigurd Severin, SWR

Thousands of artificially created images of children and teenagers are distributed via the social media platform Instagram under certain hashtags. Many depict minors in skimpy underwear, swimsuits, or sexualized poses. This is shown by research from the SWR investigative format Vollbild.

Some accounts that share such images are linked to commerce, crowdfunding or social platforms. Images of explicit child sexual abuse created using artificial intelligence (AI) are sometimes distributed there. Experts warn of the danger of AI-generated false abuse images: they make investigative work more difficult and may increase the willingness of pedophiles to commit real attacks.

Trade in AI-generated images of child abuse

According to the Vollbild research, a Japanese website linked from these social media accounts shares explicit AI-generated images of child abuse. The site is also known to the BKA, Germany's Federal Criminal Police Office, and user comments suggest it is used by people with pedophilic disorder.

Many users link from there to a website with real images of child abuse. This is confirmed by the Internet Watch Foundation, an international reporting center for child sexual abuse based in Great Britain.

AI-generated films based on real child abuse are also circulating. This is indicated by user comments under AI-generated images and by observations of the Internet Watch Foundation.

Missing safety measures

“People either sell access to such images or get paid to create them,” says Dan Sexton of the Internet Watch Foundation. The organization has so far found the majority of AI-generated images of child abuse on the dark web. According to Sexton, the more such images circulate, the greater the risk. “It's not something that might happen in the future, but something that's happening right now.”


According to the Vollbild research, artificial images of child and youth pornography are created primarily with a version of the AI program Stable Diffusion. Unlike its two major competitors DALL-E and Midjourney, which are also used to generate images, Stable Diffusion is open-source software whose code is publicly accessible. The software version in question has no security mechanisms to prevent the creation of nude images. This is shown by a test conducted by the Vollbild editorial team.

Potential burden on authorities

The BKA does not record AI-generated pornographic images separately, but includes them in the overall figures for child and youth pornography. In 2022, the number of child pornography cases rose by 7.4 percent and youth pornography cases by 32.1 percent. In addition, everyday, real photos can serve as the basis for AI-generated pornography. According to the BKA, synthetic images are hardly distinguishable from real ones.

“What worries me is that the amount of available material will increase, but above all that the quality of the content will increase,” says Markus Hartmann, a senior public prosecutor who heads the Cybercrime Central and Contact Point (ZAC) in North Rhine-Westphalia. According to Hartmann, this development could lead investigators to misjudge AI-generated images as new, real abuse and thereby reach the limits of their resources.

AI child pornography may provoke offenders

But exposure to artificially created child pornography also poses a risk for people with pedophilic disorders, says Professor Klaus Michael Beier, director of the Institute of Sexual Science and Sexual Medicine at Berlin's Charité. He also heads “Kein Täter werden” (“Don't Become an Offender”), a Berlin-based support service for pedophiles.


The problem: like real child pornography, AI-generated images can lead to a distortion of perception, says Beier. They lead pedophiles to believe that sex between children and adults is possible and that children even want it.

His warning is supported by an international study by the Finnish NGO “Protect Children”, in which more than 8,000 people who use images of child abuse on the darknet participated. A third of respondents said they had actually tried to make contact with children after viewing such images.

Possible legal loophole in Germany?

In Germany, under sections 184b and 184c of the Criminal Code, the distribution, acquisition, and possession of pornographic images of children or young people are prohibited. But since there is no case law yet on AI-generated child and youth pornography, uncertainty remains, says Rhineland-Palatinate's Justice Minister Herbert Mertin (FDP): “One problem is mere production without distribution. If it is done with artificial intelligence, it could go unpunished. If you do it with real children, it is punishable.”

The Conference of Justice Ministers has therefore asked Federal Justice Minister Marco Buschmann (FDP) to set up a group of experts to examine these new developments, Mertin says.

EU law obliges platform providers

A spokesperson for the Federal Ministry of Justice writes to Vollbild that the “criminal law instruments” are continuously being reviewed. Like real images, “virtual images of child and youth pornography generated using AI” are punishable. If illegal content is distributed on online platforms such as Instagram or the aforementioned Japanese website, the Digital Services Act applies. The EU law obliges the sites to set up reporting procedures and to take action against misuse of their services.


Spokespersons for Instagram and the Japanese social platform assured Vollbild that they take decisive action against child pornography. Instagram says it acts not only against overtly sexual content, but also against profiles, pages, or comments that share images of children without obvious sexual connotations if the captions, hashtags, or comments contain inappropriate signs of affection.

Platforms do not respond consistently

However, the Vollbild research shows that Instagram and the Japanese website do not sufficiently comply with their obligation to remove illegal content. During the research, the editorial team reported dozens of Instagram accounts whose owners advertised selling real child and youth pornography.

Instagram deleted only a third of these accounts within 48 hours. For others, no violations of community guidelines were initially found; the remaining accounts were deleted only after further reports.

A spokesperson for the Japanese website wrote to Vollbild that AI-generated images of child and youth pornography had been removed. But even after that response, comparable AI-generated images of child abuse could still be found on the platform.

“In Germany, we have the appropriate legal framework to remove something like this from the internet,” says Rhineland-Palatinate Justice Minister Mertin. “Our problem has always been getting hold of the perpetrators.”

Many perpetrators are located abroad, which makes them difficult to catch, and international cooperation is sometimes difficult. Chief Public Prosecutor Hartmann sees the main problem in the fact that it is not easy for platform operators to identify the relevant images.
