  • A man in South Korea was jailed for creating sexually explicit images of children.
  • He was sentenced to two-and-a-half years in prison in a landmark case, The Korea Herald reported.
  • A Spanish town has been rocked after an AI app was used to create fake naked photos of girls.

A South Korean man was jailed for using AI prompts to create sexually explicit images of children, The Korea Herald reported, citing local outlets.

The unnamed man, aged in his forties, was charged with violating the Act on the Protection of Children and Youth and was sentenced to two-and-a-half years in prison.

The man is alleged to have used an image-generating AI program to create about 360 sexual images, the prosecutor's office told CNN. He did not distribute the images, which were confiscated by police.

He used prompts such as "10 years old," "nude," and "child" to generate the images, which the court ruled were lifelike enough to dismiss the defense's claim that they could not be deemed sexually exploitative, The Korea Herald reported.

The court's decision showed that fake but realistic images of minors created with "high level" technology could be considered sexually abusive content, the prosecutor's office told CNN.

Earlier this month, a small town in Spain was rocked by the creation and distribution of AI-generated naked images of young girls in several of its schools.

More than 20 girls aged 11 to 17 have been identified as victims in the town of Almendralejo, in the western Extremadura region.

The deepfake images were reportedly made using Clothoff, an AI-powered app that lets users "undress girl for free." They had mainly been created using photos taken from the girls' social media accounts that showed them fully clothed.

Deepfake technology uses AI to generate realistic video, audio, or imagery portraying something that didn't actually happen.

The technology has a murky past and has often been used to create non-consensual pornography, as an FBI public service announcement from June 2023 warned.

"The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content," the agency said, adding that this content was often then "publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes."

The South Korean prosecutor's office did not immediately respond to a request for comment from Insider, made outside normal working hours.

Read the original article on Business Insider