Intense 7-month crackdown shows celebrities, ordinary people fell victim to deepfake sex crimes

A total of 963 people have been accused of committing sex crimes using deepfake technology during a seven-month police crackdown, 93.1 percent of whom were in their teens and 20s.
The National Office of Investigation under the National Police Agency on Thursday announced the results of its investigation, conducted from Aug. 28, 2024, to March 31, 2025, in which it caught 669 teenagers, 228 people in their 20s, 51 in their 30s, 11 in their 40s and four aged 50 or older on suspicion of using the artificial intelligence-based technology to commit deepfake sex crimes.
NOI officials said they asked related authorities, such as the Korea Communications Standards Commission, to delete about 10,000 videos created or distributed by the suspects.
Suspects used images and videos of celebrities, acquaintances or even random women they came across mostly on social media platforms.
One of the suspects is accused of creating and distributing 1,100 deepfake videos of celebrities between August 2023 and March this year. He distributed the sexually exploitative videos through a private chat room on Telegram that is thought to have had around 140 members.
Other suspects include 15 men accused of distributing sexually exploitative deepfake content using the faces of 17 female students at Inha University in Incheon, some of whom had their real names revealed by the perpetrators. The main culprits were two graduate students, aged 24 and 31, who are believed to have collected the victims' personal information and photos to use in the crimes.
Deepfake sex crimes spreading among Korea's youth
Seventy-two of the teens caught were found to be criminal minors, meaning those under the age of 14 who are exempt from criminal punishment.
Studies indicate that young people account for the overwhelming majority of those involved in deepfake sex crimes -- both as suspected perpetrators and victims.
A 2024 report by the Women's Human Rights Institute of Korea showed that 90.3 percent of those victimized by the illegal doctoring of photos and videos in 2023 were in their teens and 20s.
Police in South Gyeongsang Province in November caught two high school boys accused of downloading photos of their middle school classmate and creating sexually exploitative images of her using deepfake technology.
The authorities have stepped up efforts to curb the apparent rise in deepfake sex crimes, most notably with a law revision passed in November that will allow preemptive responses to and undercover investigations of deepfake crimes even if the victim is a legal adult, effective June 4. The revision will allow police officers to investigate such crimes without revealing their identities, using fake identification if necessary.
It also shields officers from liability for otherwise illegal actions taken in the course of an investigation, if the action in question is deemed unavoidable in uncovering the crime.
The NPA said it will make full use of the revised law to actively crack down on deepfake sex crimes, warning that possessing, watching or purchasing the illegal videos can all lead to punishment.
minsikyoon@heraldcorp.com