(Screenshot captured from the presidential office's website)
The recent controversy over the artificial intelligence chatbot Lee Luda, which some users subjected to sexually abusive messages, is fueling debates ranging from the sexualization of celebrities to the use of AI technologies in cybercrimes.
A day after a presidential petition called for punishment for fan fiction sexualizing underage male K-pop idols, another presidential petition shed light on female celebrities falling victim to nonconsensual pornography produced using deepfake technology.
The petition demanding punishment for those producing and consuming deepfake porn had collected more than 253,000 signatures as of 5 p.m., within hours of being uploaded to the presidential office’s website.
The presidential office gives a formal response to petitions signed by more than 200,000 people.
Deepfake is a term for digitally manipulated videos that use artificial intelligence to appear realistic. The technology can be used to commit digital sex crimes by grafting a person’s face onto pornographic images or videos.
Nearly 96 percent of all deepfakes online were nonconsensual pornography in which the faces of victims were superimposed onto the bodies of performers in explicit videos, according to a study released in October last year by Dutch cybersecurity startup Deeptrace Labs.
Nearly 25 percent of the women targeted were classified by the researchers as South Korean musicians or K-pop singers, according to the study.
“Many female celebrities are suffering from deepfake technologies,” the petitioner said. “Deepfakes constitute sexual assault. Female celebrities have become the subjects of sex crimes, and deepfake videos involving them are being illicitly sold.”
“Many of those suffering from it are underage celebrities who have just begun their careers. I urge heavy punishment and an investigation into those using deepfakes, as they (underage celebrities) are being publicly exposed to such brutal sexual crimes,” the petitioner said.
Deepfakes came to light in Korea following the infamous “Nth room case,” in which users paid to see naked photos of young girls and coerced sexual acts on the encrypted messaging app Telegram. Another chat room was also reported to have distributed and sold deepfake pornography produced with female idols’ images.
The issues of real person slash and deepfakes involving K-pop idols and celebrities are sending shockwaves through the entertainment scene.
“The issues of real person slash and deepfakes are not a fandom culture. They are no different from crimes,” an official from an entertainment agency was quoted as saying by Korean business daily Maeil Business Paper. “We have passed the stage where this can be considered a fandom culture, as artists are taking a direct hit.”
The presidential petition calling for punishment for those creating “real person slash” fan fiction that sexualizes underage male idols is also garnering support, raising questions as to whether the genre is a form of idol culture or constitutes a cybercrime.
The petitioner labeled those producing and consuming real person slash as perpetrators of sex crimes, urging the government to punish those consuming such content and to draw up measures to regulate the distribution of such “sexual crime” fiction.
More than 178,000 people had signed the petition as of 5 p.m. Wednesday.
Real person slash is a subgenre of fan fiction depicting homosexual relationships between real people, often portraying romantic and sexual acts between K-pop idols. Some of its content includes scenes of rape and other sex crimes. Such works can be accessed via social media, in many cases for free, or on Postype, a paid platform for creative content.
By Ock Hyun-ju (laeticia.ock@heraldcorp.com)