Jon Lui

K-pop fans plead with K-pop agencies to protect their artists from deepfake pornography sites

The recent trend of using deepfake technology for "deepfake pornography" has become a serious issue in South Korea. We uncover the K-pop groups targeted by this crime and what netizens have been saying.




Hashtags #YG_PROTECT_BLACKPINK and #ADORPROTECTNEWJEANS have been trending on X (Twitter) as K-pop fans plead with K-pop management companies to protect their artists' images from circulating in Telegram group chats. Numerous K-pop groups have become victims of this illegal use of their images, including the rookie girl group BABYMONSTER.






Deepfake technology uses AI powered by two machine learning models, trained against each other, to turn sample imagery, audio, and video into new media. In the case of "deepfake pornography", the models take photos of K-pop idols' faces and "paste" them onto another person's body.
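For readers curious about the mechanics: the "two models" mentioned above are typically a generator and a discriminator trained against each other, an architecture known as a generative adversarial network (GAN). Below is a minimal, purely illustrative toy version of that adversarial loop, assuming PyTorch and made-up dimensions; it is not the software these sites use and would not produce a deepfake on its own.

```python
# Toy sketch of the two-model (GAN) idea behind deepfakes.
# Illustrative only: random tensors stand in for real training images.
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 784  # hypothetical sizes for a flattened image

# Generator: maps random noise to a fake image.
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)

# Discriminator: scores how "real" an image looks (1 = real, 0 = fake).
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

# Stand-in for a batch of real images, scaled to [-1, 1] to match Tanh output.
real_images = torch.rand(32, image_dim) * 2 - 1

for step in range(100):
    # 1) Train the discriminator to tell real images from generated ones.
    noise = torch.randn(32, latent_dim)
    fake_images = generator(noise).detach()
    d_loss = loss_fn(discriminator(real_images), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake_images), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    noise = torch.randn(32, latent_dim)
    g_loss = loss_fn(discriminator(generator(noise)), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The key design point is the tug-of-war: as the discriminator gets better at flagging fakes, the generator is pushed to produce ever more convincing ones, which is what makes the resulting media so hard to spot.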


South Korea is in the midst of a deepfake pornography crisis: one Telegram channel had 220,000 members "creating and sharing doctored images and videos", according to The Guardian. The map below shows how the implicated IP addresses have spread within South Korea:

[Map: spread of IP addresses within South Korea]
South Korean president Yoon Suk Yeol has ordered a crackdown on digital sex crimes targeting women and girls who have unknowingly become deepfake victims. According to police data, "297 cases of deepfake crimes of a sexual nature were reported in the first seven months of the year — up from 180 cases last year" (The Guardian). Under current South Korean law, making sexually explicit deepfakes with the intention of distributing them is punishable by up to five years in prison or a fine of 50 million won (about USD 37,000).


So what are your thoughts? Do you think K-pop companies can protect their artists' images from deepfake pornography?


Source: The Guardian


