"K-girl group pornography" is about to spill...AI Increases Unauthorized Use of Photographs


AI models trained on mass-scraped photos of K-pop girl group members are circulating online, and agencies are weighing a response because the models can be abused to generate profit or to produce pornography.


According to industry and legal circles on the 27th, Civitai, a site for sharing generative artificial intelligence (AI) models, hosts numerous models that users have trained on images of members of popular K-pop girl groups such as NewJeans, LE SSERAFIM, IVE, aespa, and TWICE.


The posted files are AI models that can generate deepfake images when applied to "Stable Diffusion," the image-synthesis AI that Stability AI released as open source last year.



Stable Diffusion models are broadly divided into a base model called a "checkpoint," which determines the overall art style, and an auxiliary model called "LoRA" (Low-Rank Adaptation), which determines a person's face and pose.


Of the two, LoRA, which is mainly used to make deepfakes of real people, takes up only tens to hundreds of megabytes (MB) and is relatively easy to produce, so it is readily shared on sites such as Civitai and in online communities.



Unlike high-performance conversational AI such as "ChatGPT," Stable Diffusion is notable in that ordinary people can run it offline on their own PCs.


Usable even by people with no programming knowledge

Recently, an interface called "WebUI" has been released that even people with no programming knowledge can use easily. With a PC equipped with a low-end GPU (graphics processing unit) costing a few hundred thousand won and a little know-how, anyone can download an AI model with a click and create a high-resolution idol composite photo at home in one to two minutes.



The problem is that these AI models can be abused to generate profit or to produce pornography. On YouTube, "AI lookbook" content featuring female images created by image-generating AI is proliferating, and some users are selling obscene material generated by AI on the sponsorship platform "Patreon."


The K-pop industry is also considering ways to respond to the distribution of AI models trained on its artists' images.



An official from a major domestic entertainment agency said, "We are aware of the situation," adding, "If an image produced with generative AI defames an artist or infringes their portrait rights, we will request its deletion and consider legal action if necessary."