The agency of K-pop group TWICE has warned about the spread of deepfake videos involving its members. South Korea has been grappling with the spread of deepfake porn videos, and police have begun investigating reports of such videos in schools nationwide, including elementary schools.

Names of 300 schools in South Korea were posted in a Telegram chatroom for reporting deepfake cases, the Korea JoongAng Daily reported. The police have started their investigation in Seoul, Incheon and South Jeolla.

In a notice, JYP Entertainment, TWICE’s agency, said, “We are gravely concerned about the recent spread of deepfake (AI-generated) videos involving our artists. This is a blatant violation of the law, and we are in the process of collecting all relevant evidence to pursue the strongest legal action with a leading law firm, without leniency.” The agency warned, “We want to make it clear that we will not stand by while our artists’ rights are violated and will take decisive action to address this matter to the fullest extent possible.”

“A deepfake refers to a specific kind of synthetic media where a person in an image or video is swapped with another person’s likeness,” according to a July 2020 article posted on the Massachusetts Institute of Technology (MIT) Sloan School of Management website.

It added, “The term ‘deepfake’ was first coined in late 2017 by a Reddit user of the same name. This user created a space on the online news …”