
By Jason Lim

So, Korean boys and young men are taking photos of their classmates or acquaintances and superimposing them onto pornographic videos or photos for personal titillation, group sharing, and even distribution for money. Such pornographic deepfakes are nothing new; however, recent advances in AI have made creating them far more accessible and seamless. Gone are the days when deepfakes were obviously fake.

These days, a few minutes on a free app can produce deepfakes that look indistinguishable from the real thing, except that they are not. Add to this the anonymity of encrypted group chat programs like Telegram, which makes discovery by law enforcement very unlikely. Often, the only way a woman learns that she has been victimized is when someone, for whatever reason, shows her the deepfakes featuring her likeness.

I can't imagine the ickiness a victim must feel upon learning that her likeness, without her knowledge or consent, has been abused in such a fashion, fantasized over by trusted acquaintances and total strangers alike. It's definitely enough to lose faith in all men, if not humanity. Korea actually has laws on the books that make creating such deepfakes illegal.

Article 14-2 (Distribution of False Video Products) of the Act on Special Cases Concerning the Punishment of Sexual Crimes states: "(1) A person who edits, synthesizes, or processes a photograph, video, or audio (hereinafter referred to as 'photograph, etc.') ..."
