Fighting abusive deepfakes: The need for a multi-layered action plan
Deepfakes are manipulated or synthetic audio or visual media that appear authentic, featuring people who seem to say or do things they never said or did. They are produced using artificial intelligence techniques, notably machine learning and deep learning. Deepfake production typically relies on an innovative deep learning technique called ‘generative adversarial networks’ (GANs), which increases both the degree of automation and the quality of the output compared with conventional techniques. GANs generate deepfakes by pitting two AI agents – artificial neural networks – against each other: a producer (‘generator’) learns to create fakes that look just like authentic recordings, while a detector (‘discriminator’) learns to tell whether a given media item is fake or authentic. Each agent improves by trying to outdo the other.
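The adversarial loop described above can be sketched in miniature. The toy example below is purely illustrative – it is not taken from any real deepfake tool: instead of images, the ‘generator’ (a one-parameter linear model) tries to imitate samples from a target Gaussian distribution, while the ‘discriminator’ (a logistic model) tries to tell real samples from generated ones. All parameter names, learning rates and the target distribution are assumptions chosen for readability.

```python
# Toy GAN sketch: generator vs discriminator on 1-D data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

REAL_MEAN, REAL_STD = 4.0, 0.5   # the "authentic" data distribution

# Generator g(z) = a*z + b; discriminator d(x) = sigmoid(w*x + c)
a, b = 1.0, 0.0                  # generator parameters
w, c = 0.1, 0.0                  # discriminator parameters
lr = 0.03

for step in range(4000):
    z = rng.normal(size=64)                       # latent noise
    x_real = rng.normal(REAL_MEAN, REAL_STD, 64)  # authentic samples
    x_fake = a * z + b                            # generated samples

    # Discriminator step: push d(real) towards 1 and d(fake) towards 0
    # (gradients of the standard binary cross-entropy GAN loss).
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w -= lr * np.mean((d_real - 1.0) * x_real + d_fake * x_fake)
    c -= lr * np.mean((d_real - 1.0) + d_fake)

    # Generator step (non-saturating loss): push d(fake) towards 1,
    # i.e. make fakes that the detector classifies as authentic.
    d_fake = sigmoid(w * (a * z + b) + c)
    g_grad = (d_fake - 1.0) * w                   # gradient w.r.t. each fake
    a -= lr * np.mean(g_grad * z)
    b -= lr * np.mean(g_grad)

# After training, the generator's output distribution should have drifted
# from its initial mean of 0 towards the real mean.
fake_mean = float(np.mean(a * rng.normal(size=1000) + b))
print(f"generator output mean: {fake_mean:.2f} (real mean: {REAL_MEAN})")
```

The competition is visible in the two update steps: the discriminator's gradients reward separating real from fake, while the generator's gradients reward fooling the discriminator, so each network's progress raises the bar for the other – the same dynamic that, at vastly larger scale and on image data, drives deepfake quality upward.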
The basis for all deepfake software is ‘deep learning’, a machine learning approach that needs a massive amount of data to learn how to replace a face within a video. The abundance of readily available images of celebrities on the web explains why they quickly became the most prominent targets of fake videos.
Hollywood actress Bella Thorne is one of the most deepfaked people in the world: videos and pictures of her have been digitally altered and spliced into a vast number of other videos. All of these deepfakes were non-consensual, and most have been abusive. Hackers have used such material to harass and blackmail Thorne, who responded by releasing intimate footage of herself.
A 2020 study by digital researcher Sophie Maddocks shows that, in some cases, deepfake creators deliberately use manipulated videos to silence famous personalities; Maddocks notes that Thorne was targeted because she had spoken out against sexual violence. According to the study, pornographic deepfakes are under-represented in both media coverage and research, because most outlets and researchers focus on the abuse of fake videos for political ends.