As the line between reality and fabrication blurs further every day, Deepfake is only beginning to accelerate the process. The open-source software was released by a Reddit user called Deepfakes.
What exactly is Deepfake?
Deepfake allows one to swap a face in a video using artificial intelligence, and it's just beginning to display what it's capable of. Deepfake collects multiple images of a person from videos, Google Images, stock photos, etc., and trains an algorithm to swap the faces. It uses open-source machine learning tools like TensorFlow, which Google makes freely available to researchers, graduate students, and anyone with an interest in machine learning.
Deepfake made headlines earlier this year, and though it might seem that India is far from such technology, we have already fallen victim to it. Creating a morphed video using Deepfake doesn't require any high-end studio equipment. All it takes is a single person with a computer powerful enough to run the software.
What can Deepfake do?
Deepfakes helped people create a large number of porn videos of celebrities doing things they would never do in real life. But the application has definitely outgrown that niche: it makes it trivially easy to fabricate a video of a politician doing and saying things he never actually did. In a time of such political polarization, this can have huge consequences.
In an interview with Samantha Cole of Motherboard, Deepfakes (the redditor) said,
“I just found a clever way to do a face-swap,” he said, referring to his algorithm. “With hundreds of face images, I can easily generate millions of distorted images to train the network, after that if I feed the network someone else’s face, and the network will think it’s just another distorted image and try to make it look like the training face.”
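The approach he describes, a shared encoder trained on warped ("distorted") copies of each face, paired with a separate decoder per identity, can be sketched in miniature. The following is an illustrative toy in pure NumPy, not the actual Deepfakes code; all names, dimensions, and the linear "network" are invented for clarity, and real systems use deep convolutional autoencoders trained on thousands of images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (invented for illustration): 64-number "face" vectors,
# compressed to a 16-number latent code.
D, Z = 64, 16

# One encoder shared by both identities...
W_enc = rng.normal(scale=0.1, size=(Z, D))
# ...and a separate decoder for identity A and identity B.
W_dec_A = rng.normal(scale=0.1, size=(D, Z))
W_dec_B = rng.normal(scale=0.1, size=(D, Z))

def distort(face):
    """Augmentation stand-in: a randomly perturbed copy of one face image.
    This is how 'hundreds of face images' become 'millions of distorted
    images' for training."""
    return face + rng.normal(scale=0.05, size=face.shape)

def encode(face):
    return W_enc @ face

def decode(code, W_dec):
    return W_dec @ code

# Training (not shown here) would push decode(encode(distort(a)), W_dec_A)
# back toward a for faces of A, and likewise for B, updating the shared
# encoder both times.

# The swap itself: feed B's face through the shared encoder, then through
# A's decoder. The network treats B's face as "just another distorted
# image" and reconstructs it to look like A.
face_B = rng.normal(size=D)
swapped = decode(encode(face_B), W_dec_A)
print(swapped.shape)  # (64,)
```

The key design point is that the encoder never learns whose face it is seeing, only a compressed description of a face; each decoder then renders that description as its own identity.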
One can argue that such videos are not very realistic and easily recognizable as fake due to errors in face swapping. But this is just the beginning of something that can have a huge impact. Technology will only improve in the future; what then?
There is not much you can do if you fall victim to Deepfake.
If you fall victim to this technology (for instance, a porn video surfaces with your face), there is very little you can do about it. With barely any awareness of deepfakes in the mainstream media, our law enforcement departments are yet to learn how to deal with such cases. Writing about her own story, Rana Ayyub, a renowned journalist, says…
“I couldn’t believe it. I was a woman standing in front of them who had mustered up the courage to file a complaint and they were trying to dodge it. I threatened them. I told them that if they didn’t want to register a complaint, then I would write about them on social media. Finally, after my lawyer told them we would go to the media, they filed the report. That was in April. More than six months later, I haven’t heard a thing from the police. I gave my statement to a magistrate, I gave them all the screenshots, the messages that I received but there has just been absolute silence.”
How does the West deal with Deepfakes?
The scenario is the same in the West, with no clear protection for the victims. An article by WIRED explains it clearly:
Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it’s technically not a privacy issue. That’s because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.
And it’s the very artifice involved in these videos that provides enormous legal cover for their creators. Since US privacy laws don’t apply, taking these videos down could be considered censorship—after all, this is “art” that redditors have crafted, even if it’s unseemly.
Doctored videos are no longer limited to porn, though porn remains a major source and in some ways the origin of this technology. We all remember the famous Fast and Furious scenes featuring the late Paul Walker: it appears as if he is the one acting, but in fact it is his brother, with CGI making it look like Paul Walker. Where such technology once came with huge studio costs and production values, Deepfake gives the same capability to anyone in a dorm room with just a laptop.
Deepfake: A threat to honest journalism
Rana Ayyub's story gives us a clear picture of how a group of people or a political party can use Deepfake videos to defame anyone who stands against them. She was the victim of a morphed porn video: the predators superimposed her face onto a porn actress's body, for obvious reasons. She is an outspoken critic of the BJP government.
Recalling how she learned about the video from a friend, she writes…
“What he sent me was a porn video, and the woman in it was me. When I first opened it, I was shocked to see my face, but I could tell it wasn’t actually me because, for one, I have curly hair and the woman had straight hair. She also looked really young, not more than 17 or 18. I started throwing up. I just didn’t know what to do. In a country like India, I knew this was a big deal. I didn’t know how to react, I just started crying….”
She wrote a full article about the incident for The Huffington Post.
Here is a video by Bloomberg explaining Deepfakes.
The uses of a tool such as Deepfake are limitless, and promoting a certain political agenda is only one of them. It is a threat to the personal security of people all over the world. It is good advice to remember that what you see is not necessarily the truth. This holds undeniably true in the news media industry, where digging deeper and filtering fact from fake is crucial. We hope that knowing about Deepfake encourages you to look more keenly at the "facts" presented to you, and that it serves you well in keeping away from malicious cyber predators.