How Deepfake Technology Can Become More Dangerous Than a Nuclear Weapon

Davis, Thomas | 2020/01/15

“The powers that be no longer have to stifle information. They can now overload us with so much of it, there’s no way to know what’s factual or not. The ability to be an informed public is only going to worsen with advancing deep fake technology.” J. Andrew Schrecker

All of us have heard Donald Trump refer to some television stations as ‘fake news’ and tell us we shouldn’t listen to them. Half of the people believe him, and the other half don’t. The truth is that we don’t really know whether to believe what we see on television or the Internet. The lines have become blurred, with everyone having their own agenda as to what they want the audience to see. What if you watched a video or saw a picture of someone you love and believe in, like a certain celebrity or someone running for President of the United States, doing or saying something horrific? Would your opinion change about this person? You bet your ass it would. You don’t know that the video is fake and that this person never really did any of that. Your eyes deceive you, and you believe what you see. The video was created by someone using deepfake technology, manipulated to cause harm to that person. So, what is deepfake anyway? Let’s take a look.

What is Deepfake?

Deepfake is technology that uses neural networks to create fake videos or audio recordings that look and sound like the real thing. People make these videos as a joke, putting famous people in embarrassing situations, like a celebrity’s face in a porn clip or a politician saying things they normally wouldn’t. The idea started out as academic research in the field of computer vision: the Video Rewrite project, around 1997, modified existing footage of a person speaking so that the mouth appeared to say the words in a different audio track. The project used machine learning techniques to make connections between the sounds made by a subject and the shape of the subject’s face. Amateur Reddit users picked up on the idea and began making funny videos with celebrities’ faces on other people’s bodies. The term deepfake itself comes from a Reddit community of the same name, whose users traded these manipulated pictures and videos with one another. Now that you know what deepfake is, let’s examine how it works.

How Does It Work?

This technology uses generative adversarial networks (GANs), in which two machine learning models fight it out. The first model trains on a data set and then creates video forgeries, while the second tries to detect them. This goes on until the second model can no longer tell the forgeries from the real footage. The bigger the training data set, the easier it is to create a believable deepfake that will fool people.

Researchers from Stanford University, Princeton University, the Max Planck Institute for Informatics, and Adobe Research ran a series of tests to show how easy it is to manipulate and edit these videos. The scientists combined several techniques to create the video fakes. First, they scan the target video to isolate the sounds that make up the words spoken by the subject. Then they match those sounds with the facial expressions that accompany each one. Lastly, they build a 3D model of the lower half of the subject’s face. When one of the scientists edits the text transcript of the video, the software combines all of the information collected in those three steps to construct new footage matching the edited text, which is then composited onto the source video to produce the final result. We are still in the early stages of this technology, and there are limits to what the software can do, but websites already exist that run their own versions of it for the public to play around with.
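To make the “two models fighting it out” idea concrete, here is a minimal, illustrative sketch of a GAN training loop written in PyTorch. The tiny fully connected networks, the 32x32 toy “frames”, and the random stand-in training data are assumptions made purely for illustration; real deepfake systems use specialized face-swapping architectures trained on large collections of footage of the target.

```python
# Minimal sketch of the generator-vs-discriminator idea behind GANs.
# The network sizes, the 32x32 toy "frames", and the random stand-in
# training data are illustrative assumptions, not a real deepfake pipeline.
import torch
import torch.nn as nn

LATENT = 64       # size of the random noise vector fed to the generator
IMG = 32 * 32     # flattened toy frame size

# Model 1: the generator (the "forger") turns noise into a fake frame.
generator = nn.Sequential(
    nn.Linear(LATENT, 256), nn.ReLU(),
    nn.Linear(256, IMG), nn.Tanh(),
)

# Model 2: the discriminator (the "detector") scores frames as real or fake.
discriminator = nn.Sequential(
    nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),   # raw logit: higher means "looks real"
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def real_batch(batch_size=16):
    # Stand-in for real training footage (random tensors here).
    return torch.rand(batch_size, IMG) * 2 - 1

for step in range(1000):
    real = real_batch()
    noise = torch.randn(real.size(0), LATENT)
    fake = generator(noise)
    ones = torch.ones(real.size(0), 1)
    zeros = torch.zeros(real.size(0), 1)

    # Train the detector: label real frames 1 and generated frames 0.
    d_loss = loss_fn(discriminator(real), ones) + \
             loss_fn(discriminator(fake.detach()), zeros)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Train the forger: try to make the detector call its output real.
    g_loss = loss_fn(discriminator(fake), ones)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```

The point of the loop is the back-and-forth: with each pass the detector gets a little better at spotting forgeries, which forces the forger to produce more convincing ones, and that is exactly why a bigger and better training set yields harder-to-spot fakes.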

It’s Always Fun and Games Until Someone Gets Hurt

This is what my mother always used to say to me when I was horsing around with my friends and one of us began to get mad. That is when the fun turned aggressive and tempers flared. Any new technology is fun and enjoyable in the beginning, when everyone is playing around with it and making others laugh. But sooner or later, the wrong person will figure out a way to use the shiny new toy to do bad things. History has shown us that this is inevitable. Deepfake is new on the scene, still in its infancy, and is more of a hobby for people. But in the coming years, when better technology comes out and someone figures out how to use it for evil, it could be devastating for all of us. Could you imagine someone posting a deepfake video of the President of the United States announcing that a massive nuclear attack is on the way, or an unstoppable computer virus that robs us all of our money, or an outbreak of an incurable disease threatening to wipe out humanity? There would be enormous panic all over the country, and we would not have a clue what to do. Riots and looting would break out as people tried to flee the country looking for safe haven.

Conclusion

As of now, deepfakes are a fun new toy for people to play around with and be a nuisance with. They can post Brad Pitt’s head on an actor shooting a porn scene or make Joe Biden say something detrimental to his campaign. These videos are still pretty easy to spot as fakes. But as this technology matures and the wrong person or people get hold of it, the consequences could be disastrous. Senator Marco Rubio called deepfakes the modern-day equivalent of nuclear weapons, saying, “In the old days, if you wanted to threaten the United States, you needed 10 aircraft carriers, and nuclear missiles, and long-range missiles. Today, you just need access to our internet system, to our banking system, to our electrical grid and infrastructure, and increasingly, all you need is the ability to produce a very realistic fake video that could undermine our elections, that could throw our country into tremendous crisis internally and weaken us deeply.” Whatever may come of deepfake videos, we should be wary and watchful about how they are used in the future. The havoc this technology could wreak on all of us in the United States is something we may never see coming.