There you see yourself: in a video, starring in a hardcore porn film. You can practically feel your colleagues' stares on your skin. In your head, the whispers swell into screams that must be audible throughout the whole building: everyone already knows. Everyone sees you there. In that porn film. A circular email has already gone out to all your colleagues. To you as well. Followed by an email from your boss. It begins: "We regret to inform you...". Only you know: that's not me. Even though you are clearly visible there. That's not me in this video.
How is that possible? You have become the victim of a so-called deep fake video. The person in the video is not you, of course, but it has your face, grafted onto someone else's body using computer technology. You think this can't happen to you because you hardly ever post pictures of yourself online? Make no mistake: the technology used to create deep fakes keeps improving, and recent experiments show that, with enough processing power, a video can be generated from just one photo of you. In the age of social media, are you sure that none of your friends, family or acquaintances has ever posted a photo of you?
A problem for security
Deep fakes are increasingly becoming a security problem, according to the FBI, because they come not only as fake videos but also as fake texts and voice messages. Fraudsters exploit them to defraud insurance companies, for example, or to trick people who believe they are helping a loved one into handing over their savings.
The FBI also expects that, in the future, foreign intelligence services and criminals will increasingly use computer-generated fakes to penetrate computer networks.
Deep fakes can also be used to manipulate people with ease, for example through fake campaigns featuring leading politicians: swap in a head and a voice, spread the clip via social media, done. The problem is that very few people question such videos; most simply believe what they see.
The psychological damage
Technology is constantly evolving, and there is now a range of apps, websites and software on the market that lets anyone commission deep fakes for little money or simply create them at home, often to the detriment of others. The damage that deep fakes do to their victims is immense. Let's stay with our initial example: what would the consequences be for you if your face had been inserted into a depiction of child abuse?
There are countless victims of deep fake videos; according to studies, most are women who are in the public eye and a thorn in someone's side. The damage to their reputation is enormous and the psychological consequences are severe: the victims suffer from nightmares, panic attacks and a feeling of being at others' mercy, and they withdraw from public life. The latter is known as the "silencing effect", which a study by Amnesty International cites as one of the consequences of abuse directed at women on Twitter. Deep fake videos are therefore also used deliberately against women who publicly stand up for a cause. A prominent example is the case of Rana Ayyub, an Indian journalist who had publicly accused the nationalist BJP party of defending a child abuser. Her opponents launched a hate campaign against her, including a deep fake video that appeared to show her having sex. It went viral on social media, and the journalist faced so much anger and outrage that for months she could not appear in public or do her job.
Deleting deep fakes? A Sisyphean task
The internet does not forget, and the chances of identifying the originators of deep fakes and holding them accountable are extremely slim. Deleting the fakes is not only time-consuming and costly but a Sisyphean task: with the technology freely available on the market, new deep fakes can be created again and again.
The cyber security firm Sensity estimates that the number of deep fakes is growing exponentially, doubling roughly every six months. Deep fake creators are active worldwide and, according to Sensity's data, are mostly male. A 2019 report by the cyber security firm Deeptrace found that 96 per cent of all deep fake videos on the internet were pornographic, and that the victims were mostly women.
The legal situation
There are currently two new laws designed to hold platforms accountable for the content they host: the EU's "Digital Services Act" and the UK's "Online Harms Bill".
In Germany, according to an October 2019 report by Deutschlandfunk Nova, users are protected by the general right of personality anchored in the constitution, by the Art Copyright Act, and by the Telemedia Act and the Network Enforcement Act. In the case of pornographic deep fakes, the Art Copyright Act applies: images of a person may only be used with that person's consent. The Telemedia Act and Network Enforcement Act, in turn, make it possible to have illegal content distributed on the internet deleted quickly.
According to a 2020 evaluation by the Konrad Adenauer Foundation, however, there is still a need for legislative action on how deep fakes are handled.
Detection of deep fakes
For humans, detecting deep fakes is extremely difficult, which is part of what makes them so problematic. While the technology for creating deep fakes keeps improving, experts are simultaneously working on methods to detect them. DARPA (the Defense Advanced Research Projects Agency), a research agency of the US military, is currently having considerable success with this.
How can you protect yourself against deep fakes?
There will be no 100 per cent protection against deep fakes in the near future, as the creators of deep fakes and the experts working to expose them are locked in a race with each other. Nor does a legal solution appear to be in sight.
However, you do have the chance to minimise the risk significantly: regain control over your data. polypoly helps you do exactly that. We have a solution that will, in future, allow you to delete photos from the internet easily and automatically, no matter where in the world they are stored. We are now implementing the necessary technology, and we need your help. Become a member of our cooperative and help us make this solution available as soon as possible to everyone who cares about protecting their privacy, and thereby themselves.
More information about the cooperative can be found here.