The Next Step to the Fake News Nightmare


Humanity has advanced enormously since the beginning of our species. We have walked on the Moon and discovered that there is far more to the universe than our own planet. But, as with every advancement, we have also made mistakes. Artificial intelligence, when used incorrectly, can be one of them, and a dangerous one.

Deepfakes are synthetic media in which a person in an existing image or video is replaced with someone else's likeness. While the act of creating fake content is not new, deepfakes leverage powerful techniques from machine learning and artificial intelligence to generate visual and audio content that can deceive far more easily. The main machine learning methods used to create deepfakes are based on deep learning and involve training generative neural network architectures, such as autoencoders or generative adversarial networks (GANs).
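To make the autoencoder idea concrete, here is a minimal sketch: an encoder compresses input into a small "latent" code and a decoder reconstructs it. (Real face-swap pipelines typically train one shared encoder with two decoders, one per face, and swap the decoders at generation time.) Everything below — the toy data, the linear layers, the sizes — is illustrative and not taken from any real deepfake tool.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 200 samples of 8 features that actually lie on a 2-D subspace,
# so a 2-D latent code can reconstruct them well.
latent_true = rng.normal(size=(200, 2))
mix = rng.normal(size=(2, 8))
X = latent_true @ mix

# Linear encoder/decoder weights (8 features -> 2-D code -> 8 features).
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))

def loss(X, W_enc, W_dec):
    recon = (X @ W_enc) @ W_dec
    return np.mean((recon - X) ** 2)

lr = 0.01
first_loss = loss(X, W_enc, W_dec)
for _ in range(500):
    Z = X @ W_enc          # encode: compress to the 2-D latent code
    recon = Z @ W_dec      # decode: reconstruct the 8 features
    err = recon - X
    # Gradient descent on the mean squared reconstruction error.
    grad_dec = Z.T @ err * (2 / X.size)
    grad_enc = X.T @ (err @ W_dec.T) * (2 / X.size)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final_loss = loss(X, W_enc, W_dec)
print(f"reconstruction MSE: {first_loss:.4f} -> {final_loss:.4f}")
```

The point of the sketch is only the compress-then-reconstruct loop: once a network can reconstruct faces from a compact code, feeding one person's code into another person's decoder is what produces the swap.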

In a few short years, deepfakes have become one of the most talked-about technologies on the internet. The spread of this technology has raised a number of concerns, including its potential use in celebrity pornographic videos, revenge porn, fake news and more.

There are many examples of deepfakes appearing in the public sphere. In January, for example, two Reddit users created an image of former president Donald Trump with a Hitler-like moustache. Prompted by the incident, Reddit posted a statement warning users that deepfakes are unsafe and should not be used to alter photos or videos.

As we can clearly see, incidents like this will keep recurring for many years to come.

In the end, deepfakes raise several important questions about machine learning, artificial intelligence, and human society. The potential for harm lies both in their use for prurient content and in efforts to spread fake news. Governments will likely regulate to control the former and may attempt to restrict the latter as well. For example, Google DeepMind has pledged not to develop military or strategic artificial intelligence applications.
