Deep Fakes are becoming an increasingly large problem. If you’re not familiar with the term, a Deep Fake is an image or video in which one person’s likeness has been digitally swapped in for another’s, typically using deep-learning techniques (which is where the name comes from).
Want to make it look like a wholesome actress has a porn video on the web? A Deep Fake will get the job done.
Want to make it look like a politician you can’t stand said something completely hypocritical? Have a Deep Fake video commissioned that makes it look like the politician in question said whatever you want.
Or, on the financial front, if you want to tank a rival company’s stock price, you can make a Deep Fake video of the company’s CEO announcing a disastrous course of action for the firm.
These are just a few of the ways Deep Fakes are being used in the here and now, and the underground industry that’s producing this content is still in its infancy. So in the months and years ahead, we can expect to see much more of this type of thing, and in increasingly advanced forms.
Worst of all, people tend to believe the evidence in front of their eyes, so once a Deep Fake image or video starts making the rounds, it can spread like wildfire and quickly be accepted as the truth. After all, what could be more damning than actual video footage of a given event?
Except, of course, that Deep Fake specialists make a trade of inventing fictions from whole cloth and then building video to support whatever story they want to push. It’s incredibly dangerous, and several companies are working hard to come up with ways to spot Deep Fakes, Adobe included.
Recently, the company rolled out a new content attribution tool in Photoshop, built under its Content Authenticity Initiative, designed to combat Deep Fakes and the damage they can do. Rather than trying to detect fakes directly, it attaches tamper-evident metadata to an image recording who created it and how it was edited, so viewers can verify a genuine image’s provenance. While the tool is still in beta, it represents one of the first tangible steps big tech companies have taken in a war they didn’t even know they were embroiled in. Kudos to Adobe for their work to this point.
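To make the attribution idea a little more concrete, here’s a minimal sketch of how signed provenance data can expose tampering. This is a toy illustration in Python using a symmetric HMAC key, not Adobe’s actual implementation (the real tool uses certificate-backed signatures and records a full edit history), and every name in it is made up for the example.

```python
import hashlib
import hmac

# Hypothetical signing key held by the publisher. In a real provenance
# system this would be an asymmetric key pair backed by a certificate
# authority, not a shared secret.
SIGNING_KEY = b"publisher-secret-key"

def sign_image(image_bytes: bytes) -> str:
    """Produce an attribution tag for an image at export time."""
    return hmac.new(SIGNING_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Check that the image still matches the tag it was published with."""
    expected = hmac.new(SIGNING_KEY, image_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

# The tag travels alongside the image; any pixel-level tampering
# (such as a Deep Fake face swap) invalidates it.
original = b"...image bytes..."
tag = sign_image(original)
print(verify_image(original, tag))              # True: untouched
print(verify_image(original + b"edit", tag))    # False: altered
```

The point of the sketch is the shift in strategy it illustrates: instead of chasing ever-better fakes, attribution lets honest images prove their own origin, so anything that arrives without a valid provenance record earns extra skepticism.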