Adobe’s CEO wants to fight the deepfakes he helped create
The Future. Recognizing how the company’s Photoshop and Premiere software is used to create fake media, Adobe CEO Shantanu Narayen highlights new proprietary tools that create and track original photos and videos. Still, the metadata labels may not be enough to fight back against a clickbait culture where photos and videos aren’t scrutinized for veracity… meaning that, as Narayen notes, individuals have an “obligation to seek the truth.”
Shantanu Narayen is deeply sorry about deepfakes, so Adobe is pushing some recent features to combat manipulated photos and videos.
- Three years ago, Adobe launched the Content Authenticity Initiative with just a few media and tech companies on board. It has since grown to 700.
- The goal is to create “provenance” for media — a feature in which “designers and consumers of content can […] create and track a digital trail that shows who is responsible for a given video or image and any changes they made to it.”
The feature helps tackle the “authentication of content,” which, Narayen tells Forbes, is “the most important thing on the Internet now.” (Narayen would also like to shed the negative connotation “Photoshop” has picked up as shorthand for faked images.)
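To make the “digital trail” idea concrete, here is a minimal sketch of a tamper-evident edit history: each edit record stores a hash of the media and chains to the previous record, so a mismatch anywhere reveals tampering. This is purely illustrative — Adobe’s actual system (the C2PA / Content Credentials standard) uses cryptographically signed manifests embedded in the file, not a bare hash chain, and the function names here are invented for the example.

```python
# Illustrative sketch only: NOT Adobe's Content Credentials format.
# Real C2PA manifests are signed, structured, and embedded in the media file.
import hashlib

def file_hash(data: bytes) -> str:
    """Fingerprint the media bytes so any alteration is detectable."""
    return hashlib.sha256(data).hexdigest()

def add_provenance(trail: list, data: bytes, editor: str, action: str) -> list:
    """Append a record linking this edit to the previous state of the media."""
    record = {
        "editor": editor,
        "action": action,
        "content_hash": file_hash(data),
        "previous_hash": trail[-1]["content_hash"] if trail else None,
    }
    return trail + [record]

def verify(trail: list, data: bytes) -> bool:
    """The trail holds if every record chains to the one before it
    and the last hash matches the media as delivered."""
    if not trail:
        return False
    for prev, cur in zip(trail, trail[1:]):
        if cur["previous_hash"] != prev["content_hash"]:
            return False
    return trail[-1]["content_hash"] == file_hash(data)

# Example: an original photo, then one disclosed edit.
original = b"raw photo bytes"
trail = add_provenance([], original, "photographer", "captured")
edited = b"photo with sky replaced"
trail = add_provenance(trail, edited, "designer", "sky-replacement")
print(verify(trail, edited))    # True: history matches the delivered file
print(verify(trail, original))  # False: bytes no longer match the trail
```

The point of the design is that provenance travels with the media: a viewer doesn’t need to trust the last editor, only to check that the chain of recorded changes is unbroken.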
But, somewhat ironically, Adobe is also hyper-focused on retrofitting all of its tools with AI — a project called “AI First” within the company. That includes features like automatic sky replacement, the ability to remove “unwanted” objects from videos, and the ability to change facial expressions or voice pitch.
These are powerful tools for creativity, but they can also be used to create media that’s intended to deceive. While it still takes a lot of work to create a deepfake as uncanny as the viral Tom Cruise ones or the AI avatar that helped win the South Korean presidential election, Adobe’s General Counsel and Chief Trust Officer, Dana Rao, notes that AI advancements are ushering in a future where it will be difficult to “distinguish fact from fiction, reality from artificial reality.” And unfortunately, people are proving to be very bad at determining what (or who) is real.