AI Augmentation: The Melding of Man and Machine to Combat Advanced Cyberthreats

Deepfake technology is an advanced cyberthreat that has captured the public imagination in recent months. Deepfakes have the potential to rewrite history (see the fake President Nixon Apollo 11 video below), spread mass disinformation, and enable fraud. How best to mitigate the risk of advanced cyberthreats such as deepfakes?

We have previously covered deepfake technology in these pages (“F for Fake“). Deepfake technology is still in its early stages, yet it is already quite effective. Producing deepfake audio and video files doesn’t require millions of dollars or hundreds of engineers. Some examples of deepfake technology are benign (“Ctrl Shift Face“), yet the potential to spread disinformation and cause political and economic harm is enormous.

Did Your CEO Really Instruct You to Wire Capital? Deepfake Voice Fraud.

Yes, the CEO of an undisclosed energy company was recently the victim of deepfake voice fraud when the CEO of the parent company supposedly instructed him to wire capital. The instruction came in the form of a phone call. Only the call did not come from the actual parent company CEO but from criminals who used deep learning technology to commit the crime. They effectively created a voice proxy of the parent company CEO, right down to his slight Hungarian accent, to trick his subordinate into wiring the funds. The perpetrators were only caught because they tried the same trick on the same CEO shortly thereafter while simultaneously running the same scam on the parent company CEO. Both CEOs grew suspicious, and with a bit of triangulation the criminals were caught (read more here).

Deepfakes to Spread Propaganda

Wire fraud is one thing; the potential to rewrite history is another. We have evolved well beyond photoshopped images. The video below was created by MIT to demonstrate deepfake video technology. It was produced using a voice actor and the text of an actual speech that was written for President Nixon in case the Apollo 11 mission failed, but never actually delivered (read more here).


MIT deepfake video – President Nixon Apollo 11

How Best to Combat Deepfakes?

So, how best to combat deepfakes? The technology will only become more sophisticated and difficult to detect. Facebook and Google are taking steps to detect and remove deepfake content from their respective platforms. What else can be done? If you are like me, you question the authenticity of almost every piece of online content: has this image been altered? Is this front-line news footage authentic? One option is to regress, moving all meaningful interactions, whether personal interactions or financial transactions, offline. That in-person approach certainly makes sense for some interactions. However, moving everything offline isn’t practical, and the thought of doing so is more than a bit depressing; it amounts to admitting defeat. A better approach is to use technology to fight deepfakes. Admittedly, this can and likely will become a cat-and-mouse game, as do all technology battles between good people and nefarious actors.
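One concrete example of fighting back with technology is content provenance: a publisher registers a cryptographic fingerprint of a video at release time, and anyone can later check whether a given copy has been altered. The sketch below is a minimal illustration in Python using only the standard library; the registry and the register/verify functions are hypothetical, not part of any specific platform or standard.

```python
import hashlib

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 fingerprint of a media file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical registry: in practice this would be a signed, publicly
# auditable record published by the original source of the footage.
registry = {}

def register(path: str) -> None:
    """Record the fingerprint of a file at publication time."""
    registry[path] = fingerprint(path)

def verify(path: str) -> bool:
    """Return True only if the file matches its registered fingerprint."""
    return registry.get(path) == fingerprint(path)
```

A matching fingerprint only proves the copy is bit-identical to what was registered; it says nothing about whether the original footage was genuine, which is why real provenance schemes pair fingerprints with signatures from trusted sources.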

Deepfakes and similar advanced cyberthreats are coming at us so quickly and so frequently that legacy approaches to cybersecurity won’t work. What is required is a new, personalized, real-time 24x7x365 approach to cybersecurity and authentication. This personalized approach may require us to create virtual avatars that contain unique PII data elements that only we as individuals and authenticated third parties have access to. Think of it as a virtual-reality, form-fitting wrapper, a personalized digital cloak: one that facilitates our interactions with the world, authenticating and transacting in real time while remaining transparent to the user. Perhaps the melding of man and machine, the Singularity, is closer than we think.
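To make the idea of a personalized digital cloak slightly more concrete, the sketch below shows a challenge-response exchange in which a personal “avatar” proves its identity to a third party without revealing the underlying secret. This is a minimal illustration using an HMAC over a shared secret; the class and method names are hypothetical, and a real system would use asymmetric keys, hardware-backed storage, and continuous authentication rather than a single exchange.

```python
import hashlib
import hmac
import secrets

class PersonalAvatar:
    """Hypothetical personal authenticator holding a secret only the individual controls."""
    def __init__(self, secret: bytes):
        self._secret = secret  # in practice: hardware-backed, never exported

    def respond(self, challenge: bytes) -> str:
        # Prove possession of the secret without revealing it.
        return hmac.new(self._secret, challenge, hashlib.sha256).hexdigest()

class RelyingParty:
    """Hypothetical third party (e.g. a bank) that pre-registered the avatar's secret."""
    def __init__(self, registered_secret: bytes):
        self._registered_secret = registered_secret

    def authenticate(self, avatar: PersonalAvatar) -> bool:
        challenge = secrets.token_bytes(32)  # fresh challenge defeats replayed audio or video
        expected = hmac.new(self._registered_secret, challenge, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, avatar.respond(challenge))

# Example: a wire-transfer request is only honored after the avatar authenticates,
# so a convincing deepfake voice on a phone call is not enough on its own.
secret = secrets.token_bytes(32)
avatar, bank = PersonalAvatar(secret), RelyingParty(secret)
assert bank.authenticate(avatar)
```

The point of the sketch is that possession of a secret, not the sound of a voice or the look of a face, is what gets authenticated, which is exactly the gap the CEO voice-fraud case exploited.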


Deepfake audio reproduction of Joe Rogan’s voice.


Chinese face-swapping app “Zao”

Ctrl Shift Face – Jim Carrey’s face used in place of Jack Nicholson’s in a deepfake video clip from “The Shining”