The next phone call you get might be a deepfake


When deepfakes first emerged, the technology behind them was genuinely impressive. Despite their use in fabricating video footage and the like, they also gave us a clearer picture of the implications AI could have in the real world. Now it appears that the line between fact and fiction is only a hairline wide.

According to the Wall Street Journal, the chief executive of a UK energy company was tricked into wiring around $220,000 to a Hungarian supplier. Why? Because he thought the instructions came from his boss. In reality, the caller was a skilled criminal armed with AI deepfake software that mimicked the boss's voice and demanded payment within the hour. After all, if you got a call from someone who sounded exactly like your boss, you'd scramble to comply too, right?

The chief executive of a UK energy company was tricked into wiring around $220,000 to a Hungarian supplier. (Image Credits: TNW)

Adding to the authenticity, the software imitated not only the boss's voice but also his tonality, and even his German accent. The phone call was backed up by an email as well, lending the deepfake executive even more credibility. As for the money, it's long gone: it was moved through accounts in Hungary and Mexico and then dispersed around the world.

But it didn't stop there. The criminals made a second request. At this point, the chief executive called his actual boss and found himself fielding calls from both the fake and the real versions of his boss, which finally made him realize that some skullduggery was going on.

Fun fact: This isn't the first deepfake voice theft

According to cybersecurity company Symantec, there have been at least three other cases of deepfake voice fraud used to hoodwink companies into sending money to a fake account; one of them resulted in losses of several million dollars. All of this goes to show that AI research, especially around the generation of video and audio, can be a dangerous thing. Yet we use it unknowingly.

For example, at Google I/O, Google announced its Duplex service, which mimics the voice of a real human being so that it can make calls on behalf of the user. And it's not only Google: a number of startups in China offer similar services for free on smartphones.

Google Duplex mimics the voice of an actual human being to interact on behalf of the user. (Image Credits: Engadget)

To combat all this, researchers around the world are working on software that can detect deepfakes. But as the technology improves, so do the deepfakes, and it becomes increasingly hard to tell what is real and what isn't. We have our work cut out for us, but there are measures you can take. For example, if you get a call from your boss or a friend asking for cash, hang up, call them back on a number you trust, and ask a question only the real person would know the answer to. Constant vigilance is the key here.
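To give a feel for how detection research approaches the problem, here is a purely illustrative sketch: many detectors start from simple spectral statistics of the audio signal. The feature (spectral flatness) is real, but the threshold and the "suspicious" rule below are made up for demonstration and would not catch an actual deepfake on their own.

```python
import numpy as np

def spectral_flatness(frame: np.ndarray) -> float:
    """Geometric mean / arithmetic mean of the power spectrum.
    Values near 1.0 mean noise-like audio; near 0.0, tonal audio."""
    power = np.abs(np.fft.rfft(frame)) ** 2 + 1e-12
    return float(np.exp(np.mean(np.log(power))) / np.mean(power))

def flag_suspicious(audio: np.ndarray, frame_len: int = 1024,
                    threshold: float = 0.5) -> bool:
    """Toy rule: flag audio whose average flatness exceeds an
    arbitrary threshold (0.5 here is invented for this demo)."""
    n = len(audio) // frame_len
    frames = audio[: n * frame_len].reshape(n, frame_len)
    scores = [spectral_flatness(f) for f in frames]
    return float(np.mean(scores)) > threshold

# Sanity check on synthetic signals: white noise is spectrally
# flat, while a pure 440 Hz tone is strongly tonal.
rng = np.random.default_rng(0)
noise = rng.standard_normal(16000)
tone = np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)
print(flag_suspicious(noise), flag_suspicious(tone))  # True False
```

Real detectors replace this hand-picked statistic and threshold with learned models trained on large sets of genuine and synthesized speech, which is exactly why the arms race described above keeps escalating.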
