AI voice cloning was used in a massive heist now under investigation in Dubai, amid warnings about cybercriminals' use of the new technology.
In early 2020, a bank manager in Hong Kong received a call from a man whose voice he recognized: a director of a company he had spoken to before. The director had good news: his company was on the verge of an acquisition, so he needed the bank to authorize transfers to the tune of $35 million. A lawyer named Martin Zelner had been hired to coordinate the process, and the bank manager could see emails from the director and Zelner in his inbox confirming where the money needed to go. Believing everything appeared legitimate, the bank manager began making the transfers.
What he didn’t know was that he had been duped in an elaborate scam, in which fraudsters used “deep voice” technology to clone the director’s speech, according to a court document discovered by Forbes in which the UAE sought the help of US investigators in tracing $400,000 of stolen funds that went into US-based accounts held by Centennial Bank. The United Arab Emirates, which is investigating the heist as it affected entities inside the country, believes it was an elaborate scheme involving at least 17 people, which sent the stolen money to bank accounts around the world.
Few further details were given in the document, and none of the victims’ names were provided. The Dubai prosecutor’s office, which is leading the investigation, had not responded to requests for comment at the time of publication. Martin Zelner, a US-based lawyer, was also contacted for comment but had not responded at the time of publication.
This is only the second known case of fraudsters allegedly using voice-shaping tools to carry out a heist, but it appears to have been far more successful than the first, in which fraudsters used the technology to impersonate the CEO of a UK-based energy company in an attempted theft of $240,000 in 2019, according to the Wall Street Journal.
The UAE case shows how devastating such high-tech scams can be, and lands amid warnings about the use of AI to create so-called deepfake images and voices for cybercrime.
“Audio and visual deepfakes represent a fascinating development of 21st-century technology, but they are also potentially incredibly dangerous and pose a huge threat to data, money and businesses,” says Jake Moore, a former police officer with Dorset Police in the UK and now a cybersecurity expert at security firm ESET. “We are currently on the cusp of seeing malicious actors shift their expertise and resources into using the latest technology to manipulate people who are innocently unaware of the realms of deepfake technology and even their existence.
“The manipulation of audio, which is easier to orchestrate than creating deepfake videos, will only increase in volume, and without education and awareness of this new type of attack vector, as well as better authentication methods, more businesses are likely to fall victim to very convincing conversations.”
Once a technology confined to the realm of fictional capers like Mission: Impossible, voice cloning is now widely available. Various tech startups are working on increasingly sophisticated AI voice technologies, from Aflorithmic of London to Respeecher of Ukraine and Resemble.AI of Canada. The technology caused a sensation in recent months with the revelation that the late Anthony Bourdain had his voice synthesized for a documentary about his life. Meanwhile, recognizing the potential for malicious use of AI, a handful of companies, such as the $900 million security firm Pindrop, now claim to be able to detect synthesized voices and thereby prevent fraud.
If recordings of your voice are available online, whether on social media, YouTube, or an employer’s website, a secret battle for control of your voice may well be unfolding without your knowledge.
UPDATE: After publication, the UAE Ministry of Foreign Affairs and International Cooperation contacted Forbes to note that the bank concerned was in Hong Kong, not the UAE, although investigators in Dubai were leading the investigation. The article was updated on October 22, 2022 to reflect this.
In a statement, HE Hamid Al Zaabi, Director General of the UAE Executive Office for Combating Money Laundering and Terrorist Financing, added: “Even with incidents occurring outside the United Arab Emirates, we will work closely to identify and detect individuals who knowingly engage in deceptive practices such as impostor fraud. The UAE will then prosecute these individuals to the fullest extent of the law, ensuring they are held accountable and brought to justice expeditiously.”