AI Voice Cloning Scams Target Company Executives

Scammers Use AI to Clone Executives’ Voices and Defraud Victims

The number of scammers using voice cloning tools is rising rapidly. Criminals can draw on a wide range of material to train artificial intelligence models, including audio clips, podcasts, and online presentations. According to Fortune, just 45 minutes of audio is enough for the technology to learn to imitate a person’s voice.

Experts from cybersecurity firm Secure Anchor report that in the past four months alone, the number of such incidents has risen by 60%. Seventeen companies lost an average of $175,000 each due to voice cloning scams, and in one case, hackers gained access to a company’s IT systems.

How to Protect Against Voice Cloning Scams

One of the best ways to combat these scams is to establish protocols that define the conditions under which funds may be transferred and set limits on transaction amounts. For example, after receiving a call requesting money, passwords, or other sensitive information, employees should verify the request by calling back on a different phone using the organization’s official number, rather than any number the caller provides.

Executives can also consider using rotating code words for financial transactions. While scammers may be able to clone a voice, they are unlikely to answer specific security questions correctly.
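To make the rotating code word idea concrete, here is a minimal, hypothetical sketch in Python. It assumes both parties have pre-shared a secret and a private word list (both invented for illustration), and derives the current code word from the time window using an HMAC, so the word changes automatically every rotation period and a leaked word quickly expires. This is an illustration of the concept, not a vetted security implementation.

```python
# Hypothetical sketch: a time-rotating code word for verifying high-risk
# requests over the phone. Assumes a pre-shared secret and word list.
import hashlib
import hmac
import time

ROTATION_SECONDS = 3600  # rotate the code word every hour (assumption)

# Example vocabulary; a real deployment would choose its own private list.
WORDS = ["ledger", "harbor", "falcon", "quartz", "meadow", "copper"]

def current_code_word(shared_secret: bytes, now=None) -> str:
    """Derive the code word for the current time window from the secret."""
    t = time.time() if now is None else now
    window = int(t // ROTATION_SECONDS)
    digest = hmac.new(shared_secret, str(window).encode(), hashlib.sha256).digest()
    return WORDS[digest[0] % len(WORDS)]

def verify_code_word(shared_secret: bytes, spoken: str, now=None) -> bool:
    """Check a spoken word against the current and previous windows.

    Accepting the previous window tolerates a call that spans a rotation.
    """
    t = time.time() if now is None else now
    return spoken in {
        current_code_word(shared_secret, t),
        current_code_word(shared_secret, t - ROTATION_SECONDS),
    }
```

The caller states the word for the current window; the employee checks it independently. Because the word is derived from a shared secret rather than sent over the call, a scammer who has only cloned the executive’s voice cannot produce it.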

Stay Vigilant

As voice cloning technology becomes more accessible, it’s crucial for companies to educate their staff about these risks and implement strict verification procedures for sensitive requests. Taking these steps can help prevent significant financial losses and protect company data from cybercriminals.
