
The dangers of voice cloning: A new type of fraud

  • Writer: rrelentless
  • Jan 27
  • 5 min read

Updated: Feb 26



If you’ve spent any time on social media recently, you will have come across videos where voice cloning has been used to make it sound like celebrities are saying something wholly different from what they originally said. While this is mostly done for entertainment purposes, voice cloning – the process of using AI to replicate someone’s voice from even a brief sample – poses significant cyber security risks to businesses.

 

The danger arises from how easily voice data can be obtained and exploited for fraudulent purposes, identity theft and social engineering attacks.

 

Let’s take a look at some of the risks you should be aware of:  

 

1. Social Engineering and Phishing Attacks 

  • Imagine you get a telephone call from your CEO, senior executive, manager or trusted employee. They ask you to carry out a particular financial transaction or give them a password or security code. You see no problem – you recognise their voice; after all, you speak to them every day – and you do as they ask. But it isn’t them – someone has found a recording of their voice and cloned it, making you think you’re talking to a colleague when you’ve actually been in conversation with a fraudster.  

And they don’t necessarily need to find an existing recording of someone’s voice; they could also capture one via a seemingly innocent call to the company’s customer service line, requesting to speak to a manager. The conversation with the manager is recorded and then used as the sample for the AI recreation.

  • For many firms, voice interactions are used as a trusted form of identity verification (e.g., voice biometrics). In 2017, HMRC started to use voice ID to make managing taxes easier for taxpayers (it was later ordered to delete those voice records after privacy concerns were raised). If an attacker has cloned a customer’s voice, they can exploit this to bypass security systems that rely on voice ID, such as call centres or authentication services.


2. Fraudulent Transactions 

  • As we’ve seen above, voice cloning can be used to authorise financial transactions, change account details or request payments under the guise of someone with authority within the company, resulting in unauthorised financial transfers.

  • Cybercriminals could use voice cloning to pretend to be a key decision-maker to influence or manipulate contracts, agreements or negotiations, leading to potential financial or reputational damage. 


3. Loss of Trust and Reputation 

  • If a voice clone is used in a scam that compromises customer information or involves fraudulent actions, customers may start to doubt how robust the company’s security measures are. The resulting damage to the business’s reputation and brand loyalty can cause significant financial losses.

  • Employees may start to lose confidence in the security of internal or external communications channels, especially if their own voices can be easily cloned and used maliciously.  


4. Exploitation of Sensitive Information 

  • As we have seen above, cloned voices can be used to access confidential information, such as intellectual property, company secrets or private employee information. If these sensitive resources are leaked or sold on to other cyber criminals, this can lead to significant financial loss or competitive disadvantage.

  • If attackers can clone voices from publicly available material (like social media posts, videos or public speeches), or from a sample captured via a seemingly innocent phone call (as mentioned above), they could use personal characteristics or familiar phrases to bypass security protocols.


5. Difficulty in Detection 

  • As voice cloning technology improves, distinguishing between a real voice and a cloned one becomes increasingly difficult. This makes it harder to implement robust voice verification systems that protect against unauthorised access. 

  • Like many businesses, you may not be ready for the specific threats of voice cloning. Identity verification methods that were sufficient until now, such as passwords or PINs, are simply not designed for these new forms of attack.


6. Legal and Regulatory Implications 

  • Businesses in certain sectors and industries have to adhere to strict privacy and data protection regulations, such as the UK GDPR or the Data Protection Act 2018. If a regulated business suffers an attack which uses voice cloning, it may find itself facing fines, penalties, or legal action. 

  • If a voice cloning attack breaches a company’s security, the company may be held liable for failing to take adequate precautions to protect the sensitive data it holds. This may be particularly so if the breach leads to financial or reputational harm.

 

What can be done?  

It would be easy to despair in the face of this new threat, particularly as it is evolving all the time and safeguards put in place now may quickly become outdated. However, with the right guidance and expert advice from cyber security professionals, there are several safeguards you can put in place, and continue to update, to give your business the best chance against cyber criminals.


Multi-factor Authentication: 

It’s become clear that relying on voice ID alone for identity verification is no longer enough. Incorporating additional layers of authentication (such as SMS codes, biometrics or security questions) can help strengthen security measures. 
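To make this concrete, here is a minimal sketch in Python of that layered approach. The function name, scores and codes are invented for illustration only, not taken from any real verification product; the point is simply that a voice match on its own should never be enough to act on a request.

```python
import hmac

def verify_caller(voice_match_score: float,
                  submitted_code: str,
                  expected_code: str,
                  voice_threshold: float = 0.9) -> bool:
    """Hypothetical check: the caller must both sound like the account
    holder AND supply a one-time code delivered through a separate
    channel (e.g. SMS or an authenticator app)."""
    voice_ok = voice_match_score >= voice_threshold
    # compare_digest avoids leaking information through timing differences
    code_ok = hmac.compare_digest(submitted_code, expected_code)
    return voice_ok and code_ok

# Example: a cloned voice scores highly, but without the one-time code
# the request is still refused.
print(verify_caller(0.97, submitted_code="000000", expected_code="483920"))  # False
```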


Synthetic Voice Detection:

There are systems that can analyse the characteristics of a voice and detect signs of synthetic or cloned audio (sometimes described as audio deepfake or liveness detection). Making use of these tools can help identify cloned voices in a way that the human ear can’t.


Employee Awareness Training: 

Employees may have heard about voice cloning and perhaps seen videos where it’s been used, but they may not realise the real dangers it can pose. Educating all employees about the risks and showing them how to verify voice requests or report suspicious activity can help reduce the threat of social engineering.

 

Regular Security Audits: 

As we mentioned above, the technology behind cyber attacks in general, and voice cloning in particular, is evolving at a concerning rate. Continuously evaluating and updating security protocols, including voice-based authentication, to ensure they can resist emerging threats will therefore pay dividends.


Approval Control:

Setting up a hierarchy of approvals before a payment can be made will help to reduce the risk of fraud. This typically involves requiring multiple approvals from management, with the thresholds depending on the size and complexity of the business.
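As an illustration only, here is a minimal sketch in Python of how such a rule might look; the role names, amounts and thresholds are invented and would need to be set to suit your own business.

```python
# Hypothetical approval tiers: larger payments need more sign-offs.
APPROVAL_TIERS = [
    (1_000, 1),          # up to £1,000: one manager
    (25_000, 2),         # up to £25,000: two managers
    (float("inf"), 3),   # anything larger: three senior approvals
]

def approvals_required(amount: float) -> int:
    """Return how many approvals a payment of this size needs."""
    for limit, required in APPROVAL_TIERS:
        if amount <= limit:
            return required
    return APPROVAL_TIERS[-1][1]

def payment_allowed(amount: float, approvers: list[str]) -> bool:
    """A payment only proceeds once enough distinct approvers have signed off."""
    return len(set(approvers)) >= approvals_required(amount)

print(payment_allowed(50_000, ["finance_director"]))                # False
print(payment_allowed(50_000, ["finance_director", "coo", "ceo"]))  # True
```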


Segregation of Duties:

This is where the responsibilities of the payment process are split across several employees. No one person has control over the payment process, so even if one person is taken in by a voice cloning attack, the risk of unauthorised payments is significantly reduced.
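Continuing the same hypothetical sketch, segregation of duties can be expressed as one further rule: the person who raised a payment must never appear among its approvers.

```python
def segregation_respected(initiator: str, approvers: list[str]) -> bool:
    """The employee who raised the payment must not be one of its approvers."""
    return initiator not in set(approvers)

# Even if a fraudster's cloned voice convinces the person raising the payment,
# independent approvers can still query the request through a separate channel.
print(segregation_respected("accounts_clerk", ["accounts_clerk", "coo"]))    # False
print(segregation_respected("accounts_clerk", ["finance_director", "coo"]))  # True
```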

 

How can we help you?  

If you’re concerned about the threat from voice cloning, cyber insurance is vital to ensure that you are protected both proactively and reactively. The good news is that the new cyber insurance policy from rrelentless ticks all the boxes: comprehensive, continually updated, and with access to legal and practical advice from a specialist team of Cyber, Data and Information Law solicitors and data practitioners, it’s the security you need when you’re worried about your security. Don’t leave it to chance – take a look at what the rrelentless cyber insurance policy has to offer: https://www.rrelentless.com/cyber-insurance
