
UK Bank Reveals 28% Of Adults Have Fallen Victim To AI Voice Scam: 'It Can Clone Your Voice In 3 Seconds And Empty Out Your Bank Account'

Criminals are using AI to replicate voices and trick people into giving away money or personal information

By Vinay Patel @VinayPBPatel
Published 19 September 2024, 8:40 AM BST

AI-powered voice cloning scams are becoming increasingly sophisticated. Criminals are using this technology to impersonate individuals and deceive victims into sharing sensitive information or transferring money. Starling Bank is urging people to be vigilant and take precautions to protect themselves against these threats.

Starling Bank, a UK bank, has issued a warning about a surge in scams that use artificial intelligence (AI) to mimic individuals' voices, and is cautioning people worldwide about the emergence of AI voice cloning fraud. According to a press release, the bank is already handling hundreds of such cases, and these scams could potentially target anyone with a social media account.



Recent data released by Starling Bank shows that:

- 28 percent of UK adults believe they have been targeted by an AI voice cloning scam within the past year.
- Nearly half (46 percent) of UK adults are unaware that this type of scam exists.
- Only 30 percent of UK adults are familiar with the red flags to watch out for if they become the target of a voice cloning scam.

The Rise Of AI Voice Cloning Scams

The same data suggests that criminals can now replicate a person's voice using as little as three seconds of audio. To raise awareness about AI voice cloning scams, Starling Bank has launched the 'Safe Phrases' campaign in conjunction with the government's Stop! Think Fraud campaign. "People regularly post content online, which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters," said Lisa Grahame, chief information security officer at Starling Bank, in the press release.

Starling Bank's Safe Phrases campaign suggests that individuals establish a unique "Safe Phrase" known only to their close friends and family, which can then be used to verify a caller's identity during a conversation. If someone claiming to be a friend or family member does not know the agreed-upon Safe Phrase, the call should immediately be treated as a possible fraud attempt.

A reported case from Arizona, US, last year involved a woman who said scammers used AI to replicate her 15-year-old daughter's voice and demanded a $1 million ransom. An agreed Safe Phrase could have helped partially prevent this situation.
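The check itself amounts to comparing what a caller says against a pre-agreed secret. The snippet below is a minimal, purely illustrative sketch of that idea in Python; the phrase, function name, and example inputs are hypothetical, and in practice the Safe Phrase is spoken and checked by a person rather than by software.

```python
import hmac

# Illustrative only: in reality the Safe Phrase is agreed verbally and
# never shared or stored digitally, as the campaign advises.
AGREED_SAFE_PHRASE = "purple giraffe at midnight"  # hypothetical example phrase

def caller_knows_safe_phrase(spoken_phrase: str) -> bool:
    """Return True only if the caller can repeat the agreed phrase.

    Case and surrounding whitespace are normalised, then the strings are
    compared with hmac.compare_digest (constant-time comparison, which is
    overkill for a human check but standard hygiene for secret comparison).
    """
    expected = AGREED_SAFE_PHRASE.strip().lower().encode()
    offered = spoken_phrase.strip().lower().encode()
    return hmac.compare_digest(expected, offered)

if __name__ == "__main__":
    # A caller who cannot produce the phrase is treated as suspect,
    # however convincing the cloned voice sounds.
    print(caller_knows_safe_phrase("Purple giraffe at midnight"))  # True
    print(caller_knows_safe_phrase("It's really me, I promise"))   # False
```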

Financial fraud offences in England and Wales are on the rise as criminals employ increasingly sophisticated techniques to extort money. According to UK Finance, these offences increased by 46 percent last year.

The Dangers Of Voice Cloning For Personal And Financial Security

Last year, it was reported that fraudsters were posting fake job advertisements specifically targeting UK job seekers.

These job seekers were then deceived into selling counterfeit products online, with the fraudsters ultimately absconding with the profits. Starling's research also revealed that the average UK adult has been targeted by a fraud scam five times in the past twelve months.

"Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it's more important than ever for people to be aware of these types of scams being perpetuated by fraudsters and how to protect themselves and their loved ones from falling victim," the top executive added. She said, "We hope that through campaigns such as this, we can arm the public with the information they need to keep themselves safe.

"Simply having a Safe Phrase in place with trusted friends and family - which you never share digitally - is a quick and easy way to ensure you can verify who is on the other end of the phone." To launch the campaign, Starling Bank has enlisted renowned actor James Nesbitt, whose voice has been cloned using AI technology, underscoring the ease with which anyone could fall victim to such scams.
