“Millions” of people could fall victim to scams using artificial intelligence to clone their voices, a UK bank has warned. Starling Bank, an online-only lender, said fraudsters can use AI to replicate a person’s voice from just three seconds of audio, such as a clip from a video the person has posted online. Scammers can then identify the person’s friends and family members and use the AI-cloned voice to stage a phone call asking for money.

These types of scams have the potential to “catch millions out,” Starling Bank said on Wednesday. They have already affected hundreds of people. According to a survey of more than 3,000 adults, which the bank conducted with Mortar Research last month, more than a quarter of respondents said they have been targeted by an AI voice-cloning scam in the past 12 months.

The survey also showed that 46 per cent of respondents were not aware such scams existed, and that 8 per cent would send as much money as a friend or family member requested even if they thought the call seemed strange. “People regularly post content online which has recordings of their voice without ever imagining it’s making them more vulnerable to fraudsters,” Starling Bank chief information security officer Lisa Grahame said. The bank is encouraging people to set a “safe phrase” with their loved ones — a simple, random phrase that’s easy to remember and different from their other passwords — which can be used to verify their identity over the phone.