A lot of TikTok videos have been going around lately where people tell stories about some really scary experiences.
One user spoke about getting a phone call from her daughter’s number, asking her to come and open the door because she was outside. The mother, confused, asked why she’d be outside when she had just dropped her off at school. Realising something was wrong, she quickly asked the “daughter”: “What school do you go to?” The person on the line couldn’t answer the question, and this raised suspicion.
Another user screen-recorded a Zoom meeting in which a man sat in an office, claiming to be a certain corporate worker. The office set-up included a certificate hung on the wall bearing the “name” of the worker, who seemed to hold a senior position. The user was not convinced, so he asked the man to wave his hand over his face.
The man began to panic and get defensive, claiming the user was “taking it too far”, and then hung up. Essentially, the man was using a deepfake filter to disguise himself as the worker. The user intuitively asked him to wave a hand over his face because the motion distorts the filter.
These are only two of many instances where phone calls and everyday online interactions are being used to scam or even endanger people. Scammers are now playing on a very sensitive part of our lives: the people we love and trust. They prey on the fact that we’d do anything for our loved ones, for a job, or even for survival.
Can AI Voice Calls Really Sound Like Someone Known?
These AI phone scams copy voices using short clips taken from messages or social media. Mobile provider O2 reports blocking over 50 million scam calls each month, showing how often these calls reach phones across the UK.
Tech CEO Naveed Janmohamed from AI research assistant Anara explains how quickly the technology has changed. He says, “Today’s AI needs just seconds of your voice to create convincing clones, letting criminals pose as your bank or tax office without setting off alarm bells.”
Janmohamed says, “The aim is to get hold of personal details, bank information, or straight-up cash – and they’re using urgency tactics to make victims act fast.”
What Gives Away An AI Voice Scam?
AI calls can sound familiar, but there are behaviours that often give them away during conversation. Janmohamed says that replies sometimes pause in unusual ways after questions are asked.
He also describes background sound changing without reason, sometimes cutting out or switching to odd static during the call. He adds that unexpected questions can lead to vague replies or delays that feel unusual in real conversations.
If a caller asks for bank details or money transfers, that request alone is a strong warning sign. Official organisations do not ask for sensitive information to be shared during unsolicited calls.
Ending the call and using an official contact number taken from a bank card or website reduces exposure to fraud attempts.
Why Are AI Scams Becoming More Common?
Scam calls already affect many households across the UK. National Trading Standards reports that 73% of adults were targeted by scam calls last year, while 19 million people reported losing money.
Only 1 in 3 victims reports these incidents to authorities. Many people stay silent due to embarrassment or belief that reporting will not change outcomes.
Romance scams are also growing. Barclays scam data shows that victims lost an average of £7,000 in 2025, while 66% of UK adults say AI tools make online dating scams harder to detect. Barclays also reports that 53% feel uneasy about voice or image impersonation.
How Can People Reduce Risk At Home?
Families reduce risk through simple habits around phone use and online contact. Janmohamed says, “If something sounds off about a call discussing your money or asking for personal details, it almost certainly is,” and adds, “Hang up immediately.”
He also advises calling organisations back using numbers found on official cards or websites instead of relying on incoming calls. Calls that demand fast action or secrecy are a red flag and often signal a fraud attempt.
Barclays scam data shows many people want more action from technology companies, with 84% calling for better prevention of scams at source.
Families should agree on shared passwords or safe words, known only within close circles, to confirm identity during unexpected calls claiming to be from relatives.