Featured on BBC One’s Northern Justice & Morning Live

Financial Firms Should Use Artificial Intelligence to Stop Scams, According to the FCA

The Financial Conduct Authority wants firms to use new technology to beat the fraudsters at their own game.

The Financial Conduct Authority (FCA), an independent regulatory body, has urged financial services firms to look at new technology, specifically artificial intelligence, to help combat scams.

Artificial Intelligence (AI) has been defined as “the simulation of human intelligence processes by machines, especially computer systems”. It is evolving rapidly and changing how most industries operate. AI is older than you might think, with the first successful AI computer program documented back in 1951. Using computers to process vast amounts of data faster than humans means that different aspects of AI have developed, including:

Machine learning: Acquiring data and creating rules (algorithms) that tell computers how to complete a specific task. Over time, the software can refine these rules itself, interpreting new information more accurately.

Deep learning: A subset of machine learning, using neural networks (based on what we know about the brain’s structure). This category includes self-driving cars and ChatGPT, a natural language processing tool driven by AI technology that allows you to have human-like conversations.
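The machine-learning idea above can be made concrete with a toy sketch. Everything in it is invented for illustration (the data, the "normal"/"unusual" labels and the threshold rule); it simply shows the core idea of deriving a rule from labelled examples rather than hard-coding it:

```python
# Toy machine-learning sketch (illustrative only): "learn" a rule from
# labelled examples instead of writing it by hand. Here the learned rule
# is a simple threshold separating two classes of one-dimensional values.

def learn_threshold(examples):
    """examples: list of (value, label) pairs, label in {'normal', 'unusual'}."""
    normal = [v for v, lab in examples if lab == "normal"]
    unusual = [v for v, lab in examples if lab == "unusual"]
    # Place the boundary midway between the largest 'normal' value
    # and the smallest 'unusual' value seen during training.
    return (max(normal) + min(unusual)) / 2

def classify(value, threshold):
    return "unusual" if value > threshold else "normal"

training = [(10, "normal"), (25, "normal"), (40, "normal"),
            (900, "unusual"), (1500, "unusual")]
threshold = learn_threshold(training)
print(classify(30, threshold))    # normal
print(classify(1200, threshold))  # unusual
```

Real systems learn far richer rules from millions of examples, but the principle, rules derived from data rather than written by a programmer, is the same.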

Many businesses have been using artificial intelligence for years, to assist with functions like customer service or quality control. Not everyone likes messaging a website ‘chatbot’ or speaking to a ‘robot’ on the other end of the phone, but AI in customer service is probably here to stay!

In the financial services industry, automation of repetitive tasks, such as analysing customer data and forecasting trends, is commonplace, as is leveraging "Big Data", the vast quantity and variety of information a business holds about each customer, to improve the customer experience.

However, the industry will have to evolve and adapt, just like the technology.

We have previously discussed scams where people receive a text message or phone call claiming to be from their bank or a trusted business. Text message scams (also known as 'smishing') may contain a phone number and ask you to call to verify your identity, so that you end up speaking to the scammer directly. The BBC's Scam Interceptors programme tries to intercept these calls and stop the fraudsters in their tracks.

But a more sophisticated type of scam is on the rise. Emad Aladhal, director of a financial crime team at the FCA, recently highlighted on the regulator's podcast a worrying trend: fraudsters are cloning people's voices using AI technology.

“One particular example that’s almost scary is that a parent picks up the phone call, thinks it’s their son, but it’s not their son. It’s their voice being spoofed using that technology and made to believe that they’re in trouble and in need of money.”

This is a type of Impersonation Scam where fraudsters pretend to be a family member whom you wouldn't want to see struggling for money, taking advantage of that close relationship. Often, the request comes with a sense of urgency, meaning the usual common-sense checks are more likely to be forgotten and money is transferred without question. In one such case, the voice of a 15-year-old girl was cloned and used to try to extract a kidnap ransom from her extremely distressed mother.

You may have seen in the news recently that Martin Lewis, founder of the popular consumer website, Money Saving Expert, was targeted when a fake video appeared of him endorsing “Elon Musk’s new project”. His face, the way in which he spoke and the words and phrases he used were very convincing, but Mr Lewis is very vocal about NOT doing adverts and NEVER promoting investments.

A common tactic used by fraudsters is to be on the telephone with their victim at the same time as gaining access to their phone or computer, so they can see the screen, login details for online banking, and the questions being asked. They can then coach their victim on how to pass online banking security, making sure that transactions go through easily, without any further checks or delays. This might include giving false information about the purpose of the payment or making a few smaller payments rather than one large one to look less suspicious.

In the same podcast, the FCA's Mr Aladhal also explained how financial firms are increasingly using AI technology to analyse customer behaviour on online banking apps, looking for unusual activity and 'red flags'.
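As a purely hypothetical sketch of the kind of 'red flag' checks described here (real banks' systems are proprietary and far more sophisticated; the rules, thresholds and field names below are all invented for illustration):

```python
# Hypothetical rule-based 'red flag' checks on a single payment.
# All thresholds and field names are invented; this is a sketch of the
# idea, not any bank's actual fraud-detection system.
from statistics import mean, pstdev

def red_flags(history, payment):
    """history: the customer's past payment amounts;
    payment: dict with 'amount', 'payee' and 'new_payee' keys."""
    flags = []
    avg, spread = mean(history), pstdev(history)
    # Flag amounts far outside this customer's usual spending pattern.
    if payment["amount"] > avg + 3 * max(spread, 1):
        flags.append("amount far above this customer's usual spending")
    # A first payment to an unknown recipient is a classic warning sign.
    if payment["new_payee"]:
        flags.append("first payment to this payee")
    return flags

history = [20, 35, 50, 42, 28]
payment = {"amount": 2000, "payee": "Unknown Ltd", "new_payee": True}
for f in red_flags(history, payment):
    print("Red flag:", f)
```

A real system would weigh many more signals (device, location, time of day, payee reputation) and score them statistically, but the principle of comparing each transaction against a customer's normal behaviour is the same.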

Traditionally, due diligence within banks has involved Anti-Money Laundering (AML) checks, such as confirming a person's identity, asking about the intended purpose of a payment, or checking the relationship between customer and recipient. Good practice could include asking customers whether they have been coached to answer questions in a certain way before the bank authorises a payment. Any tool that makes a customer stop and think "Could this be a scam?" is helpful, both in preventing financial loss and in improving the banks' security systems.

Mr Aladhal went on to say that firms should be sharing the intelligence they have gained, both with other financial organisations and with the police, so that everyone can learn and “respond to it as a collective”.

Sarah Spruce, Head of TLW Solicitors’ APP Fraud team, says:

“Artificial Intelligence solutions can be powerful, working faster and more accurately than humans. But we’ve also read the terrifying news reports about scammers pretending to be family members who need financial help and can’t help thinking that the number of such cases we’ll see will rise. A collective approach by the banks, online security experts and the police can only be welcomed. This has to be an evolving process – not only keeping up with, but trying to be a step ahead of the scammers.”

Please contact us if you have been the victim of a scam involving Artificial Intelligence or other Authorised Push Payment Fraud.

Call us on 0800 169 5925, email us at info@tlwsolicitors.co.uk or complete one of the online forms below.

Getting advice as soon as possible is important, as strict time limits can apply.

Minimum case values apply.

Meet Our Team

Meet Sarah, who heads up our experienced Authorised Push Payment Fraud Claims team.

Sarah and her colleagues are on hand to help with your claim.

TLW Solicitors pledge to:

  • Always fight your corner.
  • Explain anything you don't understand.
  • Provide full transparency on our charges.
  • Never ask for any upfront payment.
  • Recover the best compensation we can.
  • Keep your personal information safe.
  • Respond quickly to any queries.