Grant McGregor Blog

Deepfakes: When Familiar Voices and Faces Become a Business Risk

Written by Grant McGregor Team | 03/03/26 17:59

Imagine your CEO calling with an urgent request for a payment. Would your finance team stop to verify that it was really them?

 

What if someone with a familiar voice on Microsoft Teams reported a critical system failure and asked for immediate admin access?

 

Would your team pause, or would they act quickly because the voice sounds exactly like a trusted colleague?

 

For years, cyber security training has focused on suspicious links and phishing emails. That guidance still applies, but impersonation has evolved.

 

The threat is no longer just written text. It can arrive as a voice you recognise, or a face you trust. It feels like part of the workday routine.

 

Defining the threat: what is a deepfake?

A deepfake is synthetic audio or video created with artificial intelligence to recreate a person’s voice or appearance. With only a short clip of someone speaking, criminals can generate convincing content that appears to come from someone you know.

 

Most people instinctively trust what they see and hear.

 

When a request appears to come from someone familiar, our guard naturally lowers.

 

The technology behind this is widely accessible, and voice cloning does not require specialist equipment. Highly realistic audio and video can now be created quickly using readily available tools. It does not need studio-level perfection. It only needs to feel believable in the moment.

 

A costly case of mistaken identity

You may remember the deepfake scam involving Martin Lewis. Fraudsters created a realistic video of him endorsing a fake investment scheme.

 

Despite Lewis repeatedly warning that he does not promote such investments, the visual familiarity persuaded people to send money. The video was imperfect, but it felt real enough.

 

In another case, a Ferrari executive received a call that appeared to come from the company’s CEO. The voice matched. The context made sense.

 

The attempted fraud collapsed because the executive asked a simple question about a recent book recommendation. The caller could not answer. That basic verification step stopped what could have been a serious financial loss.

 

Why smaller organisations are exposed

Deepfake attacks are not limited to multinational brands or public figures.

 

Smaller organisations often operate on speed and trust. Informal communication, lean teams and quick decisions are common strengths, but they can also create opportunities for impersonation.

 

When a request sounds like it comes from a senior colleague, people tend to act quickly without questioning it.

 

Access is granted because a request sounds urgent. These attacks target human judgement rather than technical weaknesses.

 

Rethinking verification

Relying on a recognised voice alone is no longer enough. Many organisations are introducing straightforward confirmation steps for high-risk actions.

 

The shift is simple: instinct supported by process.

 

Practical steps for SMEs

 

  • Switch channels:
    If a request for money or sensitive data arrives via one platform, confirm it through another trusted method.

  • Pre-set verification:
    Agree internal challenge questions for sensitive requests — details that would not be publicly available.

  • Support the pause:
    Make it clear across the organisation that slowing down to verify is good practice. No one should feel pressured to act instantly on an urgent request.

 

In short

Deepfakes are becoming part of the modern fraud landscape.

 

The greatest risk arises when a request feels urgent and sounds familiar. Verification adds a layer of protection for both people and the business.

 

It comes down to this: can identity be confirmed before action is taken, and has the team been trained to pause and verify?

 

If you would like to review how your organisation handles high-risk requests or remote access approvals, our team is here to help.

 

Call us: 0131 603 7910

Message us: https://www.grantmcgregor.co.uk/contact-us