
Disputes Nightmares: What would you do if your Board was subjected to a deepfake fraud scam?

Posted on 13 February 2025

John Sendama, Partner

Mishcon de Reya

Good afternoon, we’re just going to give people a moment or two to sign in and then we will get started.

Good afternoon, everybody. Before we begin, I am going to hand over to our Managing Partner, James Libson, to say a few words by way of introduction. James.

James Libson

Mishcon de Reya

Thank you, John, and welcome to our latest Disputes Nightmares session. I'm really delighted that so many of you could join and, with that, I will pass it back to John.

John Sendama, Partner

Mishcon de Reya

Thanks very much, James. That wasn't in fact our Managing Partner, James Libson; that was a deepfake based on a photograph. James, if you could please reveal yourself. Thank you. As we've all seen in one form or another, deepfakes involve photos, videos or audio clips that seem real but are in fact generated by artificial intelligence. In research conducted by Deloitte last year, 26% of the executives surveyed said they'd experienced one or more deepfake incidents in the preceding twelve months alone. Deloitte's research division estimates that GenAI could enable fraud losses in the US to reach $40 billion by 2027, up from $12.3 billion in 2023. With the spread of affordable GenAI and machine learning tools, and massive investment in the development of new AI models, this is a clear and present danger for the business world.

When business leaders were polled on the biggest organisational risks from this threat, they ranked loss of trust from business stakeholders, including employees, investors and vendors, first, followed by compromised proprietary data, financial loss and reputational damage. Last year there were a number of high-profile, high-value examples of deepfake fraud, including a UK engineering company whose Hong Kong employee was duped into transferring 20 million into the hands of fraudsters. That employee was asked to join a video conference with several participants, not unlike this one; the other people on the call were all digitally cloned versions of senior officers of the company. This wasn't a cyber attack in the traditional sense, as none of the systems were compromised. It was technology-enhanced social engineering: using tech to circumvent the security barriers that businesses have put in place over many years of training, communications and company policies.

My name is John Sendama. I'm a disputes lawyer here at Mishcon de Reya and I specialise in fraud investigations, data theft and asset recovery. To help me explore the risks and the practical tools to combat deepfake fraud, I'm joined this afternoon by my partner, Joe Hancock, who heads up our Cyber Risk and Complex Investigations practice. He leads a team of non-lawyer specialists working on cyber security, incident response, resilience, investigations and asset tracing. I'm also joined by Paul Boskma of Mitek Systems, which specialises in cutting-edge software for digital identity fraud detection and biometric authentication. If you have any questions during the session, please put them in the Q&A box at the bottom and we'll do our best to address them. If you have any technical problems, please put those in the Chat box.

So, starting off, Joe, this type of fraud is built on traditional social engineering.  What are some of the red flags that people should look out for in real time and in your view, do they differ at all in a deepfake fraud context?

Joe Hancock, Partner (non-lawyer)

Mishcon de Reya

Thanks, John, good question. I've had a saying for the last ten years that everything we see is, to a certain degree, an old crime committed in a new way. These frauds are deeply rooted in confidence tricks, whether you find them in the traditional money laundering and fraud world or in the new deepfake world. Setting aside the more technical aspects, and the fact that we tend to believe what's in front of us, you often see the usual signs. Attackers look to exploit hierarchy: pretending to be someone more senior in an organisation instructing someone more junior. People in organisations are naturally used to being instructed in that way, especially when it comes to making payments or carrying out actions, and how that hierarchical power plays out varies from culture to culture.

You'll also see the classic pairing of confidentiality and urgency. Confidentiality first: "don't tell anyone about this, it's a very special project, only you know about it." That has the added benefit of making the recipient feel trusted and engaged, and it separates them from colleagues who might say, "Really? Are you sure the CFO wants you to pay £10 million over there?" Then the urgency: "this is really critical, we've got to do this now, I'm trusting you to do it, don't tell anyone else." We've seen this play out without the deepfake component many times, via email and via phone calls without voice cloning. Now you have the added problem that your eyes are seeing the person, you're getting messages that appear to be from them, and you're hearing what sounds like their voice. But it's still those old-school confidence tricks: make it urgent, make it secret, and exploit the position of someone who doesn't want to get into trouble. Ultimately, that's how fraudsters will try to influence you to do something that, if you stopped to think about it, you really wouldn't want to be doing.

John Sendama, Partner

Mishcon de Reya

And Paul, from a technical perspective, what's your view on the current state of the deepfake technology being used in fraud, and how easy is it for people to detect this type of fraud in real time?

Paul Boskma, Senior Director

Mitek Systems

Yes, we definitely see a trend at the moment: software that is easy to find online and easy to use can create a deepfake and deploy it in a video conference like this one. A year or two ago this technology was still very hard to find, and you needed the right hardware to render a real-time face morph, replacing one face with somebody else's, on a laptop. Today it's easily accessible. There are online tools you can buy for $10 a month, with plug-ins that make it very easy to join a session like this with your face replaced by someone else's. The problem with the current state of deepfakes is that for a human being it's almost impossible to detect whether you're talking to a real person, as with your colleague when we started the meeting with "James", or to a face morph. My colleague John, who is also on the call, can show how easy it is to capture somebody's face: all you need is a portrait, or even a screen grab taken with a snipping tool during a session like this, and in real time you can replace your face with that person's. Looking at these faces, you can see it's impossible to tell whether a real person is talking to you live or you're having a conversation with somebody else's face. Even with the sun shining from a side window, the glare appears on the morphed face. That's why I think technology needs to be embedded to detect deepfakes in sessions like this.

John Sendama, Partner

Mishcon de Reya

Moving on to a scenario in which the worst has happened: Joe, from your experience of incident response work, could you talk to us about the importance of first responders in an organisation and what they would do in an incident such as this?

Joe Hancock, Partner (non-lawyer)

Mishcon de Reya

Yes, absolutely, thanks John. There are two threads to it: a cyber incident response angle, if you want to call it that, accepting that to a certain degree all frauds are now digital frauds and cyber frauds, and a fraud incident response angle. I'll defer to you on the legal options because I'd make an absolute hash of them. With anything involving financial fraud, you have a golden 24 hours to really get on top of what has occurred. There's only so fast that funds can move through the fiat banking system, and there's a wealth of evidence that needs preserving; rapid and effective action in that first 24 hours puts you in the best place. Rapid because you have a time limit, and effective because you need to do the right things: preserve evidence, stop funds moving, and come up with a good plan to deal with the incident. If you can achieve two out of those three things, you're in a good place. The cyber side is capturing the digital evidence, working out what's occurred and making sure you're in the best position to investigate. That doesn't necessarily mean capturing everything with a view to a criminal prosecution; it's making sure you're not losing anything as time passes and things age away. On the financial side, it's acting really quickly: if it's a traditional banking payment, make sure you've dealt with your own bank and the recipient bank, alerted them that proceeds of crime are moving, and engaged specialist advice. So, in summary, those to me are the two key things: act within 24 hours and you have a fantastic chance of recovery action, and it's all about preserving evidence and being rapid and effective. So I said two things and gave you three; one extra for free.

John Sendama, Partner

Mishcon de Reya

That all makes sense. From my perspective as an asset recovery lawyer, the first responder teams we deal with on the business side involve IT teams, legal teams, communications teams and, obviously, the key senior decision makers. The gold standard is where those first responders have rehearsed their responses to cyber incidents; that's a far more powerful response than a dusty policy sitting in a drawer somewhere with people's names on it. The key points are for people to know what their role in the response is, how to perform it, and to have some understanding of the tools available to the team and why the golden 24-hour period you mentioned is so critical. When I'm in hot pursuit of stolen funds, it's actually possible to get a very unhappy judge out of bed and obtain a Court Order over the phone to freeze stolen assets that have landed in a bank account belonging to fraudsters. The quicker that's done, the more likely the money is still going to be there, because often funds hit the fraudster's first account and very quickly afterwards are transferred to onward accounts, often outside the country. As you mentioned, Joe, a variety of legal tools are available to deal with this type of fraud, whether worldwide freezing injunctions that freeze assets in the bank account, or third party disclosure orders against organisations like banks or internet service providers, forcing them to provide details such as account opening information, passports, address information and bank statements. Those details feed into the work of investigators such as your team, to build out an asset recovery strategy and move quickly to recover any funds that have been paid out.

Paul, just a quick question from your perspective: running in parallel with any recovery steps being undertaken, what are your thoughts on the importance of preserving and collecting evidence in the event of an incident?

Paul Boskma, Senior Director

Mitek Systems

From a technical perspective, it's very important to use a component that can tell you in real time, while you're having a conversation with somebody, whether this is a real person, yes or no. That means implementing an AI model. A technology provider like us can deliver these models as, for example, a plug-in to a Zoom session like the one we're in today: in real time, we analyse frames of the person you're having the conversation with and tell you on screen whether it's a real person. This technology is supported by AI models we have built, and for us it's important to record which frames we collected and what the scores for those frames were, so that if you're caught up in a scam, we have stored that information and can prove it was a deepfake, not a real image.
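To make the mechanics Paul describes concrete, here is a minimal sketch of a real-time frame-scoring loop with an evidence log. It is illustrative only: `score_frame` is a hypothetical placeholder for a vendor liveness model (Mitek's actual plug-in API is not shown here), and the capture loop uses OpenCV.

```python
# Minimal sketch of real-time deepfake scoring with an audit log.
# score_frame() is a hypothetical placeholder for a vendor liveness
# model; the capture loop and JSONL evidence log are illustrative only.
import json
import time

import cv2  # pip install opencv-python

THRESHOLD = 0.5  # frames scoring below this are flagged as suspect


def score_frame(frame) -> float:
    """Return a liveness score in [0, 1]; 1.0 means confidently live.

    Placeholder: a real deployment would call the detection model here.
    """
    return 1.0


def monitor(camera_index: int = 0, log_path: str = "liveness_log.jsonl") -> None:
    cap = cv2.VideoCapture(camera_index)
    with open(log_path, "a") as log:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            score = score_frame(frame)
            # Persist a timestamp and score per frame so the decision
            # can be evidenced later, as Paul suggests.
            log.write(json.dumps({"ts": time.time(), "score": score}) + "\n")
            if score < THRESHOLD:
                print(f"WARNING: possible deepfake (score={score:.2f})")
    cap.release()


if __name__ == "__main__":
    monitor()
```

A real plug-in would also persist the flagged frames themselves, since the scored images are the evidence that matters if a recovery claim follows.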

John Sendama, Partner

Mishcon de Reya

Thanks. In terms of preventing this type of fraud, which can obviously be catastrophic for a business for the reasons we've discussed, Joe, in your view, what steps are the best way to mitigate the risk of an attack like this happening in the first place?

Joe Hancock, Partner (non-lawyer)

Mishcon de Reya

There are two sides to it. One comment first: what's really interesting about where this world is developing is that we don't see deepfakes all the time at the moment; this isn't happening day in, day out. But as these services get easier to consume and the computing power required to make deepfakes gets smaller, and we've seen today how easy it already is, quality will go up and you're going to see more. As an investigator, you used to be able to say that if you've seen something, it is what it is; we now work in a world where what's in front of you can be faked. So, two things. The first is a really boring answer: do the basics well. That's been my experience across most of cyber security and most of fraud. Deepfake or not, if it is not possible for someone to make a payment from an organisation's bank account without involving somebody else, you will always have two people involved. I've seen financial controls bypassed, I've seen all sorts of things happen, but ultimately you can engineer around some of this if you have a basic level of fraud awareness and you encourage people to speak up. Don't get me wrong, the basics are hard at scale and complexity, and they're especially hard when we incentivise people in their jobs to click links, open attachments, answer phone calls and make payments, and then tell them to be careful clicking links, opening attachments and believing that what's in front of them really is their financial controller. One of the real keys is a blame-free reporting mechanism. You need a central route, report it to security, "is this real, I'm a bit suspicious", but also permission for people to go to someone more senior in the hierarchy, perhaps via a different channel, and say "can I just confirm this with you?" At worst they feel a little silly when the answer is "yes, of course it's me, what are you on about", but there is no blame for asking, and in the ideal world they're held up as an exemplar of what you should do. You absolutely need that.

The second point, to create that sense of suspicion, is training and awareness. I think the days of training everyone that all this bad stuff is out there are behind us; we've all been on those courses where you click "next" repeatedly to get through your mandatory yearly "bad stuff happens" training. Instead, focus training on this issue on the teams that will really be impacted, which tends to be those that control finances, those that control access or security, and those in privileged positions. Take a risk-based approach to that training, provide real-life examples, things you hope will set that sixth sense tingling, and pair it with a reporting mechanism by which someone can say "hang on a minute, that's a bit weird, I'll just go and ask John whether he really meant to tell me to pay that amount of money over there." I always bang on about this: world-class boring basics, plus a bit of awareness and a bit of reporting, gets you a long way.
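Joe's point that no one should be able to move money alone is, in systems terms, a dual-control check. The sketch below is a hedged illustration of that idea, not any particular banking platform's API; all names are invented.

```python
# Illustrative dual-control ("four-eyes") check on payment release.
# All names are hypothetical; real organisations enforce this inside
# the payment platform itself rather than in application code.
from dataclasses import dataclass, field


@dataclass
class Payment:
    payee: str
    amount: float
    approvals: set = field(default_factory=set)

    def approve(self, officer: str) -> None:
        self.approvals.add(officer)

    def release(self) -> None:
        # Two *distinct* approvers are required, so a lone employee
        # duped by a deepfaked "CFO" cannot move funds alone.
        if len(self.approvals) < 2:
            raise PermissionError("dual control: second approver required")
        print(f"Released {self.amount:,.2f} to {self.payee}")


payment = Payment("Acme Supplies Ltd", 10_000.00)
payment.approve("first.officer")
try:
    payment.release()  # blocked: only one approval so far
except PermissionError as err:
    print(err)
payment.approve("second.officer")
payment.release()  # permitted: two distinct approvers
```

The design point is that the control sits on the action, not on the instruction: however convincing the impersonation, the payment still stalls until a second human looks at it.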

John Sendama, Partner

Mishcon de Reya

And Paul, from your perspective, you mentioned earlier that having real-time detection technology in place is going to be key. What is your view on current adoption rates of this type of technology by businesses?

Paul Boskma, Senior Director

Mitek Systems

In online identity verification programmes, where people are onboarding, for example to open a new bank account, this technology is already embedded: we check in real time whether this is a real person or, for example, a deepfake trying to onboard for a new bank account. It isn't really available or implemented yet for calls like these Zoom sessions, although the technology is the same: it's the same camera being consumed, the same frames being collected, and the same models that can detect the deepfakes and alert the user on the call to watch out, you may be involved in something, take a little more time before you make a payment. The technology is available now: Mitek can deliver plug-ins, for Zoom for example, that do real-time deepfake detection based on what we've shown today.

John Sendama, Partner

Mishcon de Reya

Thanks, and just moving to some of the questions in our Q&A with our remaining time. One of the questions asked is whether we can give examples of what action should be taken within the first 24 hours. Joe, would you be able to give a few thoughts on the specific steps to take in that golden 24-hour period?

Joe Hancock, Partner (non-lawyer)

Mishcon de Reya

Yes, absolutely, just quickly. The thing I always say is that you need to get a grip of the incident, which really means controlling the situation rather than letting it control you. The first 24 hours tend to be a bit chaotic: what's happened, who's done it, where are things going? Sometimes people are a bit reticent about coming forward and talking about what's happened, so get good information and build up a factual matrix of what's occurred. One of the key things is to preserve the technical evidence you're going to need: if technical systems have been involved, be it email, your financial system, or communication systems like Teams or Zoom, get hold of the technical data and hold it somewhere. On the financial side, I always say overreact to initial weak signals. There's no harm in contacting your own bank and saying "we think this is potentially fraudulent"; you can undo that later. These are no-regrets actions: if you're in that first 24 hours working out what's gone on, stop payments and don't let things continue. Pausing payments is unlikely to kill your business, and you'll know if it would. So those, to me, are the two key things that enable you to really get a grip of the situation.

John Sendama, Partner

Mishcon de Reya

Yes, just building on what you said there, Joe: from my perspective the critical step, unsurprisingly, is making sure the incident has been reported to the business's bank, because the bank will normally send a message over the SWIFT system to the receiving bank that immediately says "this is a suspected fraud, please recall or freeze those funds in your account." The quicker that happens, the more likely you are to capture the funds while they're still in the account, so where there's been a theft of funds, that's one of the most critical steps. I think we've got time for one more question, and this one is for you, Paul. The detection plug-ins you mentioned earlier: do they work for voice notes and voicemails, or just for visual deepfakes? What is the current position?

Paul Boskma, Senior Director

Mitek Systems

Well, Mitek Systems builds its own proprietary technology for deepfake detection: models that tell you whether the image you see on your screen is a live person or a rendered morph of somebody else's face. We can do the same for voiceprints. With voice, provided the quality of the voice channel is good enough, we can tell you whether it's a live voice or a recorded one. And we can combine the two into one check, verifying voice and face at the same moment and telling you: you're dealing with a live person, and it is their real voice.
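As a hedged sketch of the combined check Paul describes, face and voice liveness can be fused into a single decision. Both scoring functions below are hypothetical placeholders, not Mitek's actual models or API, and the thresholds are arbitrary example values.

```python
# Illustrative fusion of face and voice liveness into one decision.
# Both scorers are hypothetical placeholders standing in for real
# detection models; thresholds are arbitrary example values.

FACE_THRESHOLD = 0.5
VOICE_THRESHOLD = 0.5


def face_liveness(frame) -> float:
    return 1.0  # placeholder: a real model scores the video frame


def voice_liveness(audio_window) -> float:
    return 1.0  # placeholder: a real model scores the audio window


def is_live(frame, audio_window) -> bool:
    # Require BOTH channels to pass: a face morph paired with a cloned
    # voice must defeat two independent detectors, not just one.
    return (face_liveness(frame) >= FACE_THRESHOLD
            and voice_liveness(audio_window) >= VOICE_THRESHOLD)


print("live participant" if is_live(None, None) else "ALERT: possible deepfake")
```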

John Sendama, Partner

Mishcon de Reya

Thanks. There are a few other questions in the Q&A box, but we will pick those up offline as we are running out of time. Just to close out, could I ask each of you for one key takeaway for the audience in relation to defending against this type of threat? Joe, if I could start with you.

Joe Hancock, Partner (non-lawyer)

Mishcon de Reya

Yes, absolutely. My key takeaway is always that every minute you spend thinking about this in advance of it becoming an incident is critical. Even if you just go away and think for twenty minutes over a coffee: where would we see a deepfake problem? Would we notice it? What would we do about it? You'll be twenty minutes better planned than you were, and that pays off exponentially. If you can spend an hour on it, even better; two hours, better still. Everything you can do in advance, just thinking through these scenarios, improves outcomes not only for a deepfake-type incident but for all frauds and cyber incidents overall.

John Sendama, Partner

Mishcon de Reya

And Paul?

Paul Boskma, Senior Director

Mitek Systems

Sorry. Yeah, I think, sorry, can you repeat the question?

John Sendama, Partner

Mishcon de Reya

Sorry, just one key takeaway for the audience.

Paul Boskma, Senior Director

Mitek Systems

Sorry, excuse me, I was reading the chat. I think deepfakes are a kind of technology a human being is no longer capable of detecting, so you need to implement technology to support the human in seeing whether something is real or not.

John Sendama, Partner

Mishcon de Reya

Thanks very much. That's all we have time for. I'd just like to say a massive thank you to Joe, Paul and the other John, our fake James Libson, for sharing their expertise this afternoon. Thanks to all of the Mishcon team for their hard work in pulling this together, and thank you to the audience for joining us. As I mentioned earlier, a recording will be available to everybody who signed up, and we'll share our contact details for any follow-up queries and answer the questions dropped into the Q&A. Thank you very much.

In our latest Disputes Nightmare Scenario Flash Digital Session, John Sendama, Partner in our Fraud Department, Joe Hancock, Partner (non-lawyer) and the Head of Cyber Risk and Complex Investigations and Paul Boskma, Senior Director, Sales Engineering Europe at Mitek Systems, explore the rapid response needed to combat deepfake impersonation fraud, and the steps that can be taken to avoid it.
