Real vs AI – The Face Off
Deepfakes have become a powerful tool attackers can deploy at every stage of the cyber kill chain.
From reconnaissance to execution, modern adversaries can now generate convincing identities, clone leaders’ voices and imitate employees on video calls with precision.
With open-source tools and AI models available on platforms like Hugging Face and GitHub, creating weaponised deepfakes is accessible to anyone with basic skills.
In this live session, our cybersecurity experts break down how deepfakes are used to accelerate intrusions, bypass controls and amplify impact throughout the kill chain.
You’ll see how attackers use synthetic content at each stage to create a multilayered deception, with every step reinforced by real social content, trending topics and familiar voices that make the attack feel authentic.
Think you’d spot a deepfake? Most attendees at our last session couldn’t.
What you’ll learn:
- How deepfakes are deployed across the stages of the cyber kill chain, from reconnaissance and weaponisation to delivery, exploitation and exfiltration
- Realistic attacker scenarios showing how synthetic audio and video improve success rates
- Validation techniques your team can apply instantly, even without specialist tooling
Why this matters — right now
Scope: Scattered Spider impersonated a sysadmin employee, using sophisticated voice impersonation to defeat authentication, gain access, move laterally and execute a severe ransomware attack.
Impact: M&S have publicly stated the impact will be in the region of £300M, with weekly losses of £40M confirmed during the outage.
When one impersonated voice can cost hundreds of millions, recognising what’s real and what isn’t has never been more important.

