Have you seen the videos of Obama calling Trump a “total dipsh*t”, Mark Zuckerberg saying he has “total control of billions of people’s stolen data”, or Taylor Swift in a compromising position?
If so, you’ve seen a deepfake: fake video or imagery created with a form of artificial intelligence called deep learning, hence the name deepfake.
How are deepfakes made?
The process begins with collecting a large dataset of images or videos of the target, then training a deep learning model, typically a Generative Adversarial Network (GAN), to learn and replicate their facial features and expressions.
The algorithm is trained until it can convincingly superimpose these features onto another person’s face in a video. Finally, the resulting video is refined to smooth out imperfections, ensuring a realistic and seamless deepfake.
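To make the GAN idea concrete, here is a heavily simplified sketch in Python (PyTorch) of that adversarial training loop: a generator learns to produce images while a discriminator learns to tell them apart from real ones. The tiny networks, random stand-in data and parameter choices are illustrative assumptions only, not a working face-swap system.

```python
# Minimal, illustrative GAN training loop (PyTorch).
# Assumption: random tensors stand in for a real dataset of face images;
# real deepfake pipelines use far larger networks plus face-alignment and blending steps.
import torch
import torch.nn as nn

IMG_DIM, NOISE_DIM = 64 * 64, 100

generator = nn.Sequential(            # maps random noise -> fake "face" vector
    nn.Linear(NOISE_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)
discriminator = nn.Sequential(        # scores how "real" an image looks
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.rand(32, IMG_DIM)     # placeholder for a batch of real face images
    noise = torch.randn(32, NOISE_DIM)
    fake = generator(noise)

    # 1) Train the discriminator to separate real images from generated ones
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the discriminator
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

The two networks push against each other: as the discriminator gets better at spotting fakes, the generator is forced to produce ever more convincing output, which is why mature deepfakes look so realistic.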
On the detection side, Microsoft and Google lead the charge, training machine learning algorithms to spot patterns and anomalies in video and audio data so they can distinguish genuine content from deepfakes.
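Detection works the other way around: a classifier is trained on labelled genuine and fake clips to pick up the subtle artefacts deepfakes leave behind. Below is a minimal, hedged sketch of that idea in Python (PyTorch); the toy network and the randomly generated "frames" are placeholders for the large labelled datasets and deep models these companies actually use.

```python
# Minimal, illustrative deepfake-vs-genuine frame classifier (PyTorch).
# Assumption: random tensors stand in for labelled video frames.
import torch
import torch.nn as nn

classifier = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 1),                  # single logit: > 0 means "looks fake"
)

loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)

for step in range(200):
    frames = torch.rand(16, 3, 64, 64)             # placeholder video frames
    labels = torch.randint(0, 2, (16, 1)).float()  # 1 = deepfake, 0 = genuine
    loss = loss_fn(classifier(frames), labels)
    opt.zero_grad(); loss.backward(); opt.step()
```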
However, the tech is getting more sophisticated and easier to use. That’s bad news for politicians and celebrities, but what does it mean for you?
The threat to SMBs
While the celebrity video examples are very sophisticated, business owners are more likely to fall victim to cybercrime involving audio deepfakes.
The process is similar: the AI model is fed audio recordings of an individual speaking, allowing it to learn and recognise their unique speech patterns, tonality, and other distinctive vocal characteristics. It can then generate a synthetic replica of that person’s voice.
Then, a cybercriminal can make ‘you’ say anything.
That might be a phone call to someone in the accounts department authorising a large transfer. It could grant access to sensitive data through voice-activated security. Or it could be a controversial statement that gets leaked online or to the media.
The fallout can be catastrophic.
- Reputational damage. When a company’s reputation comes under fire, it erodes trust among customers, partners, and stakeholders. It could even negatively affect the share price.
- Financial loss. That’s not just the money lost if an employee is tricked into paying a criminal. Reputational damage also hits the bottom line, as do the added costs of beefed-up cybersecurity and insurance.
- Data breaches. The problem can spiral out of control if criminals gain access to sensitive corporate data: blackmail, ransom demands, and confidential client information leaked online are all possibilities.
What you can do to mitigate the risk
- Train your team. Deepfakes should become part of your team’s cyber training, along with phishing, ransomware attacks and social engineering.
- Add multi-factor authentication (MFA). Extra layers of security make things more challenging for hackers and reduce the risk of unauthorised access to information.
- Build a strict multi-factor process for money transfers. A quick phone call or voicemail should not be enough to move large amounts of money around. You might implement Time-based One-Time Passwords (TOTP) or a two-person approval system (see the sketch after this list). That said, the easiest and safest option is face-to-face confirmation.
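As a concrete example of how a TOTP check can back up a payment request, here is a minimal sketch in Python using the widely used pyotp library. The confirm_transfer function and the payment scenario are illustrative assumptions, not a prescribed workflow.

```python
# Minimal sketch: require a time-based one-time password (TOTP) before a transfer.
# Assumption: confirm_transfer() and the scenario are illustrative only.
import pyotp

# Shared secret, generated once and enrolled in the approver's authenticator app
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

def confirm_transfer(amount: float, code_from_caller: str) -> bool:
    """Only approve the transfer if the caller supplies a valid, current TOTP code."""
    if totp.verify(code_from_caller):
        print(f"Transfer of ${amount:,.2f} approved.")
        return True
    print("Code invalid or expired: transfer blocked, escalate to face-to-face confirmation.")
    return False

# A familiar voice on the phone alone is not enough; the caller must also read out
# the 6-digit code from the approver's authenticator app.
confirm_transfer(25000, totp.now())   # valid, current code -> approved
confirm_transfer(25000, "000000")     # wrong code -> blocked
```

The point is that a cloned voice cannot produce the rotating code, so the deepfake on its own is no longer enough to move money.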
While you might think it unlikely that someone will go to the trouble of creating a deepfake of you, the tech is getting more sophisticated and widespread amongst the cybercriminal community, so it pays to stay vigilant.
Synergy provides enterprise-grade security for small businesses at an affordable price. We can also undertake a Cybersecurity Assessment Report to give you peace of mind. Please contact us on 1300 796 796 to kickstart a chat about your risk profile.