
Protecting yourself from AI and deepfake scams
Artificial intelligence (AI) has opened new avenues for criminals to carry out fraud, especially through deepfakes. AI can help a scammer quickly produce convincing fake photos, text messages, and cloned voices. Many of the same scam tactics have existed for years, but scammers are using AI to make them more convincing and harder to recognize. They might steal the likeness of famous CEOs or politicians to promote bogus investments, or even mimic the voice of a friend or family member to ask for money.
In 2025, scammers created deepfake videos of well-known figures, including the Canadian prime minister, to promote a fake cryptocurrency investment. Many victims lost money to this scheme, including some New Brunswickers. Criminals took $16,000 from one person alone.
Types of Deepfakes and Generative AI
Text-Based AI Scams
Scammers can use AI to create personalized phishing emails and fake social media messages, or to impersonate someone else in a chat.
Voice Cloning Scams
Also known as "deepfake audio," voice cloning uses AI to copy someone’s voice. Scammers use voice cloning in vishing (voice phishing) calls, for example by calling someone and pretending to be a relative in distress or a bank official.
Deepfake Image and Video Scams
Deepfakes are AI-generated images or videos that swap or mimic someone’s likeness. This could be a fake profile picture or a video of someone appearing to say things they never did.
How are AI and deepfake scams used?
Below are some of the most common ways scammers are using AI right now:
Investment Scams
Swindlers use deepfakes to impersonate financial gurus or celebrities touting a “can’t-miss” investment. In these scams, the fake video or audio convinces people to invest in phony crypto coins, stock schemes, or “exclusive” deals. Once the money is sent, the crooks vanish, leaving the victim with nothing.
Scammers are also using AI to create false or misleading information that drives up demand for a stock, causing its price to rise. Once the stock price is inflated, they sell off their own shares for a profit. After they stop promoting the stock, its value usually drops sharply, leaving other investors with significant losses. This tactic is known as a "pump and dump" scheme.
Romance Scams
You might encounter someone on a dating app or social media who seems charming, but the photos or even video chats of them may be deepfakes. Scammers use AI face-swapping and voice effects to mimic real people. Using fake accounts, they build an online relationship over weeks or months. Eventually, they will ask for money or personal financial information.
Emergency Scams
Scammers can use AI to imitate the voice or video of a family member. Sometimes called the “grandparent scam,” the fraudster will call you using a voice clone of a close family member or friend and claim they’re in an emergency and need money immediately.
Celebrity Endorsement (Ad) Scams
If you see a video ad of a celebrity promoting a product or service that you’ve never heard them mention before, be skeptical. Scammers can make fake endorsement ads with the image and likeness of a famous person. Scam artists also use government officials in their deepfake videos to convince victims of investment opportunities or to convince them to give out sensitive information.
Identity Theft
AI deepfakes are also fueling new forms of identity theft. Criminals might use stolen personal data alongside deepfake images to pretend to be someone else and gain access to their money. Deepfake technology can fool facial recognition or video verification systems by presenting a realistic fake face on a live video chat or ID photo. Scammers can also record people’s voices (with or without consent) and then use voice AI to bypass security questions or voice-authentication systems.
How to Protect Yourself from AI and Deepfake Scams
Be cautious with unsolicited calls and messages
Don’t automatically trust caller ID – criminals can make numbers look familiar. Be cautious of unexpected phone or video calls, even from people you know; if something feels off, it’s okay to hang up and verify through another method. Having a family "code word" is also a smart idea – a question or password only your real family knows.
Don’t overshare online
Review your social media privacy settings. If you make every detail about your relationships, travels, and work public, you could give fraudsters the data they need to impersonate you or someone you know. Share wisely and consider limiting who can see your content.
Practice cyber safety
Educate yourself about common phishing tactics. AI might be used to make scams more convincing, but basic practices like not clicking suspicious links and not downloading unknown attachments still help protect your money.
Pause and check
Before using a trading platform, or working with an investment advisor, always check if they are registered – even if they are online. In New Brunswick, trading platforms are required to be registered with the Commission and investment advisors must be registered with the Canadian Securities Administrators (CSA). Contact the Commission if you have any concerns about unsolicited investment opportunities or about the legitimacy of an investment opportunity.
Stay informed and pause before reacting
Awareness is a powerful tool. Learn about the different types of scams in our Fraud and Scams Database.