ScamWatch

If you believe you're being scammed in the United States: Contact the Federal Trade Commission (FTC) at 1-877-382-4357 or report online at reportfraud.ftc.gov

Dating Deepfakes: How Scammers Use AI Photos to Build Trust and Ask for Money

[Photo: A couple shares a tender moment holding hands at an intimate café table.]

Introduction — The new face of online romance scams

Dating has always carried risk; what's changed is the toolkit scammers bring to it. Today, artificial intelligence can generate near-photorealistic images and synthetic videos that impersonate attractive, trustworthy people, and those images are being used to build fake romantic relationships that end in financial loss. Recent industry and security research shows a notable rise in AI-driven romance scams and growing concern about how difficult it is for ordinary users to tell real photos from AI-made ones.

This article explains how dating deepfakes are created and deployed, the psychological tactics scammers use, practical red flags and verification steps, and what to do if you — or someone you know — becomes a target.

How scammers create and use AI photos

Scammers assemble believable fake personas using a mix of techniques:

  • AI-generated portraits: Generative image models (GANs and diffusion models) produce realistic faces and photos that do not belong to real people.
  • Image mixing and face swapping: Scammers combine real photos with generated elements to avoid straightforward reverse-image detection.
  • Synthetic video and voice: Advanced deepfakes can create moving footage or mimic voices for fake video calls.
  • Profile amplification: Multiple fake accounts and staged social posts create the illusion of a network, making the profile feel legitimate.

Operators then groom targets by moving conversations off-platform, expressing rapid emotional attachment, and introducing urgent money requests, often framed as medical emergencies, travel costs, legal fees, or "investment opportunities" such as fake cryptocurrency schemes. Law-enforcement agencies and cybersecurity vendors have documented a surge in these AI-enabled schemes and have linked some operations to organized groups that run sophisticated scams at scale.

Spotting red flags and verifying profiles

Because both human judgment and simple checks can fail against modern deepfakes, use a combination of behavioral and technical checks. Research shows people commonly misidentify AI images as real, so rely on layered verification rather than any single clue.

Quick red flags

  • Refusal or excuses to have a live video call (constant "camera problems").
  • Pressure to move off the dating platform quickly (private email, messaging apps, or cryptocurrency sites).
  • Fast declarations of love and requests for money or gifts.
  • Profiles with very few friends/followers or inconsistent social footprints.
  • Images that look "too perfect" or show visual anomalies (odd backgrounds, mismatched reflections, blurred edges).

Practical verification steps

  1. Do a reverse image search on profile photos (Google, TinEye) to find duplicates or stock-photo matches.
  2. Ask for a short live video or a time-bound selfie (e.g., "hold up today's newspaper or make a specific gesture").
  3. Check social profiles for history: consistent posts, tagged friends, and plausible timelines.
  4. Use platform safety tools (report suspicious accounts, request verification, and use built-in blocking features).
  5. If financial or investment requests arrive, treat them as scams: don’t send money, and verify independently with trusted third parties.

Romance and confidence fraud continues to cause heavy losses worldwide. FBI and IC3 data show these schemes remain costly and widespread, often involving wire transfers or cryptocurrency payments that are hard to recover. Reporting helps build cases against the networks that exploit victims.

When deepfakes are used by organized crime

Cross-border police collaborations have uncovered large-scale operations in which networks use deepfake photos and polished scripts to groom many victims at once, then funnel funds through complex channels. Staying informed and reporting suspicious activity supports these law-enforcement investigations.

If you suspect a profile is a deepfake

  • Stop communication immediately.
  • Do not send money, gifts, or personal identity documents.
  • Preserve evidence: save messages, screenshots, profile URLs, and payment records.
  • Report the account to the dating platform and to local law enforcement, or to the FBI's Internet Crime Complaint Center (IC3) if you are in the U.S.