ScamWatch

If you feel you're being scammed in the United States: Contact the Federal Trade Commission (FTC) at 1-877-382-4357 or report online at reportfraud.ftc.gov

Romance Scams Go AI: Generated Photos, Scripted Grooming and Crypto Requests

Introduction — Why AI Changes the Romance-Scam Landscape

Romance scammers have long relied on fake profiles and persuasive stories. Today, readily available generative AI — from image generators to large language models and voice‑cloning tools — lets fraudsters produce believable photos, personalised messages and synthetic voice or video interactions at scale. Law enforcement and advisory agencies warn that AI is increasing both the volume and the plausibility of romance fraud, and that many scams now transition victims into irreversible cryptocurrency payments.

This article explains the new AI‑enabled tactics, outlines practical steps platforms should take to reduce harm, and gives clear guidance users can apply immediately to protect themselves and loved ones.

How AI Is Being Used in Modern Romance Scams

Scammers combine several automated tools to build trust quickly and then extract money or crypto. Common patterns include:

  • AI‑generated profile photos: High‑quality, flattering portraits created by image generators or stitched from multiple real images to avoid reverse‑image matches.
  • Scripted grooming using LLMs: Chat and message templates produced or refined by large language models that emulate intimacy, shared interests and emotional escalation.
  • Voice and video deepfakes: Short personalised voice messages or staged video dates that appear to speak directly to the victim.
  • Conversion to crypto: After trust is built, victims are encouraged to "invest" in crypto or send cryptocurrency to accounts that are effectively unrecoverable.

Industry monitoring and blockchain investigators report steep increases in AI‑assisted crypto and romance fraud; analysts say these operations can scale rapidly because AI reduces the cost of producing convincing personas and multi‑language scripts. Law enforcement takedowns continue, but tracking funds and recovering cryptocurrency remains difficult.

What Platforms and Marketplaces Should Do Now

Dating apps, social platforms and marketplaces must treat AI‑enabled romance scams as a platform safety priority. Recommended measures include:

  • Stronger onboarding checks: Use multi‑signal identity verification (passkeys, government ID checks where lawful, liveness/photo challenge, behavioral signals) and ensure verification flows are transparent to users.
  • Image provenance and similarity detection: Deploy image‑forensics and reverse‑search supplemented with AI detectors for synthetic imagery; mark high‑risk accounts for review.
  • Behavioral and content signals: Rate‑limit new accounts, flag accounts that use repeated scripted responses or rapidly escalate to money/crypto topics, and monitor cross‑account messaging patterns that indicate organized grooming.
  • Payment friction and warnings: Intercept and warn users when conversation topics shift to investments, crypto, or requests for irreversible payments; provide pre‑populated reporting buttons and delays before allowing external payment links.
  • Partner with crypto tracing firms and law enforcement: Build rapid‑response paths to share indicators of compromise and suspected wallet addresses; combine takedown and tracing rather than relying solely on account suspension.
  • Transparency & user education: Surface clear, persistent guidance on the profile page about common scam tactics, and prominently explain how to report suspicious profiles.
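
To make the "behavioral and content signals" and "payment friction" ideas above concrete, here is a minimal sketch of a keyword-based conversation risk scorer. The term list, weights, and threshold are invented for this example; a production system would combine many more signals (account age, message velocity, cross-account patterns) and human review.

```python
# Illustrative sketch only: flag conversations that pivot toward money,
# crypto, or urgency. All terms, weights, and thresholds are hypothetical.

RISK_TERMS = {
    "crypto": 3, "bitcoin": 3, "wallet": 3, "invest": 2,
    "guaranteed": 2, "wire": 2, "gift card": 2, "urgent": 1,
}

def score_message(text: str) -> int:
    """Return a simple additive risk score for one message."""
    lowered = text.lower()
    return sum(weight for term, weight in RISK_TERMS.items() if term in lowered)

def should_warn(messages: list[str], threshold: int = 4) -> bool:
    """Show an in-app warning once the cumulative score crosses a threshold."""
    return sum(score_message(m) for m in messages) >= threshold
```

In practice a platform would trigger the warning interstitial (and a delay before external payment links) when `should_warn` fires, rather than blocking the conversation outright.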

These steps reflect evolving law‑enforcement guidance and multi‑stakeholder recommendations to stem AI‑assisted fraud operations. Platforms that invest in combined technical detection, human review and industry collaboration are most effective at disrupting these crime chains.

Practical Steps Users Can Take Right Now

If you use dating apps or social media, adopt these concrete habits:

  • Take a beat: If someone you met online asks for money or a "sure" investment, pause and verify offline. Rapid pressure and urgent emotional stories are red flags.
  • Verify identity: Ask for a live video call and request a specific on‑camera gesture or a selfie holding today's newspaper (or similar); treat refusal or repeated delays as a red flag, and remember that short pre‑recorded clips can themselves be deepfaked.
  • Reverse‑image search: Use search engines or a reverse‑image tool to check whether photos are stolen or appear elsewhere under other names; AI‑generated faces often show telltale inconsistencies (warped backgrounds, mismatched accessories) and may return no matches at all.
  • Guard payments: Never send funds via gift cards, P2P apps, or crypto to someone you haven't met in person. Cryptocurrency transactions are typically irreversible and commonly used in these scams.
  • Report quickly: Report suspicious accounts to the platform and to national authorities (for U.S. residents, file a complaint with the FBI IC3 and the FTC). Keep copies of messages, transaction details and wallet addresses.
  • Seek support: Romance fraud causes emotional harm. Contact family, consumer hotlines or victim‑support services; consider freezing credit if you shared financial details.
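
One narrow, concrete metadata check a technically inclined reader can run on a saved photo is to look for an EXIF segment in the JPEG file. This is only a weak signal, not proof: many legitimate platforms strip metadata on upload, and metadata can be forged. The sketch below uses only the Python standard library; the function name is ours.

```python
# Illustrative sketch only: walk a JPEG's marker segments and report
# whether an APP1/Exif metadata block is present. Absence of EXIF is a
# weak hint at best -- social platforms routinely strip metadata.

def has_exif(data: bytes) -> bool:
    """Return True if the JPEG bytes contain an APP1 Exif segment."""
    if not data.startswith(b"\xff\xd8"):       # not a JPEG (no SOI marker)
        return False
    i = 2
    while i + 4 <= len(data) and data[i] == 0xFF:
        marker = data[i + 1]
        if marker == 0xDA:                     # start of scan: metadata is over
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker == 0xE1 and data[i + 4:i + 10].startswith(b"Exif\x00"):
            return True
        i += 2 + length                        # skip to the next segment
    return False
```

For example, `has_exif(open("photo.jpg", "rb").read())` returning False on a photo someone claims to have "just taken" is worth a question, but never a verdict on its own.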

Prompt reporting not only helps you — it provides critical data to platforms and investigators that may disrupt broader criminal networks.