Bumble has added a new feature that lets users report profiles with AI-generated photos and videos.
Now, when reporting a profile, users can choose “Fake profile” and then select “Using AI-generated photos or videos.”
This option is in addition to existing categories like inappropriate content, underage users, scams, and the use of someone else’s photos.
AI-generated photos are common on dating apps and are often used to deceive or scam users.
Risa Stein, Bumble’s Vice President of Product, stressed the importance of removing misleading or dangerous elements to keep the platform safe and trustworthy.
In February, Bumble launched an AI tool called “Deception Detector,” which the company says has reduced member reports of spam, scams, and fake profiles by 45%.
Bumble also uses an AI-powered “Private Detector” tool that automatically blurs nude photos.
Looking ahead, Bumble’s founder, Whitney Wolfe Herd, has envisioned AI “dating concierges” that could go on dates on users’ behalf to help find the perfect match.