Ads for apps that use AI to create fake videos of people kissing anyone users want are flooding social media platforms like TikTok and Instagram.
By Rashi Shrivastava, Forbes Staff
Meta and TikTok have run thousands of ads for apps that use AI to generate fake kissing videos, letting users upload photos of any two people and have AI turn them into a video of the pair kissing. The apps are marketed as tools that let you instantly “kiss anyone you want,” no consent required.
Similar in concept to “AI nudifier” apps that produce nonconsensual deepfake pornography, these AI kissing apps create believable videos of people doing something they never did. And the ease with which they do it risks habituating people to deepfake imagery.
While the ads are not sexually explicit like the deluge of AI-generated pornographic content that has engulfed social media platforms like Instagram, Reddit and YouTube, they can be equally dangerous, Haley McNamara, an executive at the National Center on Sexual Exploitation, told Forbes.
“It does not have to be explicit to be exploitative,” McNamara said. “If it’s crossing boundaries to do something offline to someone without their consent, kissing, undressing, et cetera, then it’s also crossing boundaries to do that online.”
Meta has displayed over 2,500 ads for “AI kissing” apps across Instagram and Facebook, a Forbes review found. About 1,000 are currently active. TikTok has shown about 1,000 ads to millions of users in European countries, according to its ad library. (TikTok’s ad library doesn’t include ads shown to its U.S.-based users.) Most of these ads depict celebrities like Scarlett Johansson, Emma Watson and Gal Gadot kissing one another. Others show videos of random people kissing, touting that AI could let you “kiss your ex” and “kiss your crush.” It’s unclear whether the people in the videos are real or generated by AI. Johansson, Watson and Gadot did not respond to requests for comment. Futurism first reported the spread of these ads across social media platforms.
Meta is also promoting “AI hugging” apps, ads for which show AI-generated videos of children hugging cartoon characters like Dora the Explorer, Mickey Mouse and Tom and Jerry. Some, like a video showing a young girl hugging an older man, promise parents that AI-powered apps could allow their children to “hug grandparents they never met.” The social media giant has run about 1,200 AI hugging app ads, a Forbes search found. Over 300 are still live.
AI-generated videos of people kissing and hugging are already circulating across social media. A video depicting Taylor Swift hugging Kim Jong Un has about 30 million views on Instagram. In late December, a deepfake video of Elon Musk and Italian Prime Minister Giorgia Meloni kissing went viral on X.
Meta spokesperson Daniel Roberts told Forbes that these “AI kissing” ads do not violate the company’s policies. While nudity and sexually explicit or suggestive content are against Meta’s advertising standards, and Meta does not permit ads that “display, advocate for, or coordinate sexual acts with non-consenting parties,” videos of kissing and hugging are permitted.
After being contacted by Forbes, TikTok removed the ads for violating its policies. The video sharing platform requires advertisers to get consent from public or private figures represented in their ads, even if the ads are AI-generated, TikTok spokesperson Ariane de Selliers told Forbes.
The companies behind these video generators appear to be based outside the United States, in countries like the United Arab Emirates, Italy and China, according to their websites. The apps are available for free on Apple’s App Store and Google’s Play Store and already have millions of downloads. The “AI kissing” feature is part of a broader suite of AI-based photo editing capabilities across the apps, like touching up old photos, turning still images into videos and predicting what two people’s future babies would look like. Apple and Google did not respond to requests for comment.
The spread of AI kissing apps, boosted by social media’s virality, illustrates a troubling mainstreaming of deepfakes in the age of generative AI. Use of these seemingly harmless apps could open the door to tools that create more graphic imagery, like deepfake porn and other types of image-based sexual abuse, McNamara said. “It’s just an absolute Pandora’s box,” she said.
“This trend is normalizing exploitative deepfakes and taking nonconsensual participation in intimate or sexualized images as a joke,” she said. “It’s the kind of thing that is easy to trivialize when you think about other people, but again, if you think about someone in your own sphere making images like this of yourself, someone who you wouldn’t want having images, I think everyone can recognize that that is a violation.”
That’s especially concerning at a time when illegal AI-generated child sexual abuse material (CSAM) is rising at an unprecedented rate. Over the last two years, the National Center for Missing and Exploited Children (NCMEC) has received over 7,000 reports of generative AI-related child exploitation material. In one case, a pedophile allegedly filmed children at Disneyland and used the popular AI tool Stable Diffusion to produce thousands of illegal images of them. And with unrestricted access to AI image generators, high school students in multiple instances have created deepfake nude imagery of their underage classmates, some resulting in criminal charges. Meta has also struggled to police ads for such AI “nudifying” sites, one of which gets 90 percent of its traffic from Instagram and Facebook, according to the Faked Up newsletter.
People who have encountered the AI kissing ads while scrolling through social media say they find them disturbing. In December, Alice Siregar started seeing an uptick in ads for AI kissing apps on TikTok. Herself an AI analyst at a tech consulting firm, she was dismayed to see the technology being used in a “deeply unethical” way.
“It was incredibly creepy to encounter,” she said.