These Men Allegedly Profit Off Teaching People How to Make AI Porn

A little over a year ago, MG was leading the relatively normal life of a twentysomething in Scottsdale, Arizona. She worked as a personal assistant and supplemented her income by waiting tables on the weekends. Like most women her age, she had an Instagram account, where she’d occasionally post Stories and photos of herself getting matcha and hanging out by the pool with her friends, or going to Pilates.

“I never really cared to pop off and become popular on social media,” says MG (who is cited only as MG in the lawsuit to protect her identity). “I just used it the way most people did when it first came out, to share their lives with the people closest to them.” She has a little more than 9,000 followers—a robust following, but nowhere close to a massive platform.

Last summer, she received a DM from one of her followers. Did she know, the person asked her, that photos and videos of a woman who looked exactly like MG were circulating on Instagram? MG clicked the link and saw multiple Reels of what appeared to be her face superimposed onto a body that looked exactly like her own. The woman in the photo was scantily clad, with tattoos in the same places as MG.

MG was horrified. “If you didn’t know me well, you could very well think they were images of me,” she says. “It was kind of like this reality check that I don’t have any control over my own image.”

She was even more appalled when she discovered that not only were doctored nude or scantily clad photos of her being circulated on the internet, as she outlined in a recently filed complaint—they were also being used to advertise AI ModelForge, a platform that teaches men how to generate their own AI influencers. In a series of online classes and tutorials, the platform’s operators allegedly taught subscribers to use a software called CreatorCore to train AI models on photos of unsuspecting young women, then post the resulting content on Instagram and TikTok.

“They provided a whole playbook, including instructions on how to pick the right person so that it’s not someone who can defend themselves, so they all had instructions on what type of women to use and where to get their pictures,” she claims. “It was disgusting on every single level.”

MG is one of three plaintiffs in a lawsuit filed in January in Arizona against three Phoenix men: Jackson Webb, Lucas Webb, and Beau Schultz, as well as 50 other John Does. The lawsuit alleges that the Webbs and Schultz scoured the internet for photos of unsuspecting young women, then used AI to generate photos and videos of fictional models who look exactly like them, selling such content on the subscription platform Fanvue.

The suit further alleges that for $24.95 a month on the platform Whop, the men sold online courses training other men, including the John Does named in the suit, how to make their own AI-generated influencers based on real women’s photos. The men allegedly created “Blueprints” for scraping images from women’s social media accounts and feeding them into the generative AI model on CreatorCore, as well as into a separate app that would remove the women’s clothes and generate sexually explicit images and videos. Such content, the suit claims, drew millions of views, reportedly earning more than $50,000 in income in one month. (The Webbs and Schultz did not respond to requests for comment.)

This moneymaking scheme, the complaint alleges, preyed on a “harem of indistinguishable AI copies of unsuspecting women and girls,” while instructing “predators seeking to prey on” women on social media. According to the suit, in 2025 the CreatorCore platform had more than 8,000 subscribers generating their own AI influencers, resulting in more than 500,000 images and videos.

AI ModelForge is one of many burgeoning companies seemingly looking to capitalize on the widespread use of artificial intelligence by teaching men how to create their own “AI influencers” as a side hustle of sorts. On platforms like X, self-styled entrepreneurs boast about their own patented methods for earning hundreds of thousands of dollars off AI models, luring in young tech-savvy men looking to earn a quick buck.

“The prevalence of this has been shocking to me,” says Nick Brand, who, with attorney Cristina Perez Hasano, is representing MG and the other two plaintiffs. The young men the lawsuit alleges are behind AI ModelForge are “targeting normal, everyday folks that have average social media profiles and social media followings.” One of the more insidious elements of this particular case, he says, is the use of the women’s images to teach other men how to find victims. According to the complaint, the defendants encouraged subscribers to target women with fewer than 50,000 followers to avoid “legal issues.”

“These boys aren’t just using generative AI to disrobe women—they’re selling the ability to do so to other men and boys, who are then going to use other women’s images to do the same thing,” Brand contends. MG and the other two plaintiffs, he claims, are “the face of a product that is harming other women. It’s like making somebody the face of ICE who has had their parents deported. It’s horrifying.”

Technically, there is a federal law preventing the proliferation of nonconsensual AI-generated porn. The Take It Down Act, which President Trump signed into law in May 2025, makes publishing nonconsensual sexualized AI-generated content illegal and requires platforms to remove such content within 48 hours of it being flagged. And most US states, including Arizona, have passed laws banning so-called “deepfake” porn. But the Act’s platform takedown requirement does not take effect until May 2026, and state laws tend to be “reactive rather than proactive,” says Arizona state representative Nick Kupper.

Earlier this year, Kupper introduced a bill in the Arizona legislature requiring websites to use automated detection tools, such as age verification or consent forms, to prevent nonconsensual AI content from being uploaded. “Once something’s online, it’s pretty much there forever, even though victims spend millions of dollars trying to take it down,” he says. “It’s like whack-a-mole—you hit one, another one pops up.”

Currently, if you visit the Linktree page for AI ModelForge, it directs you to what appears to be the same business rebranded as “TaviraLabs,” a Telegram group with more than 18,000 members that advertises itself as “the #1 AI Influencer coaching community.” Additionally, the suit names more than a dozen Instagram accounts used by the defendants to promote AI ModelForge, most of which are still active. The suit details how such accounts continue to post photos of nubile women, fast cars, and expensive watches, writing captions such as, “She’s not my girlfriend, she’s my best paid employee” and “POV: You built her in 20 minutes and she made you $13.2k in the first 45 days.”

Even though MG and the other plaintiffs have continually lobbied Instagram to take their images down, many of the images are still up, she claims, because they do not technically violate Instagram’s guidelines surrounding AI-generated content. When reached for comment, a spokesperson for Instagram said it had “extremely strict policies” around both AI- and non-AI-generated nonconsensual intimate imagery, removing accounts that post such content. When provided with a list of a dozen or so accounts thought to be associated with AI ModelForge, the spokesperson said the accounts were under review.

The suit also cites a number of TikTok accounts promoting the men’s business. When reached for comment, a TikTok spokesperson said the accounts were found to violate community guidelines and have been taken down.

MG says the images generated by AI ModelForge are distinct enough from her own photos that, frustratingly, she has been unable to claim the accounts are impersonating her, which would also violate Instagram’s guidelines. “It’s my face, my tattoos, on a different outfit on a slightly different body,” she says. “These are real women being transformed, not just a random AI-generated person.”

Though MG lives in constant fear of people in her life seeing the pornographic AI-generated images of her, she says filing suit has given her some of her agency back. “We were put in this place where our backs were against the wall and I want other women to know you can’t stop living your life,” she says.

Still, what happened to MG, a woman with fewer than 10,000 followers, has daunting implications for virtually anyone with a remotely public online presence.

“It’s not about being cautious with your image online because everyone posts on social media now,” she says. “Everyone is on LinkedIn. Everyone is on Instagram. And I want people to realize that this could also happen to them.”