Sexy MAGA internet celebrity supports Trump! The real face behind her is an AI created by an Indian man, earning an estimated several thousand dollars a month

An Indian medical student used AI to build a sexy MAGA influencer targeting conservative American men, combining political and erotic content to harvest traffic and earning several thousand dollars a month. Experts worry that this kind of virtual influencer could proliferate and become a tool for information warfare, triggering a crisis of trust.

Sexy MAGA influencer Emily Hart is actually AI

Sexy influencer Emily Hart often shares beautiful lifestyle photos on social media. A loyal Trump MAGA fan, she opposes abortion, "woke culture," and immigration. But her true identity turns out to be an AI created by a man.

Under the pseudonym Sam, a 22-year-old Indian medical student recently told the media outlet Wired that, to raise money for medical licensing exam fees and an eventual move to the United States, he used AI tools to create Emily Hart. Spending only 30 to 50 minutes per day managing the social media account, he gets each short video 3 million to 10 million views.

In just one month, Emily Hart's Instagram account accumulated more than 10,000 followers. Fans even pay to subscribe to her adult content on Fanvue, an OnlyFans competitor, or buy clothing printed with political slogans.

Sam estimates that this model could easily earn him several thousand dollars a month. The good times did not last, however: in February of this year, Emily Hart's IG account was banned, though her Facebook account remains active.

Image source: The Independent UK. Sexy influencer Emily Hart supports Trump, but she's actually AI

Business strategy of MAGA AI girls

Emily Hart's success stems largely from Sam following the advice of AI tools: he targets older, conservative American men, who have higher disposable income and stronger loyalty, as the primary audience, and he sticks to a pro-Trump "Make America Great Again" (MAGA) line.

These AI-generated girls follow a set operating template. They are usually blonde white women, often cast in first-responder roles such as nurse, police officer, or firefighter. They wear bikinis printed with American flags and post far-right remarks supporting gun ownership, opposing abortion, or opposing immigration.

Sam revealed that because social media algorithms favor controversial content, these posts not only attract conservative supporters but also draw liberals in to leave critical comments, greatly boosting engagement.

This is an attention-harvesting strategy that combines patriotism with soft pornography: creators attract attention through political fervor, and ultimately funnel followers to paid platforms to monetize.

However, because the well-known adult platform OnlyFans strictly requires creators to be real humans, these AI creators typically direct fans to the Fanvue platform, which accepts AI-generated content.

From traffic monetization to information warfare: a flood of virtual influencers raises concerns

Before Wired reported on Emily Hart, the Washington Post had covered an AI virtual female soldier named Jessica Foster in March. Jessica Foster had appeared in photos with Trump and Russian President Putin, and her account attracted more than 1 million followers within four months.

Image source: Jessica Foster / AI virtual influencer. Jessica Foster’s account attracted more than 1 million followers within 4 months

Although Jessica Foster’s IG account has been banned, these MAGA AI girls are still raising concerns among experts.

Brookings Institution researcher Valerie Wirtschafter said that many fans don't even care whether these influencers are real; they only care that the content matches their political identity. Boston University assistant professor Joan Donovan warned that such accounts are easy to set up and carry clear profit incentives.

Ultimately, the biggest risk of these AI accounts is that they could be repurposed as tools for information warfare: bot-style networks spreading political propaganda and misinformation, bringing an unprecedented crisis of trust and new social problems to online communities.

Further reading:
AI images of a trash-strewn Tokyo Dome after the Classic spread online; the rumor-spreaders have already been flagged as foreign-influence accounts

Viral posts mislead Taiwanese media into falsely reporting that the photographer climbing Taipei 101 was Jin Guowei; in the AI era, media literacy faces new challenges
