The account drew conservative followers, sold merchandise and adult content, and was later banned by Instagram.
NEW YORK, N.Y. — An AI-generated Instagram personality called Emily Hart, which drew conservative followers with pro-Trump posts, rifle-range photos and glamour shots, was unmasked this week as the work of a 22-year-old medical student in India who said he built the account to make money online.
The disclosure matters beyond one fake profile because it shows how cheap AI tools, political branding and platform incentives can blend into a profitable online business. In reporting published April 21, Hart emerged as one of several synthetic right-wing influencer accounts circulating on major social platforms. Instagram had already banned Hart’s main account in February for activity the platform labeled fraudulent, but related pages and lookalike accounts still illustrated how uneven AI labeling, identity checks and enforcement can be when fake personas are built to chase attention and cash.
Sam, who asked to be identified by a pseudonym, said he was short of cash while studying medicine in northern India and still hoped to move to the United States after graduation. He first tried other online side jobs, including short videos and the sale of study notes, before turning to AI-generated glamour pictures. Generic model posts did little, so he asked Google’s Gemini chatbot how to make the persona stand out. According to the account he later gave reporters, the chatbot suggested a conservative niche because the audience could be loyal and willing to spend. In January he created Emily Hart, describing her as a nurse and a Jennifer Lawrence look-alike. The feed mixed bikinis, beer, flags, guns and blunt slogans about Christianity, abortion, immigration and liberals. Sam said the page soon “blew up,” with some Reels drawing millions of views.
Within about a month, he said, the Instagram account had topped 10,000 followers. Sam said he could not make money directly from Instagram, so he pushed the character into side businesses. He sold MAGA-themed T-shirts and used Fanvue, a platform that permits AI-generated creators, to offer sexualized images and direct messages under Hart’s name. He said the work took only “30 to 50 minutes” a day and brought in a few thousand dollars a month, a sum he described as large by student standards in India. Those income claims, like the biggest view counts he cited, were not independently verified in the available reporting. What does appear consistent across the account’s rise is the business model. Hart was built as a commercial product first and a political persona second, with outrage serving as part of the sales funnel and each viral argument widening the reach of the brand.
The story also drew attention because it landed in a debate over disclosure rules that already existed on paper. Meta said in 2024 that it would begin applying “Made with AI” labels to organic AI-generated images, audio and video when it detects industry-standard signals or when users disclose them. The company also said it may add stronger labels when altered media risks materially deceiving the public and that it will still remove content that violates other rules. Hart’s posts carried no AI label before Instagram took the page down for what the platform called “fraudulent” activity. A Google spokesperson said Gemini is designed to provide “neutral responses” rather than favor a political viewpoint. Fanvue’s published rules say AI-generated media is allowed, but it must be clearly disclosed and cannot be misleading, deceptive or used to impersonate others.
Experts say the Hart case fits a pattern rather than a one-off trick. Valerie Wirtschafter, a Brookings Institution fellow who studies emerging technology and democracy, said AI has made fake profiles “more believable” and easier to scale. The political styling also matters. Young conservative women attract outsize attention online because they are scarcer than young conservative men and because outrage-heavy posts travel fast, whether supporters or critics boost them. A Washington Post investigation in March described a similar case involving Jessica Foster, another blonde pro-Trump persona presented as a U.S. Army service member. That account gathered more than 1 million followers in just over four months before scrutiny intensified and the page was removed. Together, the cases suggest synthetic identities can succeed even when visual glitches, impossible life details and platform-rule gaps are visible in plain sight.
No public criminal charges or civil claims connected to Hart had been identified in available reporting by Wednesday, and the known record remains mostly a media account built from interviews, platform statements and archived social posts. That leaves several questions unresolved. It is not publicly known how many paying subscribers Hart drew across platforms, how much money was actually collected, whether followers tried to recover payments, or whether any agency in the United States or India has examined the operation. The procedural picture on the platforms is clearer. Instagram removed the main account in February. A Facebook page tied to Hart was still reported to be active at the time the story broke. Fanvue’s current guidelines warn that undisclosed AI content, deceptive synthetic media and impersonation can lead to removal, suspension or permanent bans. The next concrete step is more likely to come from platform enforcement than a courtroom unless a regulator, payment processor or complainant turns the episode into a formal case.
Part of Hart’s power came from the bluntness of the performance. The feed presented an instantly recognizable character for both the algorithm and the audience it sought: a blonde nurse who loved Christ, guns, cold beer, ice fishing and Donald Trump. The pictures were tailored for quick scrolling, with flag prints, low necklines, rifle ranges and cabin backdrops arranged beside culture-war captions meant to spark either loyalty or anger. Sam said he also tried building a left-leaning counterpart and found that version did not catch on. His explanation was blunt and insulting: he said the conservative audience was “super dumb.” That line helped the story spread even faster after Hart was exposed, but the deeper point may be simpler. Some users no longer treat authenticity as the first test. If an image feels right and the message flatters their beliefs, they may click, comment, subscribe and pay before they ask whether the person on the screen exists at all.
As of April 22, Hart’s central Instagram account remained down and the student behind it said he was shifting back to his medical studies. The next milestone will be whether Meta, Fanvue or regulators explain how similar AI political personas keep reaching large audiences before disclosure or enforcement catches up.
Author note: Last updated April 22, 2026.