Technology

Fake Instagram Influencers Built with AI Are Fooling Millions—And Raising Questions About Authenticity

AI-generated Instagram influencers with thousands of followers are faking appearances at real events and spreading false information, while operating in organized networks to monetize their synthetic personas.

Martin Holloway · Published 2w ago · 5 min read · Based on 8 sources

Two AI-generated Instagram influencers named Santos Walker and Caleb Ellis posted photos claiming to attend the premiere of The Devil Wears Prada 2. The images, including a "Get Ready With Me" video montage posted on April 23, were entirely fictional—neither the influencers nor the photos were real. The stunt was created without any involvement from 20th Century Studios, drawing swift criticism across social media for spreading false information about a major entertainment event.

This incident highlights a larger pattern emerging on Instagram: artificial influencers are quietly accumulating substantial followings. Jae Young Joon, created by a Canadian named Luc Thierry, has amassed over 320,000 followers. Romeo DeSouza, another AI-generated influencer based on Dutch-Brazilian aesthetics, maintains 56,000 followers and apparently runs a coordination channel—essentially a private chat group—where the creators of these AI influencers share techniques and discuss strategy.

How Platforms and Users Are Responding

The fake red carpet photos sparked backlash from users who felt misled by content pretending to be real. Meta, Instagram's parent company, has started responding to complaints about AI-generated accounts, though the company has not publicly detailed exactly how it is enforcing rules against them.

The criticism goes deeper than one stunt. Users across social media are reporting what they call an "epidemic" of fabricated profiles featuring attractive men who claim to be gay, often highlighting sobriety or recovery. Many of these accounts reuse content from legitimate creators without asking permission—essentially building a counterfeit influence economy on top of real human effort.

This development concerns human influencers. AI-generated creators can post on a consistent schedule and maintain a perfectly curated look without the real-world constraints that affect actual people: travel, illness, burnout, or simply running out of ideas. That operational advantage could shift brand partnerships—and the money that comes with them—away from human creators to machines.

The Surprising Audience Mismatch

One quirk in how these AI influencers are spreading is worth examining. Jae Young Joon's content is aimed at gay male audiences, yet most of his followers are women. This mismatch suggests either Instagram's algorithm is pushing the account to unexpected audiences, or the content appeals across demographics in ways the creator may not have anticipated. The mechanics behind this pattern are unclear from public information.

Romeo DeSouza's account takes a different approach: the Instagram bio explicitly states it is an AI creation, offering more transparency than many synthetic influencers. Whether this disclosure reflects the creator's own choice or pressure from the platform is unclear, though Instagram's policies around labeling AI-generated content are still taking shape.

Behind the scenes, creators of these AI influencers have built networks to share techniques and discuss how to monetize their work. This infrastructure—private channels, coordination among operators—suggests the space has moved beyond hobbyist experiments into something more like a commercial operation.

The Money Question and Broader Pattern

These AI accounts are making money, though exactly how much and through what methods remains mostly hidden. The revenue likely comes from the standard ways influencers earn: sponsored posts, affiliate marketing (where they get a cut of sales they drive), and subscriptions. The twist is that AI-generated content eliminates many of the costs and delays involved in human content creation.

The pattern here resembles something we have seen before. When YouTube first allowed creators to earn money from videos in 2007, with the launch of its Partner Program, it sparked similar tensions. Established media outlets worried that anyone with a camera could now compete for audience and advertising dollars. The questions were the same: who counts as legitimate, and is the competition fair? The difference now is that we are not just giving regular people powerful tools—we are potentially removing human creators from the competition entirely.

Within the gay community and among body image advocates, concerns have surfaced about these AI influencers promoting impossible physiques. Santos and Caleb drew criticism for their exaggerated muscularity. Other AI influencers consistently present unrealistic body types, and the concern is real: followers who believe they are watching real people may internalize these images as standards they should match. Luc Thierry, who created Jae Young Joon, has publicly acknowledged this concern and expressed understanding of why human influencers and community members feel troubled by it.

The Political Dimension and Bigger Picture

The AI influencer phenomenon is not confined to lifestyle and entertainment. Reports indicate that thousands of AI-generated accounts promoting political messages—specifically pro-Trump content—are spreading across social media platforms. This suggests synthetic influence campaigns are expanding into areas where false information carries higher stakes.

That political angle introduces fresh challenges. Platform policies were written for human-operated accounts posting human-made content. Those rules leave gaps that AI influencers can slip through. The red carpet fabrication illustrates the point sharply: when synthetic influencers can invent attendance at real-world events, the potential for misinformation jumps beyond personal authenticity into false claims about actual business relationships and real-world occurrences.

The speed at which AI influencers have amassed audiences and begun monetizing is worth noting. It suggests that the barrier to entry for creating sophisticated AI personas has dropped significantly. Tools that once required specialized technical knowledge are now accessible to non-experts. The financial and social rewards for deploying them continue to grow. This combination typically accelerates adoption.

The story of Santos Walker and Caleb Ellis may ultimately be less important than what they represent: a growing infrastructure for AI-generated influence that operates in coordination networks, accumulates real money, and has begun moving beyond entertainment into political content. Platforms, creators, and audiences are all grappling with a basic question that technology keeps raising in new forms: how do you know what you are actually seeing, and who is responsible for telling you the truth?
