The Truth Behind Iran Slopaganda and Those AI Lego Videos

Propaganda used to require a massive budget, a film crew, and a distribution network. Now, it just needs a prompt and a decent GPU. Recently, social media feeds have been flooded with bizarre AI-generated videos showing scenes of conflict, martyrdom, and military might—all rendered in the style of Lego bricks. It’s being called "slopaganda." The term perfectly captures the intersection of low-effort AI "slop" and high-stakes political messaging.

If you’ve seen these clips, you’ve probably noticed the unsettling contrast. You have the playful, nostalgic aesthetic of childhood building blocks mashed up with grim depictions of Iranian regional influence or tributes to fallen commanders like Qasem Soleimani. It’s weird. It’s jarring. And it’s incredibly easy to make.

I spent time trying to replicate these specific visuals to see just how low the barrier to entry has become. It turns out, you don’t need to be a technical genius or a state-sponsored hacker to churn out this stuff. Anyone with twenty minutes and a basic subscription to a video generator can fill the internet with hyper-partisan plastic bricks.

Why Lego style is the perfect mask for hardline messaging

The choice of the Lego aesthetic isn't an accident. In the world of content moderation, platforms like Instagram, TikTok, and X have strict rules against depicting graphic violence or promoting "dangerous organizations." If a state-aligned account posts real footage of a missile strike or a bloody battlefield, the algorithm usually flags it within minutes.

AI-generated bricks bypass these filters.

Algorithms see colorful toys. They don't immediately recognize the political weight of a plastic figure holding a specific flag or standing in front of a recreated site of an explosion. This allows creators to push narratives that would otherwise be banned. It’s a loophole.

Beyond avoiding censors, there’s a psychological layer here. We associate these blocks with play and innocence. When you take a heavy, controversial topic and "toy-ify" it, you strip away the visceral horror and replace it with something digestible. It makes the ideology feel accessible. It’s gamified warfare.

Recreating the slop is surprisingly simple

I wanted to see if I could get the same look using commercially available tools like Luma Dream Machine, Kling, or Runway Gen-2. The results were scarily close to what's being circulated by pro-Iran accounts.

The process usually follows a three-step formula.

  1. Image Generation: You start with a tool like Midjourney or DALL-E 3. You feed it a prompt like "A cinematic wide shot of a Lego version of a military parade in Tehran, sunset lighting, highly detailed plastic texture."
  2. Video Animation: You take that static image and run it through an image-to-video AI. You tell the AI to make the figures move or have a drone fly over the scene.
  3. Sound Design: You overlay a dramatic, nationalistic soundtrack or a grainy voiceover.
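The three-step formula above can be sketched in code. This is a minimal illustrative sketch, not a real client: the prompt template mirrors the one quoted in step 1, while the function name, motion hint, and audio filename are hypothetical placeholders (the commercial tools mentioned are driven through their own web UIs or proprietary APIs, not a shared Python library).

```python
from dataclasses import dataclass

@dataclass
class ClipSpec:
    """Everything a generator needs for one clip."""
    image_prompt: str   # step 1: text-to-image
    motion_hint: str    # step 2: image-to-video
    audio_track: str    # step 3: sound design

def build_clip_spec(subject: str) -> ClipSpec:
    # Step 1: a style-heavy prompt does most of the work.
    image_prompt = (
        f"A cinematic wide shot of a Lego version of {subject}, "
        "sunset lighting, highly detailed plastic texture"
    )
    # Step 2: image-to-video tools only need a short motion instruction.
    motion_hint = "slow drone fly-over, minifigures marching in formation"
    # Step 3: audio is layered on afterward; this filename is a placeholder.
    audio_track = "epic_nationalist_theme.mp3"
    return ClipSpec(image_prompt, motion_hint, audio_track)

spec = build_clip_spec("a military parade in Tehran")
```

The point of the sketch is how little creative input is required: one noun phrase from the operator, and templates handle the rest.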

The AI does the heavy lifting. It handles the lighting, the plastic reflections, and the physics. The "slop" part of the name comes from the fact that these videos often have glitches. Hands merge into torsos. Bricks melt into the floor. But for the purposes of a quick scroll on a phone, those details don't matter. The vibe is what sticks.

The shift from deepfakes to shallow content

For years, we worried about "deepfakes"—perfectly rendered videos of world leaders saying things they never said. While those exist, they’re hard to get right. They require high-quality source material and a lot of polish.

Slopaganda is the opposite. It’s "shallow." It’s meant to be churned out in high volumes. If one video gets taken down, ten more take its place. This is a war of attrition, not a surgical strike.

Western analysts have tracked various Iranian-aligned Telegram channels where these assets are shared before being pushed to broader platforms. They aren't trying to trick you into thinking it's real life. They’re trying to build a brand. By using a consistent visual style, they create a subculture. You start to recognize the "Lego Jihad" aesthetic, and it becomes a meme. Memes are far more infectious than dry press releases from a foreign ministry.

Why this matters for the 2026 information environment

We’re past the point of wondering whether AI will affect politics. It is already one of the primary engines of political messaging. The danger isn't just the misinformation; it's the sheer volume. When the internet is saturated with AI-generated filler, genuine information gets buried.

This isn't just an Iran problem. We’re seeing similar tactics across the globe. However, the Iranian "Lego" trend is unique because it targets a younger, digitally native audience. It meets them where they are, using the visual language of the internet to sell a very old-school form of nationalism.

The tech companies are playing catch-up. They’re trying to develop watermarking systems, but those are easily stripped away. They’re trying to train AI to recognize "AI-ness," but the generators are getting better every day.
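One reason watermarks are easy to strip: many provenance schemes store their label as metadata riding alongside the pixels, and dropping those bytes (or simply re-encoding the file) erases it. Below is a stdlib-only sketch that removes the textual metadata chunks from a PNG. It illustrates the fragility of metadata-based labels in general, not any specific vendor's system; robust schemes embed signals in the pixels themselves, which this would not touch.

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"
# Chunk types that carry metadata rather than pixel data.
METADATA_CHUNKS = {b"tEXt", b"zTXt", b"iTXt", b"eXIf", b"tIME"}

def png_chunk(ctype: bytes, data: bytes) -> bytes:
    """Assemble one PNG chunk: length, type, data, CRC."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def strip_metadata(png: bytes) -> bytes:
    """Copy a PNG, dropping every metadata chunk along the way."""
    assert png[:8] == PNG_SIG, "not a PNG file"
    out, pos = [PNG_SIG], 8
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        end = pos + 12 + length          # length + type + data + CRC
        if ctype not in METADATA_CHUNKS:
            out.append(png[pos:end])
        pos = end
    return b"".join(out)

# A 1x1 grayscale PNG with a fake provenance label in a tEXt chunk.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
labeled = (PNG_SIG
           + png_chunk(b"IHDR", ihdr)
           + png_chunk(b"tEXt", b"Source\x00ai-generated")
           + png_chunk(b"IDAT", zlib.compress(b"\x00\x00"))
           + png_chunk(b"IEND", b""))
clean = strip_metadata(labeled)
```

Twenty lines of code, and the "ai-generated" label is gone while the image itself is untouched. That asymmetry is why labeling alone can't solve the problem.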

How to spot the strings

You can usually identify this type of content by looking for the "uncanny valley" of the plastic. In the viral Iranian clips, the scale is often wrong. A Lego figure might be standing next to a realistic-looking fire, or the shadows won't align with the brick geometry.

More importantly, look at the source. If a video has no clear creator but is being boosted by dozens of "bot-like" accounts simultaneously, it’s a campaign.
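The "boosted by dozens of bot-like accounts simultaneously" pattern can be approximated with a crude heuristic: count distinct accounts sharing the same video inside a tight time window. The sketch below is illustrative only; the five-minute window and ten-account threshold are made-up numbers, and real influence-operation analysis of the kind Graphika or the DFRLab does relies on far richer signals.

```python
from datetime import datetime, timedelta

def looks_coordinated(posts, window_minutes=5, min_accounts=10):
    """posts: list of (account_id, posted_at) tuples for one piece of media.

    Returns True if at least `min_accounts` distinct accounts posted it
    within some sliding window of `window_minutes`. Genuine virality tends
    to ramp up gradually; coordinated boosting lands in a single burst.
    """
    events = sorted(posts, key=lambda p: p[1])
    window = timedelta(minutes=window_minutes)
    for i, (_, start) in enumerate(events):
        # Count distinct accounts inside the window opening at this post.
        accounts = {acct for acct, t in events[i:] if t - start <= window}
        if len(accounts) >= min_accounts:
            return True
    return False
```

A burst of twelve accounts posting within seconds of each other trips the check; twelve accounts posting an hour apart does not.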

Don't let the bricks fool you. This isn't play. It's a calculated attempt to use Western technology to undermine Western narratives, all while hiding behind the aesthetic of a Danish toy company.

If you want to stay ahead of this, start by diversifying your feed. Follow researchers who track state-linked influence operations. Organizations like Graphika or the Atlantic Council’s Digital Forensic Research Lab often break down these campaigns before they hit the mainstream. Stop clicking on the "Lego" videos, even out of curiosity. Engagement—even negative engagement—tells the algorithm to show that content to more people. Treat it like the digital noise it is.

Quick checks for AI slopaganda

  • Check the limbs: AI still struggles with how Lego joints actually move. If the plastic "bends" like skin, it's AI.
  • Look at the background text: AI is notoriously bad at rendering Farsi or Arabic script correctly on signs or banners in the background. It usually looks like gibberish.
  • Verify the music: Most of these clips use the same four or five royalty-free "epic" tracks or AI-generated anthems that sound slightly off-key or repetitive.

The best defense is recognizing the pattern. Once you see the "slop" for what it is, the magic disappears. You’re no longer looking at a clever animation; you’re looking at a cheap, automated attempt to tilt your perspective. Turn off the autoplay and move on.

Ava Campbell

A dedicated content strategist and editor, Ava Campbell brings clarity and depth to complex topics, committed to informing readers with accuracy and insight.