Disinformation Intervention Model
Disinformation adapts. Consequences evolve. Paradigms shift. Next-generation interventions emerge.
Read the 2026 Manifesto
The Decade of Disinformation
Scroll through the events that shaped our understanding of information warfare, and the interventions that emerged in response.
The Awakening
Russia's annexation of Crimea combined military action with sophisticated information operations. The West had no playbook.
Terrorist organizations mastered social media for radicalization and recruitment.
Competing narratives emerged instantly. Truth became a battlefield.
If adversaries spread propaganda, counter with truth. Governments established StratCom units and funded independent journalism.
The Election Shock
Russian troll farms, leaked emails, and viral misinformation. "Fake news" entered the mainstream vocabulary.
Verify claims, label them true or false, trust people will update their beliefs. Platforms partnered with fact-checkers at scale.
The Data Scandal
Psychological profiling meets political manipulation. The scale of data exploitation was revealed.
Instead of correcting falsehoods, teach people to recognize manipulation techniques. Games like Bad News were born.
The Infodemic
Misinformation became a public health crisis. Anti-vax content, miracle cures, and conspiracy theories spread faster than the virus.
Deplatform, derank, demonetize. The EU's Digital Services Act emerged. Content moderation became serious business.
The Insurrection
Online conspiracy theories manifested as real-world violence. The stakes of disinformation became undeniable.
Deplatforming came too late. Users migrated to alternative platforms. The fundamental question emerged: who decides what's true?
The Platform Shift
Real-time information warfare. Competing narratives from day one. OSINT communities emerged as truth-seekers.
A philosophical shift in platform governance. Content moderation policies thrown into chaos.
Focus shifts from content to connection. Deep canvassing, street epistemology. People change minds through conversation.
The AI Explosion
AI goes mainstream. The cost of creating convincing content drops to near zero.
Continued retreat from traditional content moderation. Community Notes as the new model.
Parallel information realities form instantly. Different communities see entirely different "facts."
The Global Election Year
US, EU, India, and dozens more. The biggest test of information integrity in history.
Europe's Digital Services Act goes into full effect. New obligations for platforms.
Synthetic media becomes indistinguishable from reality. Audio, video, images - all fakeable.
No One Is Coming
The largest platform abandons third-party verification. Community Notes becomes the only model.
Official channels spread contested claims. Fact-checkers labeled as enemies. The state is no longer the anchor.
Users scatter across platforms. No single town square. Fragmented information ecosystems.
No cavalry is coming. Communities must build their own capacity for distributed verification, local deliberation, and shared coordination.
The question is no longer "who has the answers?"
It's "how do we build the capacity to navigate uncertainty together?"
The Problem with Interventions
Gullible individuals do exist, but they are the exception, not the rule
Most people aren't fooled by obvious disinformation. The challenge is more nuanced.
There are no identical patterns
No two disinformation campaigns are alike. They differ over time, across regions, and in tactics.
There are no silver bullets
No single intervention works for all contexts. What works depends on a context that is constantly shifting.
Six Generations of Interventions
From information campaigns to collective sensemaking
Crimea shocked the West into awareness. If adversaries spread propaganda, counter with truth. If citizens are confused, inform them.
The 2016 elections revealed viral misinformation. If false claims were spreading, verify them, label them, and trust people will update their beliefs.
Letting potential receivers experience how disinformation is created and spread inoculates them against it.
COVID and political violence demanded action. Deplatform, derank, demonetize. With few producers, obstructing them shrinks the problem.
Disinfo thrives in alienated, left-behind communities. People change minds through interaction, not broadcasts. Create safety, autonomy, belonging.
No authority has the answer. Communities must develop their own capacity for collective sensemaking and coordinate responses together.
The Three Shifts
The foundations of external authority have collapsed. Three simultaneous shifts demand a new approach.
AI & Synthetic Content
Post-ChatGPT explosion of 2023. Deepfakes flooded the 2024 elections. Cost of disinformation production: near zero. Verification by individuals: nearly impossible.
Government as Disinfo Source
From Ukraine to domestic politics, official channels now spread contested claims. Fact-checkers labeled as enemies. The state is no longer the anchor.
Platform Collapse
Meta dropped fact-checking (Jan 2025). X abandoned moderation. No platform will save us. Users must organize themselves.
Every previous generation assumed someone else would solve it. That assumption was always flawed.
The Path Forward
Each generation was built on a flawed assumption: that some authority could solve the problem.
The Challenge Ahead
No One Is Coming To Save Us
Disinformation adapts. Consequences evolve. Paradigms shift. The question is no longer "who has the answers?" but "how do we build the capacity to navigate uncertainty together?"
Ready to Learn More?
Read our publications or get in touch to discuss the DIM model.