Emerging Challenges

Unmasking Russia's propaganda playbook

A powerful new three-pillar framework now makes it possible to trace, expose, and defeat Russia's expanding disinformation operations.

A 2026 report by NATO exposed the scale and tactics of Russian disinformation operations [NATO/nato.int]


From fake Cartier scandals to Telegram armies and laundered corruption smears, Russia’s networks are flooding the West with disinformation.


Russian propaganda has evolved into a sophisticated cornerstone of Kremlin strategy, designed to manipulate public perception, destabilize democracies, and erode trust in institutions worldwide.

A landmark February 2026 joint report by the NATO Strategic Communications Centre of Excellence (StratCom COE) and Ukraine’s Centre for Strategic Communications exposes the scale and tactics of these operations.

Titled Attributing Russian Information Influence Operations, the study introduces and tests the Information Influence Attribution Framework (IIAF) -- a practical tool that combines technical, behavioral, and contextual evidence to trace campaigns back to their origins.

This framework equips governments, civil society groups, and technology platforms with reliable methods to detect, expose, and counter Russian influence operations more effectively.

Framework in action

Kremlin-aligned actors exploit digital platforms, crafted narratives, and coordinated behavior to hide their involvement.

Primary targets include Ukrainian civilians, neighboring EU countries, and European audiences already sympathetic to Moscow’s messaging.

French authorities have warned of an intensified Russian disinformation blitz targeting Europe, a campaign that reflects Moscow’s broader strategy of exploiting societal divisions within NATO countries.

The IIAF identifies three core evidence categories for credible attribution:

  • Technical evidence: Examines digital footprints such as IP addresses, domain registrations, metadata, hosting infrastructure, and SSL certificates. For example, the domain fondfbr.ru -- linked to Yevgeny Prigozhin’s network -- spread false claims about Ukrainian child deportations. Analysis of its Russian registrar and shared hosting revealed patterns of coordination and deliberate anonymity.
  • Behavioral evidence: Focuses on activity patterns like cross-posting, synchronized amplification, impersonation, and comment flooding. An examination of pro-Kremlin Telegram channels uncovered identical reposting schedules and coordinated engagement, pointing to centralized control.
  • Contextual evidence: Analyzes narrative themes, timing, and alignment with geopolitical events. Pro-Kremlin corruption stories about Ukraine were systematically amplified during key moments, such as President Zelenskyy’s U.S. visits, to exploit public vulnerabilities.
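The behavioral signals above -- cross-posting and synchronized amplification -- lend themselves to simple automated checks. The sketch below is purely illustrative and not drawn from the report's methodology: it flags pairs of channels whose post timestamps consistently fall within seconds of each other, a crude proxy for coordinated scheduling. The channel names, timestamps, and thresholds are invented for the example.

```python
from itertools import combinations

# Hypothetical post timestamps (Unix seconds) per channel.
posts = {
    "channel_a": [1000, 2000, 3000, 4000],
    "channel_b": [1005, 2004, 3002, 4008],  # mirrors channel_a within seconds
    "channel_c": [1500, 2700, 3900],        # unrelated schedule
}

def synchronized_fraction(times_x, times_y, window=30):
    """Fraction of posts in x that have a matching post in y within `window` seconds."""
    matched = sum(
        1 for tx in times_x if any(abs(tx - ty) <= window for ty in times_y)
    )
    return matched / len(times_x)

def flag_coordination(posts, window=30, threshold=0.8):
    """Return channel pairs whose posting schedules overlap suspiciously often."""
    flagged = []
    for a, b in combinations(posts, 2):
        # Require the overlap to hold in both directions before flagging.
        score = min(
            synchronized_fraction(posts[a], posts[b], window),
            synchronized_fraction(posts[b], posts[a], window),
        )
        if score >= threshold:
            flagged.append((a, b, score))
    return flagged

print(flag_coordination(posts))  # → [('channel_a', 'channel_b', 1.0)]
```

Real attribution work layers this kind of timing signal with the technical and contextual evidence described above; no single heuristic is conclusive on its own.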

Real-world case studies

The report applies the framework to documented Russian campaigns, revealing consistent methods and objectives.

One hallmark tactic is narrative laundering -- a three-stage process in which fabricated stories are seeded in fake accounts, amplified across mixed sources, and integrated into seemingly credible media to obscure their origins and boost legitimacy.

A clear example involved false claims that Ukraine’s First Lady, Olena Zelenska, spent $1.1 million at Cartier.

The story began with a fabricated Instagram video (traced to a St. Petersburg student with no Cartier connection), spread through suspicious accounts, and later appeared in Russian-aligned outlets.

Darren Linvill, co-director of Clemson University’s Media Forensics Hub, explains the tactic: “Like money laundering, narrative laundering tries to pass off inaccurate information as legitimate.

By making the bad information look like it’s from an ‘unbiased source,’ it gives the message a higher probability of being believed by a more general public.”

Another case study examined corruption narratives pushed by pro-Kremlin channels and timed to geopolitical events.

These aimed to undermine trust in Ukrainian institutions and weaken international support.

A third example exposed a fake story about clashes between Georgian and Ukrainian soldiers.

Identical text and images appeared across at least 17 Kremlin-linked outlets, creating the illusion of organic amplification.

Sequencing anomalies and source substitution revealed central coordination.
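Reuse of identical copy across outlets, as in this case, can be surfaced mechanically by hashing normalized article text. A minimal sketch, assuming plain-text article bodies are already collected -- the outlet names and strings below are invented for illustration:

```python
import hashlib
import re
from collections import defaultdict

# Hypothetical article bodies scraped from different outlets.
articles = {
    "outlet_one.example": "Georgian and Ukrainian   soldiers clashed near the border.",
    "outlet_two.example": "georgian and ukrainian soldiers clashed near the border.",
    "outlet_three.example": "Unrelated local reporting on regional trade talks.",
}

def fingerprint(text):
    """Hash of whitespace- and case-normalized text, so trivial edits still collide."""
    normalized = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(normalized.encode()).hexdigest()

def group_duplicates(articles):
    """Group outlets publishing identical (after normalization) copy."""
    groups = defaultdict(list)
    for outlet, body in articles.items():
        groups[fingerprint(body)].append(outlet)
    return [outlets for outlets in groups.values() if len(outlets) > 1]

print(group_duplicates(articles))
# → [['outlet_one.example', 'outlet_two.example']]
```

Exact-match hashing only catches verbatim copying; detecting lightly rewritten variants would require fuzzier similarity measures, and establishing who seeded the story first is where the sequencing analysis mentioned above comes in.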

Sustained assault

Russian information operations are not merely attacks on individual nations -- they represent a sustained assault on truth, trust, and democratic values.

As these campaigns grow more sophisticated, structured attribution frameworks like the IIAF provide a critical defense.

By documenting tactics consistently, analysts can expose coordinated inauthentic behavior and trace operational fingerprints across campaigns.

Every information consumer has a role.

Raising awareness, supporting independent fact-checking, and demanding accountability from platforms and policymakers help build societal resilience.

The fight against disinformation is ultimately a fight for democracy itself.
