Guest Post: Why the Authoritarian Playbook Works in Information Warfare, and What to Do About It
The West has clung to the illusion that war begins with tanks crossing borders and missiles filling the sky — and ends with peace treaties. But modern conflict starts much earlier and lasts far longer.
After almost half a century of the Cold War, Western countries breathed a sigh of relief and began downsizing departments, even shutting down entire units responsible for efforts on the cognitive battlefield. Meanwhile, adversaries like Russia and China kept going, learning, and adapting. Now we’re witnessing the payoff of their decades-long investments.

For them, the war never ended — it simply moved to a more covert domain: the cognitive one.
Losing ground against a rising threat
At almost every conference I attend, I hear the same thing: Western capabilities to respond to cognitive threats from Russia and China have significantly weakened. The recent Institute for the Study of War report states that cognitive warfare is Russia’s way of war, governance, and occupation. There’s a pragmatic logic behind it: it's how Russia compensates for its technological backwardness compared to the West.
Occupation didn’t happen overnight
In 2014, when Russia invaded Ukraine and occupied Crimea, it was the perfect embodiment of Sun Tzu’s classic quote: “The supreme art of war is to subdue the enemy without fighting.”
Within days, Russia captured 85% of Ukraine’s fleet, with no resistance, not a single bullet fired. Moreover, half of the Ukrainian military personnel on the peninsula voluntarily sided with Russia.
That didn’t happen out of nowhere. It was the result of a well-planned, years-long campaign. For decades, Russia had been spreading its narratives inside Crimea, gradually undermining trust in Kyiv and building local loyalty.
"We are the Black Sea Fleet, not Ukrainian. What does Ukraine even have to do with it?"
That was the narrative Russia cultivated. They eroded the idea of Ukrainian sovereignty by pushing local identity and pro-Russian narratives. Add to that local corruption, unified messaging, the Russian language, Russian media, joint military drills — and the picture becomes clear. There was no need for tanks or ships to achieve this.
One of the most common mistakes is still perceiving information warfare as "soft power" — public diplomacy at best. But this is a fatal misjudgment. Cognitive security is not adjacent to national security — it is national security.
Ukraine is an open-air laboratory for Russia’s long-term authoritarian strategy of cognitive warfare. Strategic corruption, energy dependence, cultural, economic, religious, and of course, information dependencies — all of these are elements of the war.
Ukraine’s frontline experience shows that information warfare isn’t abstract. It shapes public sentiment, troop morale, and geopolitical outcomes.
The Cognitive War Machine
It’s been estimated that Russia spends anywhere from $2 billion to $6 billion annually on global information operations, depending on whose assessments you are reading. (Russian sources tend to reveal years-old or lower figures; Ukrainian and other international OSINT sources have estimated higher.)
Over the years, Russia’s tactics have evolved. Info ops are no longer outdated propaganda leaflets; they are AI-powered, scalable, multilingual campaigns — strategic ones, aimed at eroding trust, dividing societies, shaping opinions, and driving actions. China goes even further, building its own influence platforms (like TikTok) and investing heavily in large language models (LLMs).
Russia’s cognitive playbook is based on exploiting emotional triggers. In Ukraine, their key tactics include:
Hopelessness and isolation
They promote the idea that Ukraine is isolated and unsupported: “The West abandoned us”, “They don’t care about ordinary people.”
These narratives, pushed via fake media and pro-Russian channels (Telegram groups, bots in comment sections, etc.), aim to foster resentment toward allies and create the illusion that Ukraine stands alone. (Domestically, Russia uses the same propaganda machinery to stoke support among its own population.)
Fear and panic
They spread fake alerts and deepfake videos to generate a sense of imminent danger: “Endless threats call for immediate evacuation”, “Surrendering to Russia is the safest option.”
As an example, in April 2024, Kharkiv residents received fake text messages, allegedly from government sources, urging them to flee due to an impending Russian encirclement. These messages were part of a psychological operation aimed at inciting mass panic and disrupting societal stability.
Grief and Despair
They target families of soldiers with false narratives: claims that Ukrainian commanders hide casualty figures, alleged private messages between military spouses suggesting withheld compensation.
For soldiers, the disinformation goes deeper: “Your commanders have left you to die”, “This war is unjust — you're just cannon fodder for Western money”, “Entire battalions are being wiped out, and no one cares.”
Together with the general societal anxiety amid the full-scale war, these seeded narratives are the small cuts of a bleeding country. How many such cuts can a society withstand when grief, despair, and distrust are already rising on their own?
It’s not only Ukraine that is vulnerable — the entire democratic world is. The core issue is twofold: a deep underestimation of the importance of informational warfare, and an unwillingness to cross traditional red lines in defense of the cognitive front. But that’s a false balance — and one that must be challenged.
Ukraine’s Center for Countering Disinformation, backed by the EU Advisory Mission, just released a bluntly titled handbook, Weapons of Information Warfare. It’s a field guide to how today’s influence ops really work: fear-baiting, bot swarms, deepfakes, fake experts, manipulated values. Designed for frontline resilience, it’s exactly the kind of practical intel EU policymakers, NGOs, and media should study and adapt to stay resilient in a hostile information space.
Red line debates
In June 2017, German police first raided the homes of 36 people accused of hate speech on social media. Years later, those raids are still criticized in some foreign media as potential violations of free speech. Yet German authorities were clear: if you can’t say it standing in a town square without prosecution, you shouldn’t be able to say it online either. In 2025, a nationwide operation targeting suspected authors of online hate speech and incitement was launched, with more than 170 operations planned and coordinated by the Federal Criminal Police Office (BKA). The suspects are accused of incitement to hatred and insulting politicians, among other things; the investigations focus on far-right statements made online.
Virtual space is not a loophole. The consequences should be the same, along with clear rules for transparency around toxic information spreaders, accountability, and imposing costs on adversaries.
The good news: the playbook is known
While Russia and others have spent years investing in information warfare and offensive strategies, their playbook remains largely the same. What’s changed are the tools — generative AI, for example, has made the production and delivery of disinformation faster and easily scalable. The question is: how can democracies fight back without compromising their values?
Make the adversary’s job harder
The first step is to raise the cost of running information operations. That means actively dismantling adversarial infrastructure — from bot farms and crypto-mining networks to anonymous propaganda accounts. Cyber operations should be directed at the coordination hubs behind these campaigns, while de-anonymization and digital counterintelligence must be used to track these hostile networks, expose the people behind them and make it harder for them to act with impunity.
Build internal resilience
Resilience starts at home — through public awareness, media literacy, and strong, trusted local communication networks. Each NATO country must audit its own societal vulnerabilities: which regions or communities are most susceptible to disinformation, where adversaries can gain leverage with minimal effort, and what real social grievances or injustices are being exploited to insert propaganda. Because disinformation always begins with a kernel of truth — and a real pain point is what makes the lie persuasive. These vulnerabilities need to be addressed not with slogans or bureaucratic reports, but with targeted, human communication and action.
Talk to people the way they listen
Strategic communication is not about issuing press releases, but meeting people where they are, using the platforms and voices they trust. That means engaging local influencers, community leaders, and faith-based networks.
Latvia and Lithuania proved this approach during the COVID-19 pandemic, mobilizing local creators and trusted community figures to explain the importance of vaccination in a language their communities would actually hear.
If your community listens to a TikTok blogger — she’s part of your civil defense system. When people lack trusted voices, the vacuum is quickly filled by the adversary. In information warfare, everyone becomes a communicator and must be trained accordingly.
Cognitive resilience goes far beyond recognizing fake news: it means rebuilding trust between people, institutions, and society itself. Democracies must act quickly, not just to regulate, but to adapt strategically. Building sovereign LLMs in Europe is crucial, as they provide the foundational infrastructure for next-generation technologies. Strategic communication must become proactive again, focused on reducing fear and rebuilding trust across all levels — from hyper-local communities to entire regions. And that demands serious, systemic investment — in media, in education, in communication infrastructure, and in localised, people-centred messaging.
Olha Danchenkova is a strategic communications professional and the co-founder and CCO of Calibrated, a Ukraine-born communications agency with a global focus, working with clients in defence tech, cybersecurity and cognitive security. She regularly shares lessons on Ukrainian cognitive resilience and strategic communications as a speaker (Cognitive Warfare Course by NATO CoE, EUvsDisinfo Conference) and author (TechPolicy, Euronews). Danchenkova previously led the Brand, Marketing and Communications department at EY in Ukraine, responsible for implementing the company’s marketing/PR strategies and go-to-market activities. She is also the co-founder of the PR Army NGO, which connects Ukrainian war witnesses and experts with international media.