TikTok and Telegram: War fakes — risks in 2026
The year 2026 is likely to bring more realistic disinformation about the war — with videos, synthetic voices, and rapid replication of a single message across multiple platforms. The main threat is mass coordinated drops: when dozens of accounts simultaneously amplify the same narrative, turning it into a background reality in a very short time.
The year 2025 showed that TikTok provides emotional reach for such operations, while Telegram serves as the infrastructure for launching and coordinating them. What this means for 2026 and how to reduce personal risks — in Frontliner’s analysis.
Synthetic content will become more widespread. AI tools are already actively used to discredit, distort context, and create videos allegedly recorded by eyewitnesses. In 2026, the volume of such materials will increase, and their quality will continue to improve.
There will likely be more multilingual drops. A single storyline will be distributed simultaneously in Ukrainian and other languages to exert pressure both on domestic audiences and on international ones.
The Telegram–TikTok linkage will become tighter. Telegram will remain the space where fakes are launched, while TikTok will serve as the channel for mass emotional reach and normalization of the desired interpretation.
TikTok 2026: A focus on emotion and simple explanations
TikTok will continue to push content toward radically simplified conclusions: “betrayal” or “everything is lost.” This is a convenient format for manipulation around mobilization, casualties, fabricated internal conflicts, and trust in the military and the state.
Monitoring by the Institute of Mass Information, for example, shows that fakes and manipulations on TikTok are systemic rather than accidental.
Telegram in 2026: Infrastructure for coordinated drops
In 2026, Telegram is most likely to remain the platform with the highest level of anonymity and the fastest replication of material between channels. Its role is not only to amplify content, but also to host the primary fake files, screenshots, and video fragments that are later presented as “evidence.”
At the same time, regulatory pressure on the platform within the EU under the Digital Services Act will continue. However, this does not guarantee equal impact on content in every country or for every type of channel.
What carried over from 2025 into 2026?
The year 2025 set three directions that will most likely intensify in 2026:
- demoralization (narratives that “everything is bad and will only get worse”);
- attacks on trust (undermining the reputation of institutions, the military, and volunteer initiatives);
- the external dimension (adapting messages for foreign audiences).
How to protect yourself from information threats on social media
- Pause before reacting: if content triggers sharp anger or fear, this is a reason to verify — not to share.
- Check the author and the original source: anonymous channels and accounts claiming “insider information” are a high-risk zone.
- Seek two reliable confirmations: cross-check with official statements, established media outlets, or fact-checking projects.
- Be skeptical of video “evidence”: in 2026, more videos may be generated or edited; pay attention to the date, location, and repeated uploads.
- Practice feed hygiene: unsubscribe from “constant alarm” channels, limit reposts from Telegram into chats, and control recommendations on TikTok.
The biggest risk of 2026 is not new fakes, but a new level of credibility: synthetic videos, mass coordinated drops, and multilingual scaling. If 2025 taught the ecosystem to work with speed and format, then 2026 will focus on visual persuasiveness.
In 2026, the key threat is coordinated narratives that look convincing and spread rapidly because they are supported by algorithms, anonymity, and increasingly accessible AI tools. The year 2025 demonstrated the basic mechanics of these operations on TikTok and Telegram, while 2026 will likely make them broader and more technologically advanced. This raises the bar for critical thinking and information verification.
Adapted: Kateryna Saienko
***
Frontliner wishes to acknowledge the financial assistance of the European Union through its Frontline and Investigative Reporting project (FAIR Media Ukraine), implemented by Internews International in partnership with the Media Development Foundation (MDF). Frontliner retains full editorial independence, and the information provided here does not necessarily reflect the views of the European Union, Internews International, or MDF.