Spilnota Detector Media
Detector Media collects and documents real-time chronicles of Kremlin propaganda about the Russian invasion. Ukraine has suffered from Kremlin propaganda for decades. Here we document all the narratives, messages, and tactics that Russia has been using since February 17, 2022. Reminder: on February 17, 2022, militants escalated shelling and fighting on the territory of Ukraine, and Russian propaganda blamed Ukraine for these actions.

On February 19, the 1456th day of the full-scale war, our editorial office recorded:

  • Fakes: 2732
  • Manipulations: 816
  • Messages: 775
  • Disclosures: 559
Russian fake, go f*** yourself!

Video fake: a pro-Russian rally with the slogan “Putin is our best friend!” allegedly took place in Ukraine

In early October, two videos allegedly showing a pro-Russian rally in Ukraine circulated actively on social media, particularly on Georgian- and Russian-language Facebook and TikTok accounts. The footage shows people carrying Ukrainian flags and wearing clothing in the colors of the national flag. The videos feature chants such as “Motherland, freedom, Putin!” and “Putin is our best friend!”

However, both videos are fabrications and blatant disinformation. In each case, archival footage of pro-Ukrainian patriotic rallies was overlaid with an altered (fake) audio track. The fake was debunked by fact-checkers from MythDetector.

The investigation by MythDetector experts found that both videos are old and were manipulated through deceptive editing.

First video (Rally on Khreshchatyk):

  • Actual time and place: The footage showing people in similar clothing with Ukrainian flags was filmed on October 4, 2014, on Khreshchatyk Street in Kyiv.
  • What the event really was: It was a Peace March calling for an end to the war, organized by patriotic forces.
  • Original audio: In identical videos published on YouTube in 2014, rally participants were chanting “Ukraine! Ukraine!” and other patriotic slogans. There were no pro-Russian or pro-Putin chants at all.

Second video (Rally in Odesa):

  • Actual time and place: The footage was filmed in Odesa in March 2014 on Lanzheronivska Street, near the building of the Odesa Archaeological Museum.
  • What the event really was: It was a pro-Ukrainian rally in support of Ukraine’s unity and sovereignty.
  • Original audio: In the original 2014 videos, demonstrators were chanting in support of a united and free Ukraine.

Both fabricated videos carried the TikTok username @sasha1111z. This account systematically publishes fake videos about Ukraine in a similar manipulative style, some of which have already been debunked.

The spread of such video fakes is a classic example of how propagandists take authentic footage filmed years earlier and overlay it with a fake audio track of pro-Russian slogans to create the false impression of widespread pro-Russian sentiment among Ukrainian citizens.

A fake about a triple murder in Kraków: Polish police denied that any such crime took place

Messages are circulating on social media (Facebook and TikTok) about a triple murder in Kraków allegedly committed by a Ukrainian. The claim quickly gained traction online, sparking a wave of outrage and speculation. However, the Kraków police officially denied it, calling it fake news. Fact-checkers from Demagog investigated how this disinformation originated.

What happened?

A video circulated on social media showing ambulances with sirens blaring driving through an intersection near the Galeria Krakowska shopping center. The authors of the posts claimed that a triple murder had taken place in Kraków on October 11, 2025, and that it had been committed by a Ukrainian citizen. According to these messages, a man with a semi-automatic pistol attacked a group of people, killing a 21-year-old man and two 19-year-old women. The perpetrator was allegedly arrested and charged with murder with particular cruelty.

A TikTok video on this topic garnered nearly 100,000 reactions, 900 comments, and more than 43,000 shares. In the comments, users expressed outrage, with some even blaming the Ukrainian community and referring to historical narratives and stereotypes. For example, one comment read: “A small Volhynia is slowly beginning”, while another claimed that “they will release him and he will disappear for a while like a grenade. Our governments have been based on Bandera since 1945, and to this day the parliamentary majority are Bandera scum!”

Police response

The Kraków Police Headquarters quickly responded to the spread of these rumors. On its official Facebook page, the police published a statement categorically denying the information about the murder:

“ATTENTION! Check the facts – information about a murder in Kraków is FAKE NEWS! A rumor is spreading online about an alleged triple murder in Kraków committed by a person of Ukrainian nationality. We categorically deny this information! No such incident occurred either in Kraków or in the Lesser Poland Voivodeship”.

The police urged citizens not to trust unverified sources, to check information through official channels, and to refrain from spreading fake news.

Why does this matter?

This case is an example of how disinformation can spread rapidly on social media, fueling panic and hostility. False reports about crimes, especially those allegedly involving foreigners, can reinforce stereotypes and provoke discrimination. According to a report by the Public Opinion Research Center (CBOS), in 2025, 38% of Poles expressed antipathy toward Ukrainians, 8 percentage points more than in 2024 and 21 points more than in 2023.

The story about a “triple murder” in Kraków is a fake that was not confirmed by any official sources.

AI-generated videos as a tool to discredit mobilization in Ukraine

Videos generated with artificial intelligence, including OpenAI’s Sora, are being actively spread on the social media platforms TikTok and X. These videos depict fabricated scenes in which Ukrainian soldiers allegedly serve draft notices to infants, check the documents of pregnant women, or clash with civilians. The Center for Countering Disinformation has warned about the spread of such AI-generated videos.

What do these videos show?

The videos created using Sora are presented in a humorous style, but their content is aimed at manipulating public opinion. They play on issues that are sensitive for Ukrainians, such as mobilization, the war, and the defense of the state. For example, they depict scenes in which soldiers supposedly hand draft notices to infants or argue with civilians.

Such materials are often marked with the “Sora” logo, indicating their artificial origin. Sora is a generative artificial intelligence system capable of creating realistic videos based on text prompts. Although these videos may look convincing, they are entirely AI-generated.

The goal of the disinformation

Under the guise of humor, these videos promote anti-Ukrainian narratives aimed at undermining trust in the mobilization process. Such actions are part of an information war intended to sow panic, create divisions in society, and weaken support for the Armed Forces of Ukraine.

Andrii Pylypenko, Lesia Bidochko, Oleksandr Siedin, Kostiantyn Zadyraka, and Oleksiy Pivtorak are collaborating on this chronicle. Ksenia Ilyuk is the author of the project.