Spilnota Detector Media
Detector Media collects and documents a real-time chronicle of Kremlin disinformation about the Russian invasion. Ukraine has suffered from Kremlin disinformation for decades. Here we document all the narratives, messages, and tactics that Russia has been using since February 17, 2022. Reminder: on February 17, 2022, militants intensified shelling and fighting on the territory of Ukraine. Russian propaganda blames Ukraine for these actions.

On 21 January, the 1,062nd day of the full-scale war, our editorial office recorded:

2644
Fake
796
Manipulation
757
Message
544
Disclosure
Russian fake, go to ***!

Fake: Fake cover of The Nation magazine

Propagandists are circulating an image on social media that allegedly represents the cover of The Nation magazine with the headline: “Unequal duel. Volodymyr Zelenskyi failed to win - everyone is talking about Putin again. Including Zelenskyi himself”. However, this is not true.

This was reported by experts from the VoxCheck project. They found that the actual cover of the January 2025 issue of The Nation looks completely different. All issues and their covers are available in the Archive section of the magazine's website, but the one being shared online is not among them, neither on the official website nor on the publication's social media. The fake cover features images of Volodymyr Zelenskyi and Vladimir Putin, along with the publication date, January 2025. However, according to the editorial team, this issue was finalized for publication by the end of 2024, and its cover does not reference the presidents of Ukraine or Russia.

The dissemination of a fake magazine cover with a critical headline aims to undermine Volodymyr Zelenskyi's reputation, portraying him as a weak leader unable to achieve victory or effectively engage in political confrontation. Such fabricated materials also attempt to suggest that international attention on Ukraine is fleeting or insignificant, contrasting it with a supposed constant focus on Russia and its leader, Vladimir Putin. Spreading fake content resembling authoritative Western publications helps create the impression that Ukraine is not receiving the necessary support from its partners or that the international community is skeptical of its leadership. This could affect the morale of Ukrainians and their allies. Propagandists may use such fakes to create the perception that global attention is fixated on Putin rather than Zelenskyi or Ukraine, thereby reinforcing the notion that Ukraine's struggle for independence lacks sufficient backing or prospects.

Fake: Ukrainian soldiers allegedly stole a washing machine from a house in the Kursk region

Russian propagandists circulated a post on pro-Russian social media, purportedly from the charity foundation Lviv-Opir. The post claimed that soldiers from the 225th Separate Assault Battalion stole a washing machine from a house in the Kursk region. However, this is photo manipulation.

Experts from the StopFake project investigated the claim and found that the screenshot of the post was fabricated. They also located the original post on the Facebook page of the Lviv-Opir charity foundation. In the original, it is stated that Ukrainian soldiers received a washing machine from the Rak family. The foundation not only organizes collections but also facilitates the delivery of essential items to soldiers from concerned citizens.

In the fake publication, the Russians alleged that the washing machine was taken by fighters from the 225th battalion who were stationed in the Kursk region. However, the original post mentions servicemen from the Kharkiv region without specifying the unit. The legitimate post does not mention the theft of equipment; instead, it expresses gratitude for the assistance provided. Furthermore, the timestamps of the posts differ: the original was published on December 4, 2024, at 01:04 AM, while the fake appeared at 02:04 AM the same day. This time difference aligns with the time zones of Ukraine and Russia, suggesting that the fake was likely created by Russian users.

The spread of fake reports about thefts or misconduct by Ukrainian soldiers aims to discredit the Ukrainian Armed Forces. Such narratives create a negative image of the Ukrainian military both domestically and internationally, undermining support for their actions among citizens and allies. This disinformation may also seek to foster internal divisions in Ukraine, sowing doubt and mistrust among the population toward their military and volunteers. It can impact morale, trust in the government and the army, and reduce support from international partners.

This type of disinformation reinforces Russia's aggressive narrative and its denial of truth, attempting to portray Ukraine as a country where everything, from the government to the military, is uncontrolled or immoral. It serves to justify Russia’s aggressive policies and actions against Ukraine. Russian propagandists also use such disinformation to depict Ukraine as lacking unity, suggesting that Ukrainian soldiers engage in theft or behave uncontrollably. This could be an attempt to portray the situation in Ukraine as chaotic and undisciplined. Overall, such disinformation seeks to create misunderstandings, weaken trust in Ukraine and its military, and form a negative image of the country and its representatives.

Fake: British professor allegedly called Zelenskyi a modern vampire

Russian propagandists spread a video on social media, particularly on several pro-Russian anonymous Telegram channels, claiming that a professor from the University of Bristol spoke about the characteristics of mythical archetypes. According to the video, Volodymyr Zelenskyi is allegedly a modern embodiment of a vampire. However, this is fake.

Experts from the VoxCheck project debunked the video. They found that the original video has a different format and content, and that Professor Ronald Hutton does not mention Volodymyr Zelenskyi. The original audio track was altered using artificial intelligence: a check with the Hive Moderation tool indicated a 99% probability that artificial intelligence was used to create the video. The fake video also includes the university's logo, which is not present in the original. The video on the university's TikTok page contains English subtitles, but in a different format. The forgery combines illustrations of vampires with photographs of Volodymyr Zelenskyi, whereas the original video mainly shows the professor speaking, without images of mythical characters.

Propagandists spread such disinformation for several main reasons. Spreading fake videos portraying state leaders in an invented or negative light aims to diminish their authority. Such manipulations can create the image of a leader who is allegedly dangerous or inadequate, influencing public opinion both in Ukraine and abroad. By associating leaders with mythical archetypes, propagandists aim to create fear or disgust toward certain individuals or ideas. This can undermine trust in governments, organizations, or international partners supporting Ukraine. Using artificial intelligence to create fake materials allows for effective manipulation of reality. Since fake videos look quite convincing, they can make people believe that an authoritative scientist or figure actually said what is being claimed.

Orest Slyvenko, Artur Koldomasov, Vitalii Mykhailiv, Oleksandra Kotenko, Oleksandr Siedin, Kostiantyn Zadyraka, and Oleksiy Pivtorak are collaborating on this chronicle. Lesia Bidochko serves as the project coordinator, while Ksenia Ilyuk is the author of the project.