Spilnota Detector Media
Detector Media collects and documents a real-time chronicle of Kremlin propaganda about the Russian invasion. Ukraine has suffered from Kremlin propaganda for decades. Here we document all the narratives, messages, and tactics Russia has been using since February 17th, 2022. Reminder: the escalation of shelling and fighting by militants on the territory of Ukraine began on February 17th, 2022. Russian propaganda blames Ukraine for these actions.

On 21 February, on the 1458th day of the full-scale war, our editorial office recorded:

2732
Fake
816
Manipulation
775
Message
559
Disclosure
Русскій фейк, іді на***! (“Russian fake, go to ***!”)

Fake about an “obese Ukrainian officer”: AI-generated content is being used to discredit aid to Ukraine

On social media, particularly in the Polish segment of Facebook, alleged photos of an overweight man in a Ukrainian military uniform are being actively shared. The authors of these posts claim that he is an officer responsible for mobilization in the Ukrainian army, and that his appearance supposedly proves that international aid “does not reach those who truly need it”. Such posts have attracted hundreds of comments and shares. In reality, no such person exists, and the image was generated by artificial intelligence with the aim of fueling anti-Ukrainian sentiment. The fake was debunked by Polish fact-checkers from Demagog.

The image comes from a video that originally appeared on TikTok. A closer look at the clip reveals signs of AI generation: for example, the chair blends into the background, the shadows on the clothing “move” unnaturally, and the positioning of the character’s hands looks artificial. In the video, the supposed “officer of the Zhytomyr Territorial Recruitment Center” says, “Our team is against a ceasefire”. At the same time, the audio is out of sync with the movements of the lips in the footage.

To confirm this, Demagog fact-checkers analyzed the frame using the specialized tools Hive AI and Sightengine – both indicated a probability of over 90% that the content was generated by artificial intelligence.
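The kind of check Demagog performed can also be sketched programmatically: Sightengine, for instance, exposes an HTTP API for detecting AI-generated images. The snippet below is a minimal illustration only; the `genai` model name, the `/1.0/check.json` endpoint, and the response shape in the sample are assumptions based on Sightengine's public documentation and should be verified before use.

```python
# Minimal sketch of querying an AI-image-detection API and applying
# the >90% threshold mentioned above. Endpoint, parameter names, and
# response shape are assumptions, not verified API details.
import json

API_URL = "https://api.sightengine.com/1.0/check.json"  # assumed endpoint


def build_request_params(image_url: str, api_user: str, api_secret: str) -> dict:
    """Query parameters for checking a hosted image (assumed parameter names)."""
    return {
        "url": image_url,
        "models": "genai",       # assumed model name for AI-generation detection
        "api_user": api_user,
        "api_secret": api_secret,
    }


def is_likely_ai_generated(response: dict, threshold: float = 0.9) -> bool:
    """Apply the 90% cut-off used by the fact-checkers to a parsed response."""
    score = response.get("type", {}).get("ai_generated", 0.0)
    return score >= threshold


# Illustrative response in the assumed shape (not real API output):
sample = json.loads('{"status": "success", "type": {"ai_generated": 0.93}}')
print(is_likely_ai_generated(sample))  # a 0.93 score clears the 0.9 threshold
```

In practice the request would be sent with an HTTP client and valid credentials; the point here is only that a single probability score, compared against a threshold, is what “over 90%” refers to in the fact-checkers' report.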

The TikTok account that posted this video is filled with similar content of a pro-Russian and anti-Ukrainian nature. The profile description contains a link to a closed Telegram channel called “MATRYOSHKA”. When attempting to join it, users are asked provocative questions, such as “Who does Crimea belong to?”, which indicates the channel’s propagandistic nature.

Posts featuring this image gained significant traction: one of them received more than 1,000 reactions and over 200 shares. In the comments, many users perceived the photo as real. One commenter wrote: “This person is sick, and only people like this serve in the Ukrainian army, because the healthy and strong are in Poland”. Another added: “This war is strange—they stuff themselves with food, relax at resorts, drive luxury cars, carry money in shopping bags, get positions without rights, and want to be in our government”.

Such fakes are aimed at undermining trust in Ukraine and spreading anti-Ukrainian narratives within Polish society.

A fake AI-generated video of a “Ukrainian soldier” was created using the face of a Russian blogger

In early November, a video circulated in the Georgian segment of Facebook showing a young man in military uniform. He is crying, claiming that he is being forcibly sent to the front, that he does not want to die, and urging viewers to share the video. The accompanying caption reads: “Clowns are forcing young people to fight”. Many commenters perceived the person in the video as a Ukrainian soldier. This fake was debunked by Georgian fact-checkers from Myth Detector.

In reality, this is an AI-generated video containing a manipulative appeal related to military assistance. It features errors characteristic of artificial intelligence, and its original source is a TikTok account that had previously uploaded similar fake content.

The video shows technical flaws typical of AI-generated material. The speaker’s face and tears appear artificial: the image is overly smooth, and the tears on his chin and nose shine unnaturally. If you look closely at the glasses, in some places they seem to merge with the face.

In addition, there is a patch on the uniform where the soldier’s surname should be written, but instead it displays blurred, unreadable symbols – a common issue in AI-generated imagery.

According to the caption, the video originated from the TikTok account fantomoko, which has since been blocked. Myth Detector fact-checkers analyzed similar videos previously shared by this user. For example, in one AI-generated video that circulated in Russian- and Georgian-language communities, a “Ukrainian soldier” claimed he was 23 years old and had been forcibly mobilized.

As of November 4, the fantomoko account contained several similar pieces of content. Some depicted “Ukrainian soldiers”, while others showed figures wearing uniforms with a Russian flag. The themes varied – from tearful complaints about being forced into war to a scene in which a person holds a Ukrainian flag near the Kremlin.

These videos were also investigated by the Italian fact-checking platform Open, which identified the person whose face was used for the AI generation. It turned out that one of the “characters” whose face most frequently serves as a template is a Russian video blogger with the nickname kussia88, known on YouTube as KASHA STREAM. A comparison of facial features confirms that it was his likeness that was used in the fake video.

Russian propagandists and networks affiliated with them spread such AI-generated fakes to manipulate public opinion, especially in countries where there is potential skepticism about supporting Ukraine.

Andrii Pylypenko, Lesia Bidochko, Oleksandr Siedin, Kostiantyn Zadyraka, and Oleksiy Pivtorak are collaborating on this chronicle. Ksenia Ilyuk is the author of the project.