Detector Media collects and documents a real-time chronicle of Kremlin propaganda about the Russian invasion. Ukraine has suffered from Kremlin propaganda for decades. Here we document all the narratives, messages, and tactics that Russia has been using since February 17th, 2022. Reminder: on February 17th, 2022, militants escalated shelling and fighting on the territory of Ukraine, and Russian propaganda blames Ukraine for these actions.
On 31 December, the 1406th day of the full-scale war, our editorial office had recorded: 2732 fakes, 816 manipulations, 775 messages, and 559 disclosures.
Americans killed a child in Luhansk: a former Polish presidential candidate spread a fake
Former Polish presidential candidate Seb Ross posted a claim on social media alleging that “Americans killed a seven-year-old Russian girl in Luhansk”. As “evidence”, he attached a photograph of a girl.
In reality, the image used by Ross is a stock photo – it even retains the stock agency's watermark. Therefore, his claim has no connection to real events.
Seb Ross is known for his pro-Russian views. He has previously stated that, if he became president of Poland, he would visit Moscow, and he has also questioned the need to criticize Russia. The politician has repeatedly spoken in favor of Poland leaving the European Union.
Ross published the fake about a “Russian girl killed by Americans” after reports that a seven-year-old girl, Amelia Grzesko, who held Polish citizenship, was killed as a result of a Russian shelling of Ternopil. It is likely that in this way he is trying to deflect responsibility from Russia for the deaths of children and create a false “mirror” narrative.
Fake: Ads in New York promote bets on Zelenskyy’s “escape” from Ukraine
Russian propaganda media, Telegram channels, and bots on the social network X are spreading a video allegedly filmed in New York’s central square, Times Square. In the clip, one of the billboards displays an “advertisement” for the Polymarket platform, supposedly offering users a chance to bet on which country President Volodymyr Zelenskyy would allegedly flee to after Ukraine’s “capitulation”. This video is fake.
There is no such bet on Polymarket’s official website as the one attributed to the platform by Russian propagandists. Polymarket does accept other types of wagers, including on possible peace agreements or a ceasefire, but the scenario shown in the video does not exist on the platform.
In addition, the fake video features the logo of the American media outlet USA Today, but there is no such video on the publication’s official website or social media accounts. The propaganda material also claims that the clip is being broadcast on a billboard located on a building belonging to the Regal cinema chain. In reality, this screen shows only trailers for new films and posters for movies currently playing at the theater.
The original source of the fake was Ukrainian collaborator Oleh Tsaryov, who is hiding in Russia. In many versions of the fabricated video, a watermark with his name is even visible.
Thus, the propagandists edited the clip using video-editing software and replaced the real advertisement on the billboard, creating yet another piece of disinformation.
Ukraine is recruiting Filipinos for the war: the fake has been debunked
The Ministry of Foreign Affairs of Ukraine and the German Embassy in the Philippines have firmly denied claims by Russian Foreign Ministry spokesperson Maria Zakharova that Filipino citizens are allegedly being recruited to take part in the war against Russia on Ukraine’s side.
Zakharova claimed that an alleged recruitment scheme was operating in the Philippines through the U.S. company RMS International, which supposedly promised earnings of $5,000, training by U.S. instructors, and the issuance of Schengen visas at the German Embassy in Manila.
MFA spokesperson Heorhii Tykhyi told the Center for Strategic Communications that these claims are groundless fabrications.
“These statements have no factual basis whatsoever and do not correspond to reality. Ukraine and the Philippines have strong, friendly, and dynamic bilateral relations based on mutual respect, trust, and commitment to international law. This was confirmed just yesterday during a telephone conversation between the Presidents of the two countries,” the MFA representative said.
The ministry emphasized that such statements are another element of Russia’s systematic disinformation campaign. Previously, the Kremlin spread similar fakes, including claims about alleged “recruitment” in South Korea. Now Russian propaganda is targeting countries in Southeast Asia.
Russians fabricated a fake about recruiting Koreans into the Armed Forces of Ukraine
Kremlin media and social media users are spreading claims that Ukraine is allegedly recruiting citizens of South Korea to serve in the Armed Forces of Ukraine. As “evidence”, the propaganda shows photos of posters supposedly put up by the Ministry of Foreign Affairs of Ukraine on the streets of Seoul.
The propaganda materials claim that the ads call on Koreans to join the Ukrainian army and promise a “special right to obtain Ukrainian citizenship”. Propagandists assert that this allegedly indicates “panic” in Kyiv.
The author of the circulated photographs is said to be Alan Kellow, a contributor on the Medium platform, where anyone can publish. He claimed that he had allegedly seen such posters while walking around Seoul.
The photos show contact details belonging to the Embassy of Ukraine in the Republic of Korea. At the same time, there is no information on the embassy’s official resources about recruiting foreigners. Fact-checkers contacted the embassy for clarification and received a response confirming that these materials are fake.
“We inform you that neither the Ministry of Foreign Affairs of Ukraine nor the Embassy of Ukraine in the Republic of Korea distributed such leaflets,” the diplomatic mission emphasized.
The Ministry of Foreign Affairs of Ukraine also confirmed that the posters are fake and are part of another Russian disinformation campaign.
In response to an inquiry from the Center for Strategic Communications, MFA spokesperson Heorhii Tykhyi stated that Ukraine has no connection whatsoever to the distribution of such materials, and that similar provocations have recently been recorded in other countries as well. According to him, this indicates a systematic attempt to discredit Ukraine abroad.
Chipping and “parties of shame”: propagandists spread new fakes about the Armed Forces of Ukraine
Russian propaganda media and Telegram channels are spreading an edited video allegedly taken from a broadcast of the Ukrainian TV channel “Kyiv”. In the clip, a man presented as a serviceman proposes “chipping” all mobilized Ukrainian men in order to prevent unauthorized absence from their units. In reality, this is a fabrication.
The propagandists altered the image of the program’s guest using graphic editing tools and overlaid a fake audio track. The fake was shared, among others, by the Russian so-called “opposition figure” Yuliya Latynina.
In the original news broadcast on November 11, the host was speaking with Dmytro Kukharchuk, Deputy Commander of the 3rd Army Corps of the Ground Forces of the Armed Forces of Ukraine. The discussion focused on the issue of unauthorized absence from units and the situation in the Kupiansk direction, but no statements about “chipping” servicemen were made. The fake video features a completely different person.
At the same time, Russian Telegram channels are spreading a photo allegedly from a Ukrainian school, where a so-called “desk of shame” was supposedly installed for children whose parents went AWOL.
This image is also fake – it was generated by artificial intelligence. This is confirmed by AI-detection services such as WasitAI, Reversely, and DecopyAI.
Both fabrications are part of another Russian information and psychological operation aimed at inciting conflict between the Ukrainian authorities, the military, and society.
Another Russian fake about the Ukrainian “Flamingo” missile has been debunked
A video is being actively circulated on social media that allegedly “exposes” the Ministry of Defense of Ukraine for using computer graphics, footage from old World War II films, and recordings of U.S. Tomahawk launches instead of real footage of the Ukrainian FP-5 “Flamingo” cruise missile.
Analysts from the VoxCheck project drew attention to this.
In reality, this is deliberate disinformation. Neither the Ministry of Defense of Ukraine nor the missile’s developer, Fire Point, has ever published any such video presentation.
The fake was created by propagandists themselves: they took a real video by The Wall Street Journal about the characteristics of the “Flamingo”, changed the color scheme from orange to blue, added footage from other sources, and generated the narrator’s voice using artificial intelligence.
Fact check:
There is no such video on the official resources of the Ministry of Defense of Ukraine (website, Facebook).
Fire Point does not have a website or social media pages, so publishing a video on its behalf is impossible.
According to the authors of the VoxCheck report, the audio in the fake video shows clear signs of AI generation: a monotonous, robotic voice, numerous grammatical mistakes, and phrases atypical for a native Ukrainian speaker (“velyko dalnosti” instead of “velykoi dalnosti”, “novoho pokolynna” instead of “novoho pokolinnia”, etc.). Analysis using the Hive Moderation service confirmed a high probability that the voice was synthesized.
Thus, instead of “exposing a fake”, the propagandists themselves created a fake in order to discredit Ukrainian weapons developments.
For three years now, Russian propaganda has been repeating the message: “Ukraine is poor, incapable, everything has been stolen”. When evidence to the contrary appears – new missiles, drones, electronic warfare systems – it destroys this narrative. That is why they have to shout “fake!” and produce so-called “exposés”, so their own audience does not start asking uncomfortable questions. While Ukrainian missiles (“Neptune”, RK-360MC, and now “Flamingo”) are in fact sinking Russian ships and striking Crimea and oil refineries, it is psychologically easier for Russians to convince themselves and their audience that “none of this really exists”, that it is all just “cartoons”.
Russia accuses Ukraine of “chemical attacks on Russian civilians” – the fake has been debunked
Russian propaganda continues a campaign of unfounded accusations against Ukraine over the alleged use of chemical weapons. Analysts from the Center for Countering Disinformation have drawn attention to this.
This time, Kyrylo Lysogorskyi, a representative of the Russian delegation to the Organisation for the Prohibition of Chemical Weapons (OPCW), claimed that the Armed Forces of Ukraine allegedly drop toxic substances from drones and “attack the civilian population of Russia”.
This is yet another fake with no evidence whatsoever. The OPCW and other international bodies have never recorded any cases of Ukraine using chemical weapons. Russia has been repeating these accusations for years, but each time it fails to provide samples, independent examinations, or even coordinates of the allegedly “affected” settlements.
On the contrary, it is Russian forces that systematically violate the Chemical Weapons Convention. Since the start of the full-scale invasion, hundreds of cases have been documented in which drones dropped grenades containing chloropicrin, tear gas, and other choking agents on positions of the Armed Forces of Ukraine. These facts have been confirmed by the OPCW, are being investigated as war crimes, and have become grounds for new sanctions against Russia.
When Russians drop chloropicrin, CS gas, K-51 grenades, and similar munitions on Ukrainian positions, they need a justification: “We weren’t the first – Ukraine started it”. This reduces the likelihood of a tough international response and new sanctions. The domestic audience hears: “Ukrainian Nazis are poisoning peaceful Russians with chemicals!” This fuels hatred and a willingness to fight “until everyone is destroyed”.
Fake about an “obese Ukrainian officer”: AI-generated content is being used to discredit aid to Ukraine
On social media, particularly in the Polish segment of Facebook, alleged photos of an overweight man in a Ukrainian military uniform are being actively shared. The authors of these posts claim that he is an officer responsible for mobilization in the Ukrainian army, and that his appearance supposedly proves that international aid “does not reach those who truly need it”. Such posts have attracted hundreds of comments and shares. In reality, no such person exists, and the image was generated by artificial intelligence with the aim of fueling anti-Ukrainian sentiment. The fake was debunked by Polish fact-checkers from Demagog.
The image comes from a video that originally appeared on TikTok. A closer look at the clip reveals signs of AI generation: for example, the chair blends into the background, the shadows on the clothing “move” unnaturally, and the positioning of the character’s hands looks artificial. In the video, the supposed “officer of the Zhytomyr Territorial Recruitment Center” says, “Our team is against a ceasefire”. At the same time, the audio is out of sync with the movements of the lips in the footage.
To confirm this, Demagog fact-checkers analyzed the frame using the specialized tools Hive AI and Sightengine – both indicated a probability of over 90% that the content was generated by artificial intelligence.
The TikTok account that posted this video is filled with similar content of a pro-Russian and anti-Ukrainian nature. The profile description contains a link to a closed Telegram channel called “MATRYOSHKA”. When attempting to join it, users are asked provocative questions, such as “Whose is Crimea?”, which indicates the channel’s propagandistic nature.
Posts featuring this image gained significant traction: one of them received more than 1,000 reactions and over 200 shares. In the comments, many users perceive the photo as real. One commenter wrote: “This person is sick, and only people like this serve in the Ukrainian army, because the healthy and strong are in Poland”. Another added: “This war is strange—they stuff themselves with food, relax at resorts, drive luxury cars, carry money in shopping bags, get positions without rights, and want to be in our government”.
Such fakes are aimed at undermining trust in Ukraine and spreading anti-Ukrainian narratives within Polish society.
A fake AI-generated video of a “Ukrainian soldier” was created using the face of a Russian blogger
In early November, a video circulated on the Georgian segment of Facebook showing a young man in military uniform. He is crying, claiming that he is being forcibly sent to the front, that he does not want to die, and urging viewers to share the video. The accompanying caption reads: “Clowns are forcing young people to fight”. Many commenters perceived the person in the video as a Ukrainian soldier. This fake was debunked by Georgian fact-checkers from Myth Detector.
In reality, this is an AI-generated video containing a manipulative appeal related to military assistance. It features errors characteristic of artificial intelligence, and its original source is a TikTok account that had previously uploaded similar fake content.
The video shows technical flaws typical of AI-generated material. The speaker’s face and tears appear artificial: the image is overly smooth, and the tears on his chin and nose shine unnaturally. If you look closely at the glasses, in some places they seem to merge with the face.
In addition, there is a patch on the uniform where the soldier’s surname should be written, but instead it displays blurred, unreadable symbols – a common issue in AI-generated imagery.
According to the caption, the video originated from the TikTok account fantomoko, which has since been blocked. Myth Detector fact-checkers analyzed similar videos previously shared by this user. For example, in one AI-generated video that circulated in Russian- and Georgian-language communities, a “Ukrainian soldier” claimed he was 23 years old and had been forcibly mobilized.
As of November 4, the fantomoko account contained several similar pieces of content. Some depicted “Ukrainian soldiers”, while others showed figures wearing uniforms with a Russian flag. The themes varied – from tearful complaints about being forced into war to a scene in which a person holds a Ukrainian flag near the Kremlin.
These videos were also investigated by the Italian fact-checking platform Open, which identified the person whose face was used for the AI generation. It turned out that one of the “characters” whose face most frequently serves as a template is a Russian video blogger with the nickname kussia88, known on YouTube as KASHA STREAM. A comparison of facial features confirms that it was his likeness that was used in the fake video.
Russian propagandists and networks affiliated with them spread such AI-generated fakes to manipulate public opinion, especially in countries where there is potential skepticism about supporting Ukraine.
How Pravda promotes Russian propaganda in Spain
The Russian disinformation network Pravda is increasingly expanding its presence in the global information space, particularly through the Spanish-language segment – and this poses a new challenge not only for Europe but also for technological systems that many consider to be neutral to some extent. This is emphasized in the latest study by the ATHENA project, which focuses on how pro-Kremlin narratives are spreading in Spain and influencing artificial intelligence (AI).
The Pravda network, first identified in April 2022, has become, according to researchers from the ATHENA project, not merely a propaganda tool but a full-fledged disinformation machine. As noted in our article from March 12, 2025, the network comprised more than 150 domains across roughly 49 countries. According to NewsGuard, most of the network’s websites do not produce original content; instead, they aggregate, republish, or translate pro-Kremlin messages from Russian state media, Telegram channels, or official sources.
In the ATHENA report, researchers note that the Spanish-language Pravda integrates into the Spanish-speaking information space the same messages that Russia systematically promotes globally: claims of the “fascization of Ukraine”, the “decline of the EU”, and justifications for the war. All of these texts follow a common structural pattern – they present Russian propaganda materials as “expert opinions” or an “alternative point of view”, while portraying the site itself as a local outlet supposedly “fighting censorship”. This disguise, combined with its targeting of Spanish audiences, makes the network a dangerous instrument of influence.
The ATHENA project also found that, in addition to a general Spanish-language section, versions in Catalan, Basque, and Galician have appeared. All of them systematically disseminate content adapted to local audiences, largely based on Russian state media as well as anonymous Telegram channels. The simultaneous emergence of several language branches was not accidental: the first articles in the Catalan, Basque, and Galician sections were published at the same time and were virtually identical translations. This points to a centralized content factory that automates and scales disinformation for different audiences. The system operates as a multi-level chain: at its core are thousands of Telegram channels that generate the initial pool of materials, followed by aggregators and localized “Pravda” sites that instantly turn these signals into publications. Monitoring by Maldita.es, cited by ATHENA, shows that a large share of the network’s materials consist of reposted Telegram messages.
This disinformation operation has several strategic objectives. First, it aims to expand influence within Spain itself and across the entire Spanish-speaking world, since content published in Spain can easily spread to Latin America. Second, the media network seeks to build up a “propaganda footprint” online. NewsGuard found that one third of the responses produced by leading chatbots echoed narratives promoted by the Pravda network, and in many cases the models directly cited materials from this network, creating a risk that fake sources may be legitimized in AI-generated answers.
The practical consequences of this approach are also visible in the timing of the network’s activity spikes: Pravda synchronously increased its publishing volume during moments of crisis, when public attention was at its peak – for example, during the large-scale power outage that affected Spain and Portugal on April 28, 2025. On those days, the group published hundreds of posts, often hinting at cyberattacks or making outright false claims about the scale of the disruption, thereby amplifying panic or undermining confidence in the ability of state institutions to respond to the crisis.
ATHENA and Maldita.es traced how messages from Telegram appeared on the network’s websites within the first minutes, making these platforms an effective tool for the rapid spread of narratives during periods of uncertainty. A significant share of Pravda’s sources are outlets linked to Russian state propaganda – RT, Sputnik, RIA Novosti, and others. Around 40% of the network’s Spanish-language publications directly cited such sources; the rest mostly originated from Telegram chains that themselves frequently republish or rework material from these platforms.
For Ukraine and for European media, such activity can undermine trust in institutions, sow doubts about the effectiveness of policies or decisions, and act as a technological attack on how information is accessed through modern tools.
Andrii Pylypenko, Lesia Bidochko, Oleksandr Siedin, Kostiantyn Zadyraka, and Oleksiy Pivtorak are collaborating on this chronicle. Ksenia Ilyuk is the author of the project.