Spilnota Detector Media

Fake about an “obese Ukrainian officer”: AI-generated content is being used to discredit aid to Ukraine

On social media, particularly in the Polish segment of Facebook, alleged photos of an overweight man in a Ukrainian military uniform are being actively shared. The authors of these posts claim that he is an officer responsible for mobilization in the Ukrainian army, and that his appearance supposedly proves that international aid “does not reach those who truly need it”. Such posts have attracted hundreds of comments and shares. In reality, no such person exists, and the image was generated by artificial intelligence with the aim of fueling anti-Ukrainian sentiment. The fake was debunked by Polish fact-checkers from Demagog.

The image comes from a video that originally appeared on TikTok. A closer look at the clip reveals signs of AI generation: for example, the chair blends into the background, the shadows on the clothing “move” unnaturally, and the positioning of the character’s hands looks artificial. In the video, the supposed “officer of the Zhytomyr Territorial Recruitment Center” says, “Our team is against a ceasefire”. At the same time, the audio is out of sync with the movements of the lips in the footage.

To confirm this, Demagog fact-checkers analyzed the frame using the specialized tools Hive AI and Sightengine – both indicated a probability of over 90% that the content was generated by artificial intelligence.

The TikTok account that posted this video is filled with similar pro-Russian and anti-Ukrainian content. The profile description contains a link to a closed Telegram channel called “MATRYOSHKA”. Users attempting to join are asked provocative questions, such as “Whose Crimea is it?”, which points to the channel’s propagandistic nature.

Posts featuring this image gained significant traction: one of them received more than 1,000 reactions and over 200 shares. In the comments, many users perceive the photo as real. One commenter wrote: “This person is sick, and only people like this serve in the Ukrainian army, because the healthy and strong are in Poland”. Another added: “This war is strange—they stuff themselves with food, relax at resorts, drive luxury cars, carry money in shopping bags, get positions without rights, and want to be in our government”.

Such fakes are aimed at undermining trust in Ukraine and spreading anti-Ukrainian narratives within Polish society.

A fake AI-generated video of a “Ukrainian soldier” was created using the face of a Russian blogger

In early November, a video circulated in the Georgian segment of Facebook showing a young man in military uniform. He is crying, claiming that he is being forcibly sent to the front, saying he does not want to die, and urging viewers to share the video. The accompanying caption reads: “Clowns are forcing young people to fight”. Many commenters took the person in the video for a Ukrainian soldier. The fake was debunked by Georgian fact-checkers from MythDetector.

In reality, this is an AI-generated video containing a manipulative appeal related to military assistance. It features errors characteristic of artificial intelligence, and its original source is a TikTok account that had previously uploaded similar fake content.

The video shows technical flaws typical of AI-generated material. The speaker’s face and tears appear artificial: the image is overly smooth, and the tears on his chin and nose shine unnaturally. If you look closely at the glasses, in some places they seem to merge with the face.

In addition, there is a patch on the uniform where the soldier’s surname should be written, but instead it displays blurred, unreadable symbols – a common issue in AI-generated imagery.

According to the caption, the video originated from the TikTok account fantomoko, which has since been blocked. MythDetector fact-checkers analyzed similar videos previously shared by this user. For example, in one AI-generated video that circulated in Russian- and Georgian-language communities, a “Ukrainian soldier” claimed he was 23 years old and had been forcibly mobilized.

As of November 4, the fantomoko account contained several similar pieces of content. Some depicted “Ukrainian soldiers”, while others showed figures wearing uniforms with a Russian flag. The themes varied – from tearful complaints about being forced into war to a scene in which a person holds a Ukrainian flag near the Kremlin.

These videos were also investigated by the Italian fact-checking platform Open, which identified the person whose face was used for the AI generation. It turned out that one of the “characters” whose face most frequently serves as a template is a Russian video blogger with the nickname kussia88, known on YouTube as KASHA STREAM. A comparison of facial features confirms that it was his likeness that was used in the fake video.

Russian propagandists and networks affiliated with them spread such AI-generated fakes to manipulate public opinion, especially in countries where there is potential skepticism about supporting Ukraine.

Russian propaganda has flooded TikTok with AI fakes about the “encirclement of Kupiansk”

Russian information-psychological operations (IPSO) farms have launched a series of TikTok videos in which AI-generated “Ukrainian soldiers” claim that Kupiansk is fully encircled and that there is a critical shortage of ammunition.

The Center for Countering Disinformation under Ukraine’s National Security and Defense Council has refuted this information, stressing that it is a deepfake. “These videos are fake and were created using AI. The use of generative technologies is easy to detect due to unnatural facial expressions and a stereotyped voice. Another characteristic sign is that the fake ‘soldiers’ mispronounce the name Kupiansk, placing the stress on ‘ya’ instead of the correct ‘u’,” the Center noted.

“As of now, there have been a number of counterattacks in the northern part of the city, and the Russians have been pushed back, but the situation remains unstable. Fighting is ongoing in the city and is changing dynamically. Although it has been possible to repel the Russians, the situation in the city is still difficult,” said Viktor Trehubov, Head of the Communications Department of the Joint Forces Grouping, during a TV broadcast.

The fake video about the “disappearance of half a brigade” was created using AI

A video is actively circulating on social media, particularly on TikTok, in which a woman emotionally claims that her husband, a serviceman, has gone missing and that the command of the Armed Forces of Ukraine is completely covering up the situation. She also alleges that half of the personnel in his brigade have supposedly disappeared. This is a high-quality video fake (deepfake) created using artificial intelligence technologies. The fake was debunked by experts from VoxCheck.

VoxCheck experts conducted a thorough analysis of the video, which confirmed that it is not authentic:

  • Facial expressions and movement: The “soldier’s wife” shows typical signs of AI-generated content. During the speech, almost only the lips move, while the overall facial expression remains unnaturally static.
  • Voice: The audio track sounds overly even and has a characteristic robotic tone, which is common in synthesized speech.
  • Technical confirmation: A check using the specialized tool Sensity.AI confirmed that the clip is AI-generated with a 99% probability. The program also identified a lipsync manipulation – artificially matching lip movements to a synthesized voice.

The TikTok account that posted this fake video is systematically used to spread AI-generated content aimed at discrediting Ukraine’s Defense Forces.

Propagandists rely on the same manipulative themes and visual archetypes to provoke maximum public reaction and distrust toward military command:

  • “Soldiers’ wives complain” about missing servicemen and alleged indifference from commanders.
  • “Elderly people protest” against the presence of military units in their communities.
  • “Servicemen claim” that they were “abandoned” or “betrayed” by their command.

This video is part of a coordinated enemy information operation aimed at demoralizing society, undermining trust in military leadership, and spreading panic by using realistic but entirely fake “testimonies” generated by AI characters.

AI fakes on TikTok: propagandists forged “street interviews” with Ukrainians about mobilization and the end of the war

Fake street interviews are actively spreading on TikTok, allegedly showing ordinary Ukrainians sharing their views on the war, mobilization, or peace talks. Analysts from the Center for Countering Disinformation under Ukraine’s National Security and Defense Council have drawn attention to this.

These videos are created using AI, follow an identical style, and feature “respondents” who express only pessimistic narratives.

In the videos, the characters speak out against mobilization, call the war “pointless”, accuse the authorities of “derailing peace”, and promote ideas about the “unnecessity” of returning the occupied territories.

Checks using deepfake detection services confirm that all of the videos are entirely generated by artificial intelligence. This is yet another case of Russia using AI in its information war against Ukraine.

When a person sees dozens of “ordinary Ukrainians” repeating the same messages, they subconsciously perceive it as a social norm (“everyone thinks this way”). The propagandists’ goal is therefore to show that “everyone around” is already tired, no longer believes in victory, and opposes mobilization – creating an illusion of mass pessimism and isolating those who continue to resist.

A video fake discrediting Ukrainian soldiers and police officers

On social media, particularly on Facebook, AI-generated videos are being actively spread with the aim of undermining the reputation of the Ukrainian Armed Forces, the police, and the Territorial Recruitment and Social Support Centers (TCCs). These video fakes were identified by fact-checkers from MythDetector.

An anonymous Ukrainian-language Facebook account called “Pravda TV” is posting videos that allegedly show Ukrainian servicemen, police officers, and representatives of Territorial Recruitment and Social Support Centers kissing men.

These videos quickly gained traction, but a detailed analysis showed that they were created using artificial intelligence (AI) to manipulate public opinion.

Fact-checkers from MythDetector analyzed several of the clips using the AI detection tool by InVID and identified a number of indicators pointing to AI-generated content:

  • Unnatural lighting: In all the videos, the lighting is even and artificial, lacking natural shadows or reflections, which is typical of generated content.
  • Anomalies in details: On the uniforms allegedly worn by TCC representatives, the abbreviation “TCC” appears in large yellow-and-black letters, which does not match standard official uniforms.
  • Blurred elements: Faces and limbs often look unclear and contain artifacts that would be impossible in real footage shot on a smartphone or camera.

In other videos posted on the same account, the Sora watermark is clearly visible. Sora is a text-to-video generation model created by the U.S.-based artificial intelligence research organization OpenAI (the developer of ChatGPT). The Center for Countering Disinformation has previously warned about the spread of video fakes created with Sora that discredit mobilization, circulating across various social media platforms.

The “Pravda TV” account was created on July 4, 2022. According to Facebook’s “Page Transparency” section, it is operated from the territory of the Czech Republic. It is an anonymous profile that regularly publishes content aimed at criticizing the Ukrainian authorities, particularly the Territorial Recruitment and Social Support Centers. Many of the posts include AI-generated videos, indicating a systematic disinformation campaign.

Such accounts are often part of a broader network that spreads propaganda, disregards facts, and manipulates emotions.

Fake video: Stefanchuk allegedly promised heating only to families whose husbands have “paid their debt to Ukraine”

A fake video is being spread on social media in which the Speaker of the Verkhovna Rada of Ukraine, Ruslan Stefanchuk, allegedly says that in winter, due to gas and electricity shortages, heating will be provided only to families whose husbands have “paid their debt to Ukraine.”

This was flagged by fact-checkers from the VoxCheck project.

In reality, this is a deepfake. Stefanchuk never made such statements. The video was created using artificial intelligence.

A reverse image search on Google showed that the video was actively distributed by pro-Russian resources. No original video featuring Stefanchuk against this background could be found.

A check using the Deepware service confirmed that the video was generated by a neural network.

Signs that the video was generated by AI include:

  • blurred teeth;
  • a mismatch between facial expressions and the audio track;
  • the “plasticine face” effect – facial muscle movements appear overly soft and unnatural.

The fake portrays Ukraine’s leadership (including the parliamentary speaker) as cynical and indifferent to people. This fuels the narrative that a “Kyiv junta is mocking its own population”, which is then echoed in Russian media and in occupied territories.

The fake also preemptively “explains” future blackouts by suggesting that “it’s not us bombing – it’s your own government punishing you for evading mobilization”. In this way, responsibility is shifted from the aggressor to the victim.

Video fake: a pro-Russian rally with the slogan “Putin is our best friend!” allegedly took place in Ukraine

In early October, two videos allegedly showing a pro-Russian rally in Ukraine were actively circulating on social media, particularly on Georgian- and Russian-language Facebook and TikTok accounts. The footage shows people with Ukrainian flags and wearing clothes in the colors of the national flag. The videos feature chants such as “Motherland, freedom, Putin!” and “Putin is our best friend!”

However, this is a video fabrication and blatant disinformation. In both cases, archival footage of pro-Ukrainian patriotic rallies was used, with an altered (fake) audio track added. This video fake was debunked by fact-checkers from MythDetector.

The investigation by MythDetector experts found that both videos are old and were manipulated through deceptive editing.

First video (Rally on Khreshchatyk):

  • Actual time and place: The footage showing people in similar clothing with Ukrainian flags was filmed on October 4, 2014, on Khreshchatyk Street in Kyiv.
  • What the event really was: It was a Peace March calling for an end to the war, organized by patriotic forces.
  • Original audio: In identical videos published on YouTube in 2014, rally participants were chanting “Ukraine! Ukraine!” and other patriotic slogans. There were no pro-Russian or pro-Putin chants at all.

Second video (Rally in Odesa):

  • Actual time and place: The footage was filmed in Odesa in March 2014 on Lanzheronivska Street, near the building of the Odesa Archaeological Museum.
  • What the event really was: It was a pro-Ukrainian rally in support of Ukraine’s unity and sovereignty.
  • Original audio: In the original 2014 videos, demonstrators were chanting in support of a united and free Ukraine.

Both fabricated videos were accompanied by the TikTok nickname @sasha1111z. This account systematically publishes fake videos about Ukraine in a similar manipulative style, some of which have already been debunked in the past.

The spread of such video fakes is a classic example of how propagandists take authentic footage filmed many years ago and overlay it with a fake audio track containing pro-Russian slogans in order to create the false impression that there are supposedly widespread pro-Russian sentiments among Ukrainian citizens.

A fake about a triple murder in Kraków: Polish police denied the involvement of a Ukrainian citizen

Messages are circulating on social media (Facebook and TikTok) about an alleged triple murder in Kraków that was supposedly committed by a Ukrainian. This information quickly gained traction online, sparking a wave of outrage and speculation. However, the Kraków police officially denied these claims, calling them fake. Fact-checkers from Demagog investigated how this disinformation originated.

What happened?

A video circulated on social media showing ambulances with sirens on driving through an intersection near the Galeria Krakowska shopping center. The authors of the posts claimed that on October 11, 2025, a triple murder had allegedly taken place in Kraków and that it was committed by a Ukrainian citizen. The messages said that a man with a semi-automatic pistol attacked a group of people, killing a 21-year-old man and two 19-year-old women. The perpetrator was allegedly arrested and charged with murder with particular cruelty.

A TikTok video on this topic garnered nearly 100,000 reactions, 900 comments, and more than 43,000 shares. In the comments, users expressed outrage, with some even blaming the Ukrainian community and referring to historical narratives and stereotypes. For example, one comment read: “A small Volhynia is slowly beginning”, while another claimed that “they will release him and he will disappear for a while like a grenade. Our governments have been based on Bandera since 1945, and to this day the parliamentary majority are Bandera scum!”

Police response

The Kraków Police Headquarters quickly responded to the spread of these rumors. On its official Facebook page, the police published a statement categorically denying the information about the murder:

“ATTENTION! Check the facts – information about a murder in Kraków is FAKE NEWS! A rumor is spreading online about an alleged triple murder in Kraków committed by a person of Ukrainian nationality. We categorically deny this information! No such incident occurred either in Kraków or in the Lesser Poland Voivodeship”.

The police urged citizens not to trust unverified sources, to check information through official channels, and to refrain from spreading fake news.

Why does this matter?

This case is an example of how quickly disinformation can spread on social media, causing panic and hostility. False reports about crimes, especially those allegedly involving foreigners, can reinforce stereotypes and provoke discrimination. According to a report by the Public Opinion Research Center (CBOS), in 2025, 38% of Poles expressed antipathy toward Ukrainians – 8 percentage points more than in 2024 and 21 points more than in 2023.

The story about a “triple murder” in Kraków is a fake that was not confirmed by any official sources.

Manipulation: Russian propagandists manipulated a video segment from a weather TV program

Russian propaganda Telegram channels are spreading a video in which ABC meteorologist Mike Rizzo says, “This is not a storm, this is a little spinach”, as a screenshot of a news story featuring Volodymyr Zelenskyi appears on the studio screen, seemingly by mistake. This is not what happened. In January 2024, while the host was commenting on a storm approaching the region, an image of spinach did appear on the screen during the broadcast, prompting Rizzo to joke, “This is not a storm, this is a little spinach”. Propagandists distorted the context by replacing the spinach image with a fake BBC news graphic: in the manipulated version, an image of Volodymyr Zelenskyi appeared alongside a fabricated claim that the Pentagon had stated that over a million Ukrainian soldiers had died.

However, this claim is fake. In November 2024, in an interview with Kyodo News, Volodymyr Zelenskyi said that the number of Ukrainian soldiers killed at the front since the beginning of the full-scale invasion was far below 80,000.

“Some recently in the American press reported that 80 thousand Ukrainians had died. But I want to tell you, no, it’s less. Much less”, the President said.

According to estimates by The Economist, based on leaks from Western intelligence agencies, by the end of November 2024, at least 60,000–100,000 Ukrainian military personnel could have died during Russia's full-scale invasion. Around 400,000 others were injured, making them unable to continue serving in the army.

Russian agitprop manipulates facts and distorts contexts to sow distrust in Ukrainian leaders and reduce international support for Ukraine. Such manipulations create an atmosphere of doubt and uncertainty, which, in turn, can weaken Ukraine’s ability to effectively counter Russia’s aggression.

Fake: Video claims Ukrainian sniper killed pensioners near Pokrovsk

Russian anonymous Telegram channels are actively circulating a video claiming that a sniper from Ukraine’s Main Intelligence Directorate killed several pensioners near Pokrovsk, presenting it as evidence of yet another ‘war crime’ by Ukrainian forces. In reality, this is a complete fake.

The Center for Strategic Communications and Information Security reports that propagandists distorted the context, passing off Russian soldiers as ‘pensioners’ in their disinformation. The original video, published by Ukraine’s Main Intelligence Directorate, shows a person in camouflage carrying a water bottle in their left hand and a rifle in their right. This clearly identifies the individual as a combatant, not a civilian, as Russian propaganda falsely claims.

This is not the first instance of such disinformation. Since early January, Russian propaganda resources have been spreading dozens of synchronized fake reports, aiming to manipulate public opinion. In the first half of January alone, over 600 fake messages were detected on Telegram, alleging ‘murders’ of civilians, prisoners, and the wounded, allegedly committed by Ukrainian forces.  

These disinformation campaigns are designed to undermine trust in Ukrainian military personnel and distort the reality of the war. A central tactic of Kremlin propaganda is to depict Ukraine’s armed forces as ‘criminals’ committing acts of violence against civilians. Propagandists seek to reverse the narrative, presenting Ukraine as the aggressor while framing their own war crimes, such as killing civilians and destroying infrastructure, as defensive actions.  

By spreading such fake stories, Russian propaganda also aims to weaken international support for Ukraine and justify their military operations by shifting the blame for violence onto Ukraine.

Manipulation: Russian propagandists manipulate a Ukrainian TV program segment

Russian propaganda Telegram channels are spreading a video claiming that a Ukrainian soldier allegedly took revenge on a police officer who had previously mobilized him. According to the video, the soldier reportedly gained the officer's trust, invited him for a drink, spiked his drink with sleeping pills, and assaulted him once the officer fell asleep.  

This claim is entirely false. Propagandists manipulated a segment from the Ukrainian TV program Ukraine Today, which aired on its YouTube channel on January 10, 2025. In the original segment, host Kateryna Nesterenko mentions that such a story had been circulating on TikTok. However, she explicitly states that apart from TikTok and certain Telegram channels, there is no evidence to corroborate the story and does not confirm its authenticity.  

Propagandists altered the video by cutting out the part where the host questions the story's credibility, presenting it as if the incident was real.  

Further investigation revealed that this manipulated video was circulated exclusively within the pro-Russian segment of the internet, with at least 14 propaganda Telegram channels sharing it.  

This manipulation is part of an ongoing effort by Russian propagandists to discredit the mobilization process in Ukraine. Similar tactics have been used before, such as the debunked claim that three employees of a Territorial Recruitment and Social Support Center were found dead in Odesa.

Fake: Claims that Ukrainian hackers spread fake news via WhatsApp to Americans about Ukrainian military successes

Russian propaganda Telegram channels are circulating a purported NBC News clip claiming that Ukrainian hackers allegedly hacked WhatsApp and began sending mass fake news to Americans about Ukraine's military successes and minimal losses in the Russia-Ukraine war. Propagandists mockingly comment that “when victory doesn’t happen in reality, Ukrainians decided to bring it closer in WhatsApp”.  

In reality, this information is false, as reported by VoxCheck. Using Google’s reverse image search, it was discovered that the video features Gadi Schwartz, a reporter from NBC News' Stay Tuned NOW program. The fake news used a snippet from a segment titled The Future of Everything published on NBC News’ official YouTube channel on December 4, 2024.  

While the original NBC News segment did mention a large-scale hacking attack on nearly all major U.S. communications companies, the report attributed the attacks to China, not Ukraine.  

Additionally, the segment includes an interview snippet with Chris Krebs, the former director of the U.S. Cybersecurity and Infrastructure Security Agency. In the actual NBC News report, Krebs comments on China's cyberattacks on U.S. telecommunications systems, not on any actions by Ukraine.  

This is not the first instance of Russian disinformation regarding NBC News. Previously, similar claims were made, alleging that an American official admitted on NBC News that U.S. intelligence data on Russia was mostly fabricated - a claim also proven false.

Fake: Eight cottages of Ukrainian generals allegedly burned in Los Angeles

Propagandists are spreading claims in anonymous Telegram channels and media outlets that eight cottages belonging to Ukrainian generals, worth a combined $90 million, burned down in the Los Angeles fires. This is a fake. The fabrication was accompanied by a false video and a quote attributed to the commander of Ukraine’s Ground Forces, Mykhailo Drapatyi, and imitated the style of the United24 media outlet. It also claimed that the generals had purchased the properties with funds provided to Ukraine by Western partners.

Analysts from StopFake emphasized that neither United24 nor other reputable outlets had published such information. Russian media used real footage of a fire in Los Angeles, manipulating it to create a false narrative. General Mykhailo Drapatyi never made the statements attributed to him by the propagandists.

These types of information attacks are aimed at discrediting the Ukrainian military leadership by spreading the notion of corruption within the Ukrainian Armed Forces. Additionally, the propaganda is intended to weaken trust in Ukraine among Western partners who provide military and financial assistance. The fake also emphasized the ‘injustice’ to further influence the emotions of the audience and generate a negative perception of Ukrainian leadership. Such information operations are part of Russia's broader strategy to manipulate international opinion and undermine support for Ukraine.

Fake: Video of a Ukrainian schoolgirl in New York complaining about Black classmates and claiming U.S. taxpayers should cover her tuition

Russian media outlets are circulating a video in which a Ukrainian schoolgirl, allegedly living in New York, complains about her new school. The girl supposedly claims that she had difficulty adjusting to the “large number of Black classmates”, who allegedly offered her drugs, and that lessons were frequently canceled due to shooting threats. At the end of the video, she claims she transferred from a public to a private school, noting that while tuition is expensive, it is covered by American taxpayers - a situation she considers entirely fair.

In reality, the video was fabricated by propagandists, according to the StopFake project.

In late December 2024, Voice of America released a short video story about Ukrainian schoolchildren who relocated to New York due to the full-scale war and had to adapt to a new learning environment.

The girl featured in the video is named Sofiia Holinei, and she attends St. George’s Academy, a private school. However, the segment in which she allegedly complains about Black classmates, drug dealers, and shootings and claims her education should be free was created using artificial intelligence – such remarks are absent from the original story. Propagandists manipulated the audio to fabricate this segment and supplemented it with stock footage that is not part of the authentic video.

Previously, we debunked a fake claim that Ukrainian children were allegedly being beaten in Polish schools for speaking Ukrainian.

Fake: False claim that Putin announced troop withdrawal from Ukraine and agreed to pay reparations

A video circulating online alleges that Russian President Vladimir Putin announced the end of the so-called ‘special military operation’ (SVO) and the withdrawal of Russian troops from Ukraine. In the video, Putin reportedly claims to have achieved his objectives and states that the West has provided Russia with all necessary security guarantees. The posts also claim that Putin has agreed to pay reparations to Ukraine.

However, this video is disinformation, as reported by the VoxCheck project. A reverse search on Yandex revealed that the footage used in the video was taken from Putin’s speech at a ceremonial meeting marking the 220th anniversary of the Russian Ministry of Justice, held on September 20, 2022. In that address, Putin discussed the operations of prisons, the establishment of correctional facilities for convicted individuals, and plans for new prison placement schemes across the country.  

In the fake video, it is noticeable that Putin's lip movements do not match the audio, indicating the use of artificial intelligence-generated dubbing. The Hive Moderation tool, designed to detect AI-generated content, confirmed with 91% probability that the audio was AI-generated.  

Previously, similar fake statements attributed to Putin have been debunked, including one claiming that the number of Ukrainians living in temporarily occupied territories and Russia equals the population remaining in Ukraine.  

This is yet another example of misinformation aimed at misleading audiences about Russia's stance on the war in Ukraine.

Fake: Video of women in Kyiv allegedly protesting against priority lists for prisoner exchange

Russian propaganda sources are spreading a fake video, allegedly from UNITED 24, claiming that women in Kyiv protested to demand the cancellation of priority exchange lists for fighters from “nationalist battalions”, which allegedly keep other servicemen in captivity longer. The video also includes footage of MP Oleksandr Kunytskyi, who supposedly advised the women to “return to their women’s affairs” and raise children, and to “leave the decision on the exchange to the leadership of the Armed Forces of Ukraine”.

However, UNITED 24 did not publish such a video, and Oleksandr Kunytskyi did not make such statements. Once again, propagandists created a fake video to spread Russian narratives by using authoritative sources.

Russian propaganda systematically uses disinformation aimed at manipulating public opinion and accusing Ukraine, for example, of disrupting or delaying negotiations on the exchange of prisoners of war. In addition, it discredits representatives of the Ukrainian government in order to undermine trust in the political elite among Ukrainians and create unstable socio-political sentiments.

Fake: Fake video that Ukraine “destroyed” more cultural monuments than the radical Taliban movement in Afghanistan

A video is being circulated online, purporting to be from UNESCO. It claims that Ukraine has destroyed more monuments than the Taliban in Afghanistan: 5,400 monuments to cultural figures and cultural heritage sites over two years. The Director of the UNESCO World Heritage Center, Mechtild Rössler, allegedly stated in this regard: “Such an attitude towards monuments and history is typical of terrorist regimes, but not of European civilization”.

However, UNESCO did not publish such a video on its official resources, Mechtild Rössler is no longer the director of the World Heritage Center, and she made no such statement. The video was assembled from unrelated photos taken from open sources.

A reverse image search confirmed that the propagandists used open-source photos for the fake video. In particular, the photos of Taliban militants were published on the official website of the AFP news agency in August 2024.

The Taliban is a political and military movement whose stated goal is to free Afghanistan from foreign military presence. It draws members from many of the country’s tribes and peoples, who hold differing views, including religious ones.

The movement’s ideology is rooted in Ash’arism, a theological school within the Sunni tradition of Islam. The Taliban’s political program calls for organizing life in Afghanistan according to the norms of Islamic law, Sharia, in its radically traditionalist interpretations. According to the Taliban, anyone who contradicts this ideology is subject to persecution.

Their rule was marked by religious intolerance towards non-believers and by cruelty: thieves, for example, had their hands cut off.

Fake: Fake video of Euronews story with mocking photos of Zelenskyi during his meeting in Paris

A supposed excerpt from a Euronews story about Volodymyr Zelenskyi’s visit to Paris is being circulated online. The clip allegedly showed mocking photos of the Ukrainian president, posted on Twitter under the hashtag #MerryChristmasEuronews2024.

But this is a fake video. The fakers apparently edited an official Euronews video: the outlet did publish a report on Volodymyr Zelenskyi’s visit to Paris, and the frames and captions in that genuine video match those in the fake one, except for the Twitter feed.

Euronews does indeed display a Twitter feed during live broadcasts. However, the channel does not show posts from anonymous users; instead, it features posts from Euronews itself, journalists, politicians, organizations, and so on. In addition, there is no post on Twitter with the hashtag #MerryChristmasEuronews2024, and Euronews did not report any such campaign on its official resources.

Let us remind you that on December 7, 2024, Volodymyr Zelenskyi met with Donald Trump in Paris. It was their first meeting after Trump’s victory in the US presidential election on November 5. French President Emmanuel Macron also took part in the meeting, which lasted 35 minutes; it was Macron who invited both politicians to the opening of the restored Notre-Dame de Paris Cathedral.

According to Reuters, during the half-hour conversation Zelenskyi explained to Trump Ukraine’s need for security guarantees to end the full-scale war with Russia. The conversation, however, did not touch on specific details of any vision for peace.

Read more: Propagandists about the meetings between Zelenskyi, Trump, and Orban.

Fake: Fake that Olena Zelenska called on Ukrainians to “be less Ukrainian”

A quote is circulating online that the First Lady of Ukraine, Olena Zelenska, allegedly said during the 2021 forum Ukraine 30. Healthy Ukraine: “Some Ukrainians are too Ukrainian. We need to become a little less Ukrainian, and then peace will come to society”.

However, Olena Zelenska said no such thing in her speech at the Ukraine 30. Healthy Ukraine forum; she spoke about the reform of the school nutrition system. We also found no trace of this quote in her other speeches, and it has not appeared in reliable Ukrainian- or English-language media.

The earliest mention of these words appears in a post by an X (Twitter) user with the nickname Mantelepa, published back in June 2020, that is, before the forum took place. Neither this post nor later ones included any video or link documenting the quote.

Fake: Fake video that American TV show hosts “laughed” at story about Ukrainian Armed Forces losses

A snippet of a story on the American channel KMAX 31 is being shared online. In it, the hosts of Good Day allegedly laughed after a report about the colossal losses of the Ukrainian Armed Forces.

But this is a fake video. In the original story, the host laughed because of a mistake during the weather forecast. The fakers edited the video.

A search for the channel and show name revealed that the full title of the show is Good Day Sacramento, and it is indeed broadcast on KMAX 31. The official Good Day Sacramento YouTube channel contains the very segment that the propagandists used for the fake video.

The original video was published back in December 2019. In it, the host laughs not at supposedly colossal losses of the Ukrainian Armed Forces, but at a slip of the tongue during the weather forecast: instead of “visibility”, she said “disability”.

Fake: Fake video of Ukrainian soldier claiming Russian captivity is “the best choice”

A video is being actively shared on TikTok, in which a Ukrainian serviceman allegedly talks about significant losses, criticizes the command, and claims that Russian captivity is the “best choice”. However, it is fake.

This is reported by the Center for Countering Disinformation at the National Security and Defense Council. Its experts established that the video was created with deepfake technology: the analysis revealed the character’s unnatural articulation, and other videos on the account confirm that its content was created for manipulation.

The main goal of this fake is to undermine trust in the Armed Forces of Ukraine, create panic among citizens, and demoralize the military. TikTok was chosen because of its popularity and the speed with which information spreads there. Such deepfakes can mislead viewers who do not examine them closely. The Center is working with the TikTok administration to reduce the amount of disinformation spread by Russian propagandists on the platform.

Fake: Fake video from an American show that allegedly mocks Ukrainian Armed Forces losses

Russian propagandists are spreading a video on anonymous Telegram channels and media outlets in which journalists of the American show Good Day Sacramento allegedly mock the combat losses of the Ukrainian Armed Forces. However, it is a fake.

This is reported by the Center for Strategic Communications and Information Security and by Ukrinform. Ukrinform experts established that the original video dates from December 2019, as confirmed by the New Year decorations in the studio, identical to those used in the show at the time; modern episodes use completely different decorations. In addition, the fake video lacks the large Grinch mask seen in the real broadcast. The show itself has a purely entertainment format: the hosts joke and cover local events rather than discussing political or military topics.

Such actions once again demonstrate Russia's methods of information warfare, which include distorting reality to achieve propaganda goals. This fake news is aimed at spreading false messages about the Armed Forces of Ukraine, lowering the morale of the Ukrainian military, and demoralizing society.

Fake: Lies about ABC News reporting one million dead and hundreds of thousands wounded Ukrainian soldiers

Russians are spreading a video with the logo of the American TV channel ABC, claiming that Ukraine allegedly lost about one million servicemen and had hundreds of thousands of people disabled due to injuries sustained in the war.

However, this video has been edited, reports the StopFake project. Using Google's reverse search function, it was possible to find the original video that was used to create this fake. It was an ABC News video from March 30, 2023 about the assistance of American non-profit organizations to Ukrainian veterans who lost limbs in the war. It talks about the work of the charity Kind Deeds, which provides Ukrainian veterans with prosthetics and organizes rehabilitation in the United States. The propagandists cut out individual frames from the original video and, using artificial intelligence, completely forged the audio track, mimicking the voice of the program's host. The original video report does not mention any data on the number of Ukrainian soldiers killed.

On December 8, 2024, Ukrainian President Volodymyr Zelenskyi reported that Ukraine had lost about 43,000 soldiers killed since the start of the full-scale war. As for the number of people with disabilities, there are no up-to-date official statistics available today. However, in mid-2023, the American publication The Wall Street Journal reported that over one and a half years of war, about 50,000 Ukrainians had become disabled due to amputations.

Earlier, we debunked the claim that the irreparable losses of the Ukrainian Armed Forces allegedly had already exceeded 500,000 people.

Fake: The lie that an unknown person drew a marker opposite the Verkhovna Rada building to guide a Russian strike

Propaganda sources claim that an unknown person allegedly left a marker near the Verkhovna Rada building to guide a Russian strike, accompanied by the inscription: “Putin, here”. As “proof”, propagandists have shared a video purportedly showing the individual who marked the spot.

In reality, this is another fake by the Russians, the VoxCheck project writes. Since the start of the full-scale war, the section of Mariinskyi Park leading to the Verkhovna Rada has been closed off, so a regular citizen could not have accessed the area to record such a video. Since February 24, 2022, only certain categories of individuals, such as government officials and parliamentary staff, have had access to the Verkhovna Rada’s premises; someone among them could theoretically have recorded such a video.

However, two facts indicate that this is a fake. Firstly, the public transport in the video runs toward the government quarter, which is closed to traffic today. Secondly, the guards in the video are wearing medical masks, likely because of the COVID-19 pandemic, which indicates that the footage was recorded before the full-scale invasion.

The markers themselves began to appear at the beginning of the full-scale war. Some of them turned out to be ordinary markers on buildings that existed before the invasion. Others were deliberately created by Russians or their supporters to destabilize Ukrainian society and sow chaos. That is, it was an information and psychological special operation (IPSO). In reality, Russian strikes are guided by coordinates, not by such markers.