Misinformation, meaning the distribution of false, inaccurate or otherwise misleading information, has been a prominent and ubiquitous feature of the Israel–Hamas war.[1] Much of the content has been viral in nature, with tens of millions of posts in circulation on social media. A variety of sources, including government officials, media outlets, and social media influencers across different countries, have contributed to the spread of these inaccuracies.[2]
The New York Times described the start of the Israel–Hamas war as releasing a "deluge of online propaganda and disinformation" that was "larger than anything seen before". It described the conflict as "fast becoming a world war online" and stated that Russia, China, Iran and its proxies had used state media and covert influence campaigns on social media networks to support Hamas, undermine Israel, criticize the United States and cause unrest.[3] James Rubin of the U.S. State Department's Global Engagement Center described coverage of the conflict as being swept up in "an undeclared information war with authoritarian countries".[3]
During the conflict, the Israeli government and Israeli cyber companies have deployed AI tools and bot farms to spread disinformation and graphic, emotionally charged and false propaganda, intended to dehumanize Palestinians, sow division among supporters of Palestine by targeting Black lawmakers, and exert pressure on politicians to support Israel's actions.[4][5][6] The Intercept reported that: "At the center of Israel’s information warfare campaign is a tactical mission to dehumanize Palestinians and to flood the public discourse with a stream of false, unsubstantiated, and unverifiable allegations."[6] One such covert campaign was commissioned by Israel's Ministry of Diaspora Affairs. The ministry allocated about $2 million to the operation and used Stoic, a political marketing firm based in Tel Aviv, to carry it out, according to officials and documents reviewed by the New York Times.[4] The campaign was started after the October 7 attack and remained active on X (formerly Twitter) at the time of the New York Times report in June 2024. At its peak, the campaign used hundreds of fake accounts posing as Americans on X, Facebook and Instagram to post pro-Israel comments, focusing on U.S. lawmakers, particularly those who are Black and from the Democratic Party, including Hakeem Jeffries, the House minority leader from New York, and Raphael Warnock, Senator from Georgia. ChatGPT was deployed to generate many of the posts. The campaign also involved the creation of three fake English-language news sites featuring pro-Israel articles.[4] In November 2024, a UN-published committee report noted that Western social media companies disproportionately removed content showing solidarity with the Palestinian people relative to content promoting violence against Palestinians.[7]
On 7 October 2023, deputy head of Hamas's political bureau, Saleh al-Arouri (based in Lebanon), claimed the Qassam Brigades had "captured senior officers from the occupation army" in the 7 October attacks.[8] A rumour circulated on social media that one of these officers was Major General Nimrod Aloni, the commander of the Israeli Depth Corps, based on a photograph of a man who resembled him being detained by unidentified armed men.[8] A Persian-language post by the Israel Defense Forces (IDF) quoted a post about his capture from Tasnim News Agency and wrote "Tasnim: Distributors of fake news of IRGC", without either denying or confirming the capture of Aloni.[9][10] Aloni was subsequently seen on 8 October attending a meeting of top Israeli military officials.[11]
Islamic Republic of Iran Broadcasting published images of commanders in Nagorno-Karabakh being captured by the Azerbaijani army in September 2023, presenting them as Israeli commanders captured by Hamas.[12][13][14]
A video of a CNN broadcast from near the Israel-Gaza border[15] with audio added to suggest the network had faked an attack went viral on social media.[16][17]
Social media accounts based in India have spread pro-Israeli disinformation, with influencers misrepresenting videos as purportedly showing schoolgirls taken as sex slaves or Hamas kidnapping a Jewish baby. Fact-checker Pratik Sinha said the "Indian right-wing has made India the disinformation capital of the world".[18] The trend forms part of a wider pattern of fake news in India with an Islamophobic slant, including disinformation on Palestinians coming from the BJP IT Cell, a vehicle of India's governing party, the BJP.[18]
An Israeli boy and his sisters killed during Hamas's attack on Kibbutz Nir Oz on 7 October have been falsely accused of being "crisis actors".[19]
A photo shared by Israel showing the charred corpse of a baby was claimed by many on social media to have been AI-generated, based on AI detector "AI or Not". The claim was repeated by Al Jazeera Arabic. The company behind "AI or Not" later said that the result was a false positive caused by the image's compression and blurred name tag; several experts who looked at the photo found it to be genuine. Other social media users claimed, based on a 4chan post, that the image had been altered from a similar photo of a dog, though researcher Tina Nikoukhah found that the dog picture was likely "falsified using generative methods".[20][21][22]
The community volunteer paramedic and rescue group ZAKA began collecting bodies immediately after the Hamas attacks, while the IDF avoided assigning Home Front Command soldiers trained to carefully retrieve and document human remains in post-terrorism situations.[23] Home Front Command soldiers and volunteers from other organizations accused ZAKA volunteers of spreading horror stories of atrocities that did not happen for self-promotion, as well as "releasing sensitive and graphic photos to shock people into donating" and other unprofessional behavior.[23][24] The Times of Israel reported that "A volunteer from a different organization told Haaretz that ZAKA also double-wrapped bodies in its own bags after they had already been placed in IDF or other organizations’ bags, creating a mess at headquarters when bodies were placed in the wrong sections."[24] Haaretz reported this double-bagging was done for self-promotion purposes: "At the attack scenes, the question was not only what to photograph but also what exactly to show. In some cases, volunteers from Zaka were seen wrapping bodies already wrapped in IDF bags. The new bag prominently displayed the Zaka logo."[23]
On 22 May 2024, the AP reported that two ZAKA volunteers, including its commander Chaim Otmazgin, had made false statements about sexual violence and rape on October 7.[25] Otmazgin claimed he had found a raped woman because her pants had been pulled down below her waist, and showed photos to the AP as part of his testimony. However, this had been caused by her body being moved by a group of soldiers.[25] Yossi Landau, another ZAKA volunteer, claimed he found a pregnant woman killed with a fetus removed from her womb. This was also proven to be false.[25] When challenged, Landau offered to show Al Jazeera a photo on his phone of the stabbed foetus, but was filmed admitting he was unable to do so.[26]
Members of ZAKA including Otmazgin and Simcha Greiniman claimed to have photos depicting genital mutilation, including nails and knives inserted into the groin and genitals.[27] These were shared with the fact-finding mission of Pramila Patten, the UN Special Representative of the Secretary-General on Sexual Violence in Conflict, as well as with NBC News. Both concluded that these claims could not be verified based on the provided photos.[28]
During the early stages of the war, a video described as "Hamas executes people by throwing them off a roof of a building!" was shared widely on social media.[29] But the video did not depict Hamas or any other group based in Palestine; it was a misrepresented 2015 video of ISIS in Iraq.[29] A July 2015 report from Al Arabiya included identical images, originally shared by ISIS, showing the execution of four gay men by ISIS in Fallujah, Iraq.[29]
In the aftermath of the initial Hamas assault, witnesses including Israeli soldiers and members of the Israeli first responder organization ZAKA said on the French-Israeli TV channel i24NEWS that they had seen the bodies of beheaded infants at the site of the Kfar Aza massacre.[30][31][32] During Antony Blinken's visit to Israel, he said he was shown photos of the massacre by Hamas of Israeli civilians and soldiers, and specifically that he saw beheaded IDF soldiers.[33] U.S. President Biden separately said that he had seen photographic evidence of terrorists beheading children, but the White House later clarified that Biden was alluding to news reports of beheadings, which have not contained or referred to photographic evidence.[34] NBC News called reports of "40 beheaded babies" unverified allegations,[34] adding that they appeared "to have originated from Israeli soldiers and people affiliated with the Israel Defense Force" and that "an Israeli official told CNN the government had not confirmed claims of the beheadings".[34] The allegation mainly "stemmed from a viral Israeli news broadcast clip", and the main X (Twitter) accounts propagating the claims were i24NEWS and Israel's official account, even though IDF spokesperson Doron Spielman told NBC News that he could not confirm i24NEWS's report.[34] As of 12 October, CNN had extensively reviewed online media content to verify Hamas-related atrocities but found no evidence to support claims of decapitated children.[35]
In a speech to the Republican Jewish Coalition on 28 October, Eli Beer, founder of Israeli volunteer EMS group United Hatzalah, claimed that Hamas had burned a baby alive in an oven.[36] He attributed the claim to United Hatzalah volunteers; one of them, Asher Moskowitz, also made the claim publicly.[37] It was repeated by journalist Dovid Efune, commentator John Podhoretz and others, in tweets seen over 10 million times. Israeli journalists and police found no evidence for the claim, and a representative of ZAKA, a first responder organization, said the claim was "false".[36][38][39]
United Nations (UN) Secretary-General António Guterres has accused Israel of spreading misinformation about the war in Gaza in an attempt to lower the credibility of the UN.[40]
"I've heard the same source many times saying that I never attacked Hamas, that I never condemned Hamas, that I am a supporter of Hamas. I asked for a statistic to be made by our colleagues. I have condemned Hamas 102 times, 51 of them in formal speeches. The others in different social platforms. So, I mean, the truth in the end always wins." United Nations Director-General Antonio Guterres.[40]
Viral claims that the IDF had destroyed Gaza's Church of Saint Porphyrius on 9 October were debunked by the church.[41][42] Subsequently, an Israeli airstrike hit the premises of the church on 19 October, killing 18.[43]
In October 2023, disinformation experts uncovered an account on X that published false reports about Qatar threatening to cut off its gas exports if Israel continued to bombard the Gaza Strip.[44]
Pro-Hamas accounts have misrepresented footage from the Syrian civil war as showing children being killed in Gaza.[45]
In February 2024, Israel's official X account posted a 30-second video listing the humanitarian aid it claimed to have provided for Gaza. The video included March 2022 footage of a camp in Moldova for Ukrainian refugees. The same account later deleted the video, and stated that "the photo was for illustrative purposes and we should have stated that in the video."[46][47]
Following the Al-Ahli Arab Hospital explosion, an X account claiming to be an Al Jazeera journalist said they had video of a "Hamas missile landing in the hospital". Al Jazeera subsequently clarified that they were not associated with the account, and it was later removed.[48] Another X account that promoted pro-Kremlin misinformation claimed The Wall Street Journal had reported that the explosion was caused by a Mark 84 bomb; The Wall Street Journal had not published such a report.[49][50]
In November 2023, a video appearing to show a nurse at the Al Shifa hospital went viral. She claimed that she was unable to treat patients because Hamas had taken over the entire hospital and was stealing fuel and medicine, with the video ending with her pleading for all Palestinians to leave Al Shifa. Many were quick to point out problems with the video: none of the documented doctors and nurses at the hospital recognized the woman depicted, and she reportedly spoke with an Israeli accent and could not speak clear Arabic.[51][52] Additionally, according to Esther Chan from RMIT FactLab CrossCheck, an analysis by open-source investigators had determined that the video was likely doctored to artificially include fake sounds of explosions.[53] The video was originally posted on the Arabic Twitter account of Israel's Ministry of Foreign Affairs and was boosted by Edward Haïm Cohen Halala, who has reported ties to the Israeli government[citation needed] and a popular social media presence with an Arabic-speaking following.[54][55][56]
Videos of atrocities in Gaza have been dismissed as acting, with people falsely accused of being "crisis actors". A derogatory and dismissive phrase, "Pallywood", is often used; the term is based on a fringe theory that Palestinians are falsifying evidence of suffering. The fact-checking organisation Logically found that mention of the term has increased since October 7, particularly in Israel, the United States, and India.[57][58] Material falsely used to "prove" Palestinians were crisis actors includes a video of body bags that appeared to be moving, which was in fact footage of a 2013 protest in Egypt.[58]
Saleh Aljafarawi, a Palestinian blogger and singer who lives in Gaza, was falsely accused by several pro-Israeli figures, including the country's official Twitter account, of being a "crisis actor".[59][45][60] This included a video of a Palestinian teenager wounded in a raid on Tulkarm in July 2023, who was falsely presented as "Saleh in a hospital days before October 27".[59]
In November 2023, Israeli diplomat Ofir Gendelman circulated a clip from a Lebanese short film, claiming that it was proof that Palestinians were faking videos and calling it an example of "Pallywood".[61][62] According to The Daily Beast, "Gendelman is a repeat offender when it comes to peddling misinformation about Palestinians."[61] The previous week, Gendelman peddled IDF training videos as war footage, and in 2021, he was found by international media to have misrepresented 2018 footage from Syria as current footage from Gaza.[61]
A video showing a Palestinian child killed during an October 11 Israeli airstrike on Zeitoun has been falsely claimed to be staged using a doll. The claim has been promoted by official Israeli government social media accounts, including the X accounts of Israel's embassies in France and Austria, as well as pro-Israel and anti-Hamas accounts.[19][57]
In early December, The Jerusalem Post published an article falsely claiming that a dead five-month-old Palestinian baby from Gaza was "a doll". The Jerusalem Post later deleted the article and removed any mention of it from its social media pages. Without referring to the article directly, it published a statement saying that "The article in question did not meet our editorial standards and was thus removed".[63][64][62]
On 25 March 2024, Al Jazeera took down its video of a woman named Jamila al-Hissi who said that Israeli soldiers had "raped women, kidnapped women, executed women, and pulled dead bodies from under the rubble to unleash their dogs on them" at Al-Shifa hospital during the latest siege of the hospital. Former managing director of Al Jazeera, Yasser Abu Hilalah, wrote on X, "Hamas investigations revealed that the story of the rape of women in Shifa Hospital was fabricated." Abu Hilalah reported that al-Hissi "justified her exaggeration and incorrect talk by saying that the goal was to arouse the nation's fervor and brotherhood".[65]
Shortly after October 7, Cochav Elkayam-Levy, a legal expert from the Davis Institute for International Relations at Hebrew University of Jerusalem, a former Israeli government lawyer, a former member of the military spokesperson's unit and a close associate of Prime Minister Netanyahu, established the "Civil Commission on October 7th Crimes by Hamas against Women and Children", which aimed to give voice to the victims and their families.[66] In June 2024, The Times reported that Elkayam-Levy had spread a "debunked story" about a "pregnant woman and her slaughtered foetus", while also circulating "photographs of murdered female soldiers that turned out to be images of Kurdish fighters in Syria."[66] The Times added: "Elkayam-Levy has nonetheless remained the most prominent public voice on the sexual violence of October 7, winning the country’s highest civilian honour, the Israel Prize, in April."[66]
The Gaza Health Ministry's (GHM) casualty figures are considered to be reliable.[67][68]
Abraham Wyner, a professor of statistics at the University of Pennsylvania, wrote in Tablet that the GHM casualty figures were "faked".[69] Wyner's article was analyzed by professor Joshua Loftus of the London School of Economics, who concluded Wyner's article was "one of the worst abuses of statistics I’ve ever seen".[70] Columbia professor Les Roberts said that GHM numbers were accurate and probably even an underestimate.[71] Wyner's main argument was that from 26 October to 10 November, the number of deaths per day was about 270 with "strikingly little variation".[69] Caltech statistician Lior Pachter responded that Wyner had cherry-picked a particular period, outside of which the variance was higher, and that even within Wyner's chosen window the daily deaths had a standard deviation of 42.25 and a variance of 1,785.[72][73] Wyner also said that there were peculiarities in the data, such as a strong negative correlation between the daily death counts of men and women.[69] In response, James Joyner, a professor at the Marine Corps University, quoted an analysis arguing that the GHM updates total deaths immediately but lags in updating the proportion of women and children, making such time correlations "meaningless".[74]
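An illustrative dispersion check, using only the figures quoted in this dispute (a mean of roughly 270 reported deaths per day from 26 October to 10 November, the standard deviation of 42.25, and the Poisson baseline invoked by Pachter), runs as follows:

\[ \text{Poisson baseline: } \operatorname{Var}(X) = \lambda \approx 270, \qquad \sigma_{\text{Poisson}} = \sqrt{270} \approx 16.4 \]
\[ \text{Reported counts: } s = 42.25 \;\Rightarrow\; s^2 \approx 1{,}785 \approx 6.6\,\lambda \]

On these numbers, the day-to-day variation in the reported counts is several times larger than a simple Poisson model would predict, which is the core of Pachter's rebuttal to the claim of "strikingly little variation".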
Israel has released several pieces of incorrect or disputed information, leading to questions about its credibility.[75] On claims linking Palestinian militants to sexual assaults on 7 October, The Times remarked that investigations have been hampered by "false and misleading information" spread by "senior [Israeli] political figures and government-linked civil activists".[76] A UN report on these allegations stated that Israeli authorities had been unable to produce the evidence politicians said existed.[76]
Writing for openDemocracy, British academic Paul Rogers stated, "Israel must maintain the pretence of an orderly war with few civilians killed. Netanyahu's government is lying, but it would be naive to expect otherwise. Lying is what many powerful states routinely do, particularly in wartime."[77] In The Intercept, investigative journalist Jeremy Scahill wrote, "At the center of Israel's information warfare campaign is a tactical mission to dehumanize Palestinians and to flood the public discourse with a stream of false, unsubstantiated, and unverifiable allegations."[78]
On multiple occasions, analyses have found issues with IDF claims. In October 2023, a Financial Times analysis on a bombing of Palestinians evacuating Gaza City found that "most explanations aside from an Israeli strike" could be ruled out, though the IDF blamed the attack on Palestinian militants.[79][80] In November 2023, analysis by the BBC found that video released by the Israeli military following the Al-Shifa Hospital siege had been edited despite IDF claims to the contrary.[81]
In December 2023, an analysis by The Washington Post confirmed reports by Human Rights Watch that Israel had used white phosphorus in an attack on Lebanon,[82][83] directly contradicting the IDF.[84] In January 2024, after an Israeli airstrike killed journalist Hamza Dahdouh, the IDF called Dahdouh a "suspect" who was hit while driving with a "terrorist"; however, The Washington Post found "no indications that either man was operating as anything other than a journalist that day".[85]
After reports spread that a mother and daughter had been killed by Israeli snipers in December 2023 in a church where a number of Palestinian Christians were sheltering, the Israeli army denied targeting the compound, claiming instead that there was Hamas activity in its vicinity and that Israeli soldiers had shot back.[86] Catholic officials and British Member of Parliament Layla Moran, who maintained contact with refugees in the church, stated, on the contrary, that no Palestinian belligerents were in the area and that the two women had been killed by the Israeli army, who were the ones preventing the refugees from leaving.[87][88][89]
In November 2023, a video posted by the IDF showed Daniel Hagari inside the Al-Rantisi Children's Hospital, where he claimed that the IDF had found Hamas weapons and technology, as well as a "list of terrorist names" in Arabic with the title "Operation Al-Aqsa Flood", showing each agent's rota for guarding the hostages. However, a translation of the document showed that it contained no names but instead a calendar of the days of the week.[90] After the veracity of the claim was questioned, an Israeli spokesperson backtracked; CNN removed the segment but did not provide an editors' note acknowledging the change or the dispute over the initial video.[91]
Regarding the February 2024 Flour massacre, a CNN investigation said that "Mark Regev, the Israeli prime minister’s special adviser, initially told CNN that Israeli forces had not been involved." However, the IDF said soon after that "soldiers had not fired directly on Palestinians seeking aid, but rather fired 'warning shots' in the air."[92] Al Jazeera reported evidence of a "large number" of gunshot wounds from a United Nations team, medical professionals and witnesses.[93] The New York Times also reported that witness accounts differed from the Israeli military's account, which described extensive shooting after thousands massed around aid trucks.[94] IDF drone footage edited out the events that caused the crowds to disperse, and the IDF rejected a CNN request for the full unedited footage.[95] The CNN investigation cast doubt on other IDF claims, such as the timing of the shooting.[95] Several days after the attack, a senior crisis response adviser at Amnesty International stated, "There is concrete evidence that contradicts whatever statements are being made by the Israeli authorities".[96]
After Israel bombed a tent camp in Rafah in an area it had designated as a "safe zone" for civilians, killing 45 people, Israeli officials initially told their American counterparts that they believed their airstrike had ignited a nearby fuel tank, creating a large fire.[97] In one video, an unnamed Gazan narrator said the explosion was caused by a "Hamas jeep loaded with weapons".[98] Later, the IDF suggested that a militant warehouse containing ammunition or "some other material" in the area had caused the fire. It also released a recording of a phone call in Arabic in which the speakers say that the Israeli missile was not responsible for the fire, that the fire was caused by secondary explosions, and that the secondary explosions came from an ammunition warehouse.[99] However, James Cavanaugh, a former ATF agent, said the fire did not indicate "some giant stash that exploded."[100] The New York Times viewed numerous videos and did not find evidence that a significant secondary explosion was ignited.[101]
The Israeli army also denied responsibility for the killing of 5-year-old Hind Rajab, her family and the Palestine Red Crescent Society paramedics sent to rescue her, saying that their forces were not in firing range on the day of the girl's death. However, both Al Jazeera and The Washington Post concluded, based on investigation of satellite imagery, that Israeli armored vehicles were indeed in the area at the time.[102][103]
In other instances, Israeli forces' claims have been questioned based on an apparent lack of evidence. In December 2023, Israel stated there was a Hamas tunnel network connected to the Al-Shifa Hospital; however, a report by The Washington Post found "There is no evidence that the tunnels could be accessed from inside hospital wards".[104] Over the course of the war, repeated Israeli attacks on hospitals were justified by the Israeli military's claims that the hospitals were used by militants.[105] The Associated Press, however, stated that after months of investigations, it found that Israel had provided "little or even no evidence" of a significant militant presence near the al-Awda, Indonesian, or Kamal Adwan hospitals prior to their raids.[106] Israel claimed 12 UNRWA staff members had participated in the 7 October attack on Israel; however, the Financial Times, Sky News, and Channel 4 all stated that Israel's claims were not proven by the intelligence documents they reviewed.[107][108][109] In February 2024, the IDF stated Hamas was stealing humanitarian aid, leading David M. Satterfield, a senior U.S. envoy, to say there was no evidence to support Israel's claims.[110]
In October 2023, shortly after the Al-Ahli Arab Hospital explosion, Israeli sources published audio purporting to capture a phone call in which two Hamas militants claimed responsibility for the act and blamed it on a malfunctioning rocket. Hamas claimed the recording was an "obvious fabrication", and Britain's Channel 4 interviewed two independent Arab journalists who expressed similar views.[111]
On 4 December 2023, Haaretz reported on Israeli claims about beheaded babies, stating that these "unverified stories [had been] disseminated by Israeli search and rescue groups, army officers and even Sara Netanyahu".[112][113][a] Haaretz journalists Nir Hasson and Liza Rozovsky related the chronology of the news items about "beheaded babies" and "hung babies" and concluded, "this story is false."[112] They quoted Ishay Coen, a journalist for the ultra-Orthodox website Kikar Hashabbat, who admitted he made a mistake by unquestioningly accepting the IDF's statements.[112] "Why would an army officer invent such a horrifying story?", Coen asked, adding, "I was wrong."[112]
In September 2024, CNN hosts Jake Tapper and Dana Bash falsely accused U.S. House Representative Rashida Tlaib of stating that Michigan Attorney General Dana Nessel was unable to do her job because of her religion—something Tlaib never stated.[115] Bash and Tapper were repeating claims first made by Nessel in a social media post criticizing a racist caricature suggesting Tlaib was a member of Hezbollah.[116] Steve Neavling, the Metro Times journalist who conducted the original interview with Tlaib, called the claims against her "spurious".[117]
The October 7 attack by Hamas on Israel has become the subject of various conspiracy theories. These theories claim that the attack, which resulted in approximately 1,200 deaths in Israel, was a false flag operation conducted by Israel itself, despite the overwhelming evidence provided by multiple sources, including smartphone and GoPro footage capturing the breach of the border by Hamas forces.[118][119]
This misinformation has been proliferating across various social media platforms, where hashtags linking Israel to "false flag" operations have seen a significant increase in usage. This spread of falsehoods was not limited to online spaces; it has manifested in real-world scenarios, including city council meetings and public protests, where individuals have publicly denied the facts of the attack.[118]
Researchers and Jewish community leaders have expressed concern about the ties these conspiracy theories have to Holocaust denial and other antisemitic beliefs, with denial of the October 7 attacks described as part of a broader pattern of misinformation that seeks to distort historical events and promote antisemitic narratives.[118]
Another unsubstantiated conspiracy theory that emerged following the October 7 Hamas attack suggests that the Israeli government, specifically Prime Minister Benjamin Netanyahu, had prior knowledge of the attack. Some even claim that Netanyahu issued a "stand-down" order to the Israeli military. The theory appears to have originated with Charlie Kirk, a far-right influencer and supporter of former U.S. President Donald Trump, whose comments on a podcast fueled the claims; the assertions rest solely on Kirk's personal speculation. Despite a lack of evidence, the theory has been influential in certain circles, especially among those critical of Netanyahu's leadership and Israeli policies.[120]
A fake memo that purported to show Biden authorizing $8 billion in aid to Israel circulated on social media[121][122] and was cited in articles by Indian news outlets Firstpost and Oneindia.[122]
According to information security experts interviewed by the New York Times, Iran, Russia, China, Iran's proxies, Al Qaeda and the Islamic State have been conducting massive online disinformation efforts focused on "[undercutting] Israel, while denigrating Israel's principal ally, the United States".[123] Researchers have documented at least 40,000 bots or fake social media accounts, as well as strategic use of state-controlled media outlets like RT, Sputnik and Tasnim.[123] An analysis by Haaretz found that hundreds of fake accounts on social media were targeting Democratic Party lawmakers with spam messages repeating Israeli government accusations relating to UNRWA and Hamas.[124]
A Russian disinformation campaign known as Doppelganger has pushed false information about the war using fake websites that mimic the appearance of news sources such as Fox News, Le Parisien and Der Spiegel.[125][126]
In February 2024, Volker Türk, the UN human rights chief, stated that the United Nations had been the subject of disinformation attacks, saying, "The UN has become a lightning rod for manipulative propaganda and a scapegoat for policy failures."[127]
In June 2024, Israel's Ministry of Diaspora Affairs was revealed to have paid $2 million to the Israeli political consulting firm STOIC to conduct a social media campaign, fueled by fake accounts and often employing misinformation, targeting 128 American Congresspeople, with a focus on Democratic and African-American members of the House of Representatives. Websites were also created to provide young, progressive Americans with Gaza news with a pro-Israel spin. Among the objectives of the campaign were amplifying Israeli attacks on UNRWA staffers and driving a wedge between Palestinians and African-Americans to prevent solidarity between the two groups. The campaign also took aim at people in Canada, who were exposed to Islamophobic content smearing Canadian Muslims and implying that pro-Palestinian protesters aimed to impose Sharia law. Messages were also directed to people in the Gulf Arab countries, arguing that humanitarian concern for Palestinians was a wasteful distraction from local affairs.[128][129]
In September 2024, the IDF stated it was launching an investigation into the release of forged Hamas documents that were leaked to the international press, apparently in an attempt to sway Israeli public opinion against a hostage-ceasefire deal.[130]
Footage and images falsely linked to the war included a video of children in cages posted on 4 October,[131][132] footage from 2020 of Iranian lawmakers chanting "Death to America",[133][134] and, in Egypt, photos of the Cairo Tower appearing to be lit with the Palestinian flag, which turned out to be a modified version of a 2010 image of the tower.[135] Footage from the video game Arma 3 has also been presented as war footage.[136][137][138][139]
On October 8, a video supposedly showing Hamas thanking Ukraine for supplying it with weapons was shared by an X account linked to the Wagner Group. It was viewed over 300,000 times and shared by American far-right accounts. The next day, former Russian president Dmitry Medvedev tweeted, "Well, Nato buddies, you've really got it, haven't you? The weapons handed to the Nazi regime in Ukraine are now being actively used against Israel."[140][141][142]
Social media users on both sides of the war shared behind-the-scenes footage of an actor lying in fake blood from a 2022 Palestinian short film, alleging it was evidence that the other side was creating propaganda.[143][144][140] A video of Egyptian paratroopers flying over the Egyptian Military Academy that was falsely claimed to show Hamas militants infiltrating an Israeli music festival went viral on X in Indonesia.[145]
Indian Twitter accounts spread an out-of-context video claimed to show "dozens of young girls taken as sex slaves by a 'Palestinian' fighter", which most likely depicted a school trip to Jerusalem. Another clip primarily shared by Indian users purported to depict a kidnapped baby; however, the video was taken a month earlier and had nothing to do with Gaza.[18]
An AI-generated video of model Bella Hadid supposedly apologising for her past remarks and expressing support for Israel circulated on social media.[146][147][148]
Disinformation about the war has spread on social media platforms, particularly X (formerly known as Twitter).[149][150][151][71][69] The European Union warned Elon Musk and Mark Zuckerberg that X and Meta were hosting disinformation and illegal content about the war, with potential fines of up to 6% of the companies' global revenue according to the Digital Services Act.[152][153][154][155]
In response to the reports, X's CEO Linda Yaccarino told EU internal market commissioner Thierry Breton that the company had "taken action to remove or label tens of thousands of pieces of content" and had removed hundreds of accounts linked to Hamas.[156]
According to NewsGuard, "at least 14 false claims related to the war garnered 22 million views across X, TikTok, and Instagram within three days of the Hamas attack."[157] On 13 October, the EU opened an investigation into X about the spread of disinformation and terrorist content related to the war.[158][159]
On 14 October, Center for Countering Digital Hate CEO Imran Ahmed said his group was tracking a spike in efforts to push false information about the war, adding that U.S. adversaries, extremists, Internet trolls and engagement farmers were exploiting the war for their own gain. Graham Brookie, senior director of the Atlantic Council's Digital Forensic Research Lab, said that his team had witnessed a surge in terrorist propaganda, graphic content, false or misleading claims and hate speech, with much of the content being circulated on Telegram.[160] Cyabra, an Israel-based company that analyses social media, said that one in five accounts taking part in conversations about Hamas' attacks were fake, adding that they had found approximately 40,000 such accounts on X and TikTok.[161]
According to the New York Times, many images and videos that circulate on social media pretending to be from the Israel–Hamas war are in fact from other conflicts, such as the Syrian civil war; and even of natural disasters, such as a recent flood in Tajikistan.[162]
According to AP's David Klepper, "pictures from the Israel-Hamas war have vividly and painfully illustrated AI's potential as a propaganda tool, used to create lifelike images of carnage... digitally altered ones spread on social media have been used to make false claims about responsibility for casualties or to deceive people about atrocities that never happened."[163]
Cyabra, an Israeli social media intelligence company, found that on the day after the attack, one in four posts about the conflict on Facebook, Instagram, TikTok and X were from fake accounts.[3]
On 9 October, X said there were more than 50 million posts on the platform about the conflict.[71] Musk recommended two accounts that previously promoted a false claim about an explosion near the Pentagon for updates about the war.[164][150]
On 10 October, researchers found that a network of 67 X accounts was coordinating a campaign of pushing false information about the war.[165]
According to Wired, the community fact-checking system of X, Community Notes, has in some instances contributed to the spread of disinformation instead of correcting it. Wired cited an incident where a video uploaded by Donald Trump Jr. of Hamas shooting at Israelis was inaccurately tagged as a false video from several years ago as an example of the unreliability of Community Notes.[166] Fake accounts pretending to be a BBC journalist and The Jerusalem Post promoted false information about the war prior to X suspending them.[167][69]
On 12 October, the Technology Transparency Project reported that Hamas was using premium accounts on X to push propaganda.[168][169] X said it had banned Hamas and removed hundreds of accounts affiliated with the group.[170]
On 13 October, on The World radio program, Rebecca Rosman reported that disinformation on X was being monetized by paid-verified users with "new-content" recommendation preference, resulting in millions of views.[171]
According to a report by NewsGuard on 19 October, verified users on X were behind 74% of the 250 most-engaged posts between 7 and 14 October that promoted false or unsubstantiated information about the war. NewsGuard also found that only 79 of the 250 posts were flagged by Community Notes.[172][173][174][175][176]
On October 28, commentator Jackson Hinkle posted on X that Haaretz had reported that the Israeli government inflated the death toll for the 2023 Hamas attack on Israel. Haaretz stated that Hinkle's post "contain[ed] blatant lies" and was not substantiated by their reporting on the attack.[177] Hinkle also claimed that the image of a Jewish baby burned alive by Hamas on October 7 "was created by artificial intelligence". He was subsequently deplatformed from YouTube.[178][179][180]
Syrian YouTuber Maram Susli claimed that footage showed Israeli military helicopters firing on Israelis escaping the October 7 massacre carried out by Hamas at the Supernova festival. However, the footage turned out to be from Israeli attacks on Hamas positions in Gaza three days later.[181] She also posted a photograph of a woman carrying a child's toy car down the stairs of a largely destroyed building, suggesting it showed Gaza after Israeli attacks. The picture was actually an award-winning photograph taken in Homs during the Syrian civil war.[182][183][184][185]
An investigation by ProPublica and Columbia University's Tow Center for Digital Journalism found that verified accounts promoting misinformation about the conflict saw their audience grow significantly during the first month of the conflict, and that Community Notes had failed to scale sufficiently, with 80% of the debunked tweets reviewed not being clarified with a note.[186][187]
On 12 October, the EU warned TikTok about illegal content and disinformation on its platform.[188][189] On 15 October, TikTok said it had taken action to remove "violative content and accounts".[190] It also said it had established a command center for the conflict, updated its automated detection systems to detect violent content and added moderators who speak Arabic and Hebrew.[190][191][192] A TikTok video promoting conspiracy theories that Hamas's attack had been orchestrated by the media was viewed over 300,000 times.[192]
By mid-November 2023, Republican U.S. Representative Mike Gallagher had claimed that TikTok was "intentionally brainwashing" American youth into supporting Hamas, citing the spike in pro-Palestinian content following the outbreak of hostilities between Israel and Hamas. In response to criticism, TikTok issued a press release on 20 November asserting that younger Americans, particularly Millennials and Generation Z, tended to be more sympathetic to the Palestinians than to Israel, citing Gallup polling data dating back to 2010. TikTok also claimed that its algorithm did not take sides but operated in a positive feedback loop based on user engagement. The company also denied favouring "one side of an issue over another" or intentionally promoting pro-Palestinian hashtags such as "#freepalestine," which had attracted 25.5 billion views by November 14. By comparison, "#standwithisrael" had attracted 440.4 million views. TikTok's press release also stated that it had removed 925,000 videos related to the conflict for violating community standards, including promoting Hamas, had hired moderators fluent in Arabic and Hebrew to parse content, and begun removing fake accounts created in response to the Israel-Hamas conflict.[193]
According to a TheMarker report, neo-Nazi propaganda, antisemitic content, and calls for the destruction of Israel were all circulating on TikTok throughout the war.[194]
The Al-Qassam Brigades, Hamas's military wing, had around 200,000 followers on Telegram at the time of Hamas's attack. According to the Digital Forensic Research Lab, its following had tripled since then, with its posts being viewed over 300,000 times.[170][195] The Digital Forensic Research Lab found that Hamas relies on Telegram to send statements to its supporters.[195]
According to political analyst and researcher Arieh Kovler, many Israelis follow official-sounding Telegram channels that share out-of-context videos and unverified rumors.[195]
In a statement, Telegram said it was "evaluating the best approaches and... soliciting input from a wide range of third parties" and that it wished to be "careful not to exacerbate the already dire situation by any rash actions".[195]
In October 2023, Arma 3 developer Bohemia Interactive said in a statement, "With the tragic events currently unfolding in the Middle East, we feel it is vital to share once again our statement concerning the use of Arma 3 as a source of fake news footage. It's disheartening for us to see the game we all love being used in this way. While we have found ways to tackle this issue somewhat effectively by closely cooperating with leading fact-checking agencies, sadly we can't mitigate it entirely."[196]
In November 2023, Center for Countering Digital Hate CEO Imran Ahmed said that misinformation about the war was as difficult to track as COVID-19 misinformation and misinformation about the 2020 United States presidential election.[197]
In January 2024, McDonald's CEO Chris Kempczinski said, "Several markets in the Middle East and some outside the region are experiencing a meaningful business impact due to the war and associated misinformation that is affecting brands like McDonald's."[198] The boycotts started after McDonald's Israel announced it had donated free meals to IDF soldiers involved in the war.[199]
In February 2024, Bellingcat founder Eliot Higgins said, "I think the intensity of online discourse around Israel and Palestine is really kind of much worse than I've seen in any of the conflicts. People are not looking to establish the truth in many cases, but basically just look for things to bash each other over the head online. It's really just about people arguing their positions, their opinions, and not really establishing the exact truth around what's happening."[200]
Speaking about Israel's decision not to allow foreign journalists into Gaza, UN Secretary-General António Guterres stated, "Denying international journalists entry into Gaza is allowing disinformation and false narratives to flourish."[201] The technology director of the Institute for Strategic Dialogue stated, "The corrosion of the information landscape is undermining the ability of audiences to distinguish truth from falsehood on a terrible scale."[202]
An IDF source told the JC the Hamas claim about Commander Aloni was "unclear and unconfirmed".
In the meantime, Zaka volunteers were there. Most of them worked at the sites of murder and destruction from morning to night. However, according to witness accounts, it becomes clear that others were engaged in other activities entirely. As part of the effort to get media exposure, Zaka spread accounts of atrocities that never happened, released sensitive and graphic photos, and acted unprofessionally on the ground.
The unit's soldiers, as well as volunteers from other organizations, accused ZAKA volunteers of spreading stories of horrors that didn't happen, releasing sensitive and graphic photos to shock people into donating, and being unprofessional in a bid for screen time.
Pachter writes: "Again, the baseline model for count data posits a Poisson distribution on the numbers, which in this case would represent a variance of 270. A compound Poisson process makes more sense in this case, and such a process would have even higher variance. Here we see exactly that, a variance of 1785, which is more than 6x what one would see in a Poisson process. If the author thinks the variance should be even higher than *that*, he needs to provide an argument for why, and point to historical data. Of course in this case the variance is even higher, because he appears to have "cherry" picked the 15 days."
Additionally, commenter Ken M adds this insight: "If you look at the numbers, it's very clear that they update fatalities faster than the update #women or #children (and they don't specify #men, that is just (#fatalities – #women-#children)). On some days fatalities update but there is no change in the #w or #c; on other days the increase in (#w+#c) exceeds the increase in #f. In other words, in the conditions of war, it is hard to get information. The Gazan Ministry of Health (GMH) makes a list of the name and ID # of every identifiable death; Israel maintains the registry of ID #'s so GMH can't fake that. That's why their numbers come out accurate. But in real time, they may get a number of fatalities from a hospital and get the names, which allow identification of #w or #c, only later, maybe much later. And if they get the list of names, they have to go through the registry to determine who is a child or an adult, and maybe for ambiguous names who is a woman or a man, and that probably takes time too. So #w and #c get updated with arbitrary lags, sometimes multiple days worth may suddenly get updated at once. So looking at day-by-day movements of these #'s is meaningless."
Following Saturday's large-scale attack by Hamas militants on Israeli targets, including the killing and hostage taking of civilians, misleading footage purporting to show the military escalation of the conflict flooded social media — including a slew of clips from Arma 3, a hyper realistic open-world combat video game that allows users to customize gaming scenarios.