In our previous blog post on the winning digital marketing strategies of Hamas, we highlighted the power of emotional appeal over logical arguments. Visual storytelling, especially through compelling images, often strikes directly at the heart. But what do you do when you lack genuine, emotive content? Just as one might turn to AI tools like ChatGPT to quickly write blog posts, one can turn to Dall-E to generate impactful images.

An intriguing instance reported by The Jerusalem Post shows an image of an injured baby in Gaza that, despite its evocative nature, is entirely the creation of artificial intelligence. Further investigation, as seen in reports by DW, reveals a series of such AI-generated images, designed to elicit strong emotional responses. These images, at first glance, seem incredibly real, yet they are the handiwork of AI tools.


Dall-E fights in the Israel-Gaza War

The widespread acceptance and use of these images, sometimes even finding their way into stock photo libraries and news outlets, underscore the challenges in distinguishing real, current photos from sophisticated AI fabrications. This phenomenon not only feeds into the cycle of misinformation but also raises ethical questions about the use of AI in crafting narratives.

So, how does one discern the authenticity of an image in an era when AI-generated content is becoming increasingly indistinguishable from reality? The following tips are relevant as of this writing; keep in mind that AI capabilities are constantly improving.

  • Source Verification: Always consider the reliability of the image source. While even reputable sources can err, skepticism is warranted when the source has potential biases or agendas. 
  • Anatomical Inconsistencies: Disproportionate or incorrectly numbered fingers and toes, or limbs appearing in unnatural configurations, are strong indicators of AI manipulation.
  • Textual Accuracy: AI-generated images often include nonsensical text or misrepresentations, especially in context-specific details like storefront signs. Gibberish on a sign is a red flag, and so is the wrong language: in Gaza, storefront signs would be written in Arabic.
  • Reverse Image Search: Google's Search by Image tool performs a reverse image search: instead of typing text and asking Google to find an image, you upload an image and Google searches the web for it. This can help verify an image's legitimacy by identifying reused or contextually misplaced photographs.
  • Patience: Even if an image passes all the checks above, hold off on forwarding it. Sometimes news is debunked only with time.
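Under the hood, reverse-image-search tools typically rely on perceptual hashing: reducing an image to a short fingerprint that stays nearly identical when the image is recompressed or lightly edited, so reposted copies can be matched. Here is a minimal, self-contained sketch of one common variant, the "average hash." It assumes images have already been shrunk to 8×8 grayscale matrices (a real pipeline would use an imaging library for that step); the function names and the tiny synthetic images are purely illustrative, not any particular search engine's actual algorithm.

```python
def average_hash(pixels):
    """Return a 64-bit hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same source image."""
    return bin(h1 ^ h2).count("1")

# Three tiny synthetic 8x8 "images": a gradient, the same gradient with
# slight brightness noise (a "repost"), and an inverted, unrelated gradient.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
reposted = [[min(255, v + 3) for v in row] for row in original]
unrelated = [[255 - (r * 8 + c) * 4 for c in range(8)] for r in range(8)]

d_same = hamming_distance(average_hash(original), average_hash(reposted))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
print(d_same, d_diff)  # near-duplicates score low; unrelated images score high
```

The point of the sketch is the design trade-off: unlike a cryptographic hash, a perceptual hash is deliberately tolerant of small edits, which is exactly what lets a search engine match a cropped or recompressed copy of a photo back to its original context.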

Take Shifa Hospital, for example. You may have seen images of the hospital on fire, with Israel blamed. It later turned out that Hamas had launched a rocket that landed in Shifa's parking lot. The pictures weren't fake, but the context was, and it took time for the truth to come out.

So when you see a shocking image or piece of news, wait at least 24 hours before forwarding.

Have any tips to add? We’d like to hear from you. Contact us by emailing
