Cliff Notes – Istanbul earthquake leads to misinformation
- Following a recent earthquake in Istanbul, more than 150 people were injured, mostly while jumping from buildings; no fatalities have been reported.
- Misinformation and fake videos, including AI-generated content and outdated footage, rapidly circulated online, causing confusion and panic among the public.
- Recent research reveals that many viral videos falsely claiming to show the destruction from the earthquake in Istanbul are either from previous seismic events or fabricated entirely.
Istanbul earthquake leads to misinformation
Istanbul, Turkey’s largest city, was hit by a strong earthquake on Wednesday. Over 150 people were injured when they jumped or attempted to jump from buildings, Istanbul’s governor Davut Gul said, adding that “their injuries were not life-threatening” and nobody had died in the quake.
Following events like this, images and videos begin to circulate on social media as content creators look to gain views and grow their follower counts. However, research shows that misinformation spread during and after natural disasters often causes panic and hinders effective disaster management.
Many videos circulating online following the earthquake in Turkey do not depict the recent event. Here are three examples.
AI-generated images circulating
WTX fact check: Fake – In breaking news situations, many AI-generated images and videos circulate online
This video is AI-generated, as becomes evident on close inspection. At the beginning of the video, on the left-hand side, the top of a streetlamp appears and then disappears, while the people at the bottom of the frame do not move at all. Seconds later, a man on the left, purportedly standing in front of a collapsed building, can be seen with a blurry arm that vanishes and then reappears.
Additionally, the creator marked the video as AI-generated, which many viewers overlooked. Comments such as “Is it real?”, “Pray for Turkey” and “Stay strong Turkey” suggest that users believed the footage to be authentic.
In breaking news situations, such as natural disasters, it’s common for AI-generated videos to almost immediately begin circulating online.

Old earthquake videos reappear
Claim: This video allegedly shows the dramatic collapse of a tall building in Istanbul, with many people running away. The caption says it depicts the earthquake in Istanbul, and the video was posted on April 23. Less than 24 hours after the quake, it had garnered over 5 million views.
WTX fact check: False – This video does not show the aftermath of the April 2025 earthquake
More than 3,000 people commented on the video, many believing it depicts the latest earthquake in Istanbul. However, that is not the case. A reverse image search shows that the video dates back to February 6, 2023, when a magnitude 7.8 earthquake struck Turkey and Syria, with its epicentre close to the Syrian border. More than 56,000 people were killed and hundreds of thousands displaced.
In the 2023 disaster, many buildings crumbled to dust, raising concerns about construction safety standards.

Real earthquake ruins, but an old image
Claim: “My deepest condolences to the people of Turkey after the devastating earthquake in Istanbul. I stand with all those affected by this tragedy. Sending thoughts and prayers to the families who lost loved ones and to everyone impacted. Stay strong, Turkey,” a user posted on X, sharing a bird’s-eye view photo which shows many collapsed buildings.
WTX fact check: False – This image does not depict the latest earthquake in Turkey; it is from 2023
Following the earthquake in Turkey, misleading, fake, and outdated content has spread rapidly online. With aftershocks continuing in Istanbul, further false claims and images are likely to surface.
