Viral Bridge Collapse Video Is AI-Generated: Grok AI Misleads Users

A viral video of a bridge collapse is AI-generated, despite Grok AI repeatedly claiming it shows a real incident in China.
by Anonymous | February 9, 2026

A terrifying video showing a bridge collapsing due to severe weather, plunging vehicles and people into the river below, has been circulating widely online. The video went viral on ‘X’ (formerly Twitter) with the caption, “This is absolutely terrifying to watch.”
Many internet users questioned the authenticity of the incident and tagged the artificial intelligence (AI) assistant Grok to verify the footage.
The Fact-Check
However, an investigation by Factseeker revealed significant discrepancies between what the video actually shows and the responses provided by Grok: the AI tool consistently misled users about the incident.
The investigation confirmed that the viral claim is false. The video is entirely AI-generated.

Key Findings of the Investigation:
In its replies, Grok claimed the incident was real, identifying it as footage from the 2018 Sichuan floods in China. In another instance, it claimed the event took place in Zhaohui County, Shaanxi Province, China, in 2024. Grok also attributed the footage to various other locations in different responses.
Despite these claims, a reverse image search of the video frames yielded no credible social media posts or reports from mainstream media outlets confirming such an incident.
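For readers who want to replicate this step, the sketch below shows one common way to pull representative frames out of a video so they can be uploaded to a reverse image search engine. It assumes Python with OpenCV installed; the filename viral_bridge_video.mp4 and the one-frame-per-second sampling rate are illustrative assumptions, not details from the original clip.

```python
# Minimal sketch: extract roughly one frame per second from a video
# so individual frames can be run through a reverse image search.
# Assumes OpenCV (pip install opencv-python); filename is a placeholder.
import cv2

VIDEO_PATH = "viral_bridge_video.mp4"  # hypothetical filename

cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS) or 30  # fall back to 30 if metadata is missing
step = int(fps)                        # sample about one frame per second

frame_index = 0
saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of video
    if frame_index % step == 0:
        # Each saved JPEG can be uploaded to a reverse image search engine.
        cv2.imwrite(f"frame_{saved:03d}.jpg", frame)
        saved += 1
    frame_index += 1

cap.release()
print(f"Saved {saved} frames for reverse image search.")
```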
Visual Discrepancies
Upon closer inspection, several visual glitches were identified in the video. The movements of the people appeared highly artificial, and in some frames, individuals seemed to vanish into thin air even before hitting the water.
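Glitches like these can sometimes be surfaced programmatically rather than by eye. As a rough illustration, assuming OpenCV is available, the sketch below flags frame pairs with unusually large pixel differences, which is where objects popping in or out of existence tend to show up. The threshold value is an assumed starting point and would need tuning per video.

```python
# Rough sketch: flag abrupt frame-to-frame changes, where objects that
# vanish between frames tend to show up. The threshold is an assumption.
import cv2

cap = cv2.VideoCapture("viral_bridge_video.mp4")  # hypothetical filename
ok, prev = cap.read()
index = 1
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    # Mean absolute pixel difference between consecutive frames.
    diff = cv2.absdiff(frame, prev).mean()
    if diff > 25:  # assumed threshold; tune per video
        print(f"Large change at frame {index}: mean diff {diff:.1f}")
    prev = frame
    index += 1
cap.release()
```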

AI Detection Analysis
To confirm the nature of the footage, we analyzed the video using the Imgdetector.ai tool. The detector reported a 92% likelihood that the visual is AI-generated, further indicating that the content is not authentic.
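Detection services of this kind typically expose a simple image-upload API. The sketch below shows what such a request might look like in Python; the endpoint URL, authorization header, and response fields are hypothetical placeholders, not Imgdetector.ai's actual API, which is not documented in this article.

```python
# Hypothetical sketch of querying an AI-image-detection service over HTTP.
# The endpoint, header, and response shape below are assumptions for
# illustration only; they are NOT Imgdetector.ai's documented API.
import requests

API_URL = "https://example-detector.invalid/v1/detect"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                                # placeholder credential

with open("frame_000.jpg", "rb") as f:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"image": f},
        timeout=30,
    )
response.raise_for_status()
result = response.json()
# Assumed response shape: {"ai_generated_probability": 0.92}
print(f"AI-generated probability: {result['ai_generated_probability']:.0%}")
```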

Grok’s Incorrect Responses
Under the viral post, many X users prompted Grok to confirm whether the incident was real. In the majority of its replies, Grok misled users by asserting the video was genuine footage of a bridge collapse in China.

Grok went as far as providing fabricated details, including specific locations, dates, and death tolls. Most notably, the AI assistant confidently claimed that major international news organizations such as CNN, Al Jazeera, The Guardian, Newsweek, People's Daily, WION, and various Chinese media outlets had published reports on this specific bridge collapse; no such reports exist.

While Grok did correctly identify the video as AI-generated in a few isolated replies, the bulk of its interactions provided false information.
Note: This is a clear example of "AI hallucination," a phenomenon in which an AI model confidently presents fabricated information as fact.
Conclusion
The viral video does not depict a real-life incident. It is a completely AI-generated visual.
