Removed 1.5 Million Videos of New Zealand Mosque Attack: Facebook

“In the first 24 hours, we removed 1.5 million videos of the (New Zealand) attack globally,” Facebook said.

50 people were killed in a terror attack on two mosques in New Zealand’s Christchurch on 15 March. 

Tech giant Facebook took to Twitter late on Saturday, 16 March, to say that it had removed 1.5 million videos globally of the New Zealand mosque attack in the first 24 hours after the incident.

"In the first 24 hours, we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload...," Facebook said.

The company also said that, out of respect for the people affected by the mosque shooting and the concerns of local authorities, it was removing all edited versions of the video, including those that do not show graphic content.


The suspect in the shootings, 28-year-old white supremacist Brenton Harrison Tarrant, had posted a jumbled 74-page anti-immigrant manifesto online before the attacks and had apparently used a helmet-mounted camera to broadcast live video of the slaughter.

The gunman had livestreamed 17 minutes of the rampage at the Al Noor mosque, where he had sprayed worshippers with bullets over and over, killing at least 41 people.

Several more people were killed in an attack on a second mosque in the city a short time later, taking the total death toll to 50.

Facebook, Twitter and Google had scrambled to take down the video, which was widely available on social media for hours after the bloodbath.

Five Indians were among the 50 killed in the terror attacks.

(With inputs from AP)

