Facebook, YouTube and Twitter struggle to deal with New Zealand shooting video


Facebook, YouTube, and Twitter are struggling to halt the spread of horrific footage that appears to show a massacre at a mosque in New Zealand as it was taking place.

Dozens of people were killed Friday in shootings at two mosques in the city of Christchurch.

One of the shooters appears to have live-streamed the attack on Facebook. The disturbing video, which has not been verified by CNN, purportedly shows a gunman walking into a mosque and opening fire.

“New Zealand Police alerted us to a video on Facebook shortly after the live stream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Mia Garlick, Facebook’s director of policy for Australia and New Zealand, said in a statement.

What we know

Hours after the attack, however, copies of the gruesome video continued to appear on Facebook, YouTube and Twitter, raising new questions about the companies’ ability to manage harmful content on their platforms.

Facebook is “removing any praise or support for the crime and the shooter or shooters as soon as we’re aware,” Garlick said.

Twitter said it suspended an account related to the shooting and is working to remove the video from its platform.

YouTube, which is owned by Google, removes “shocking, violent and graphic content” as soon as it is made aware of it, according to a Google spokesperson.

New Zealand police asked social media users to stop sharing the purported shooting footage and said they were seeking to have it taken down.

CNN is choosing not to publish additional information regarding the video until more details are available.

Tech firms ‘don’t see this as a priority’

This is the latest case of social media companies being caught off guard by killers posting videos of their crimes, and other users then sharing the disturbing footage. It has happened in the United States, Thailand, Denmark, and other countries.

Friday’s video reignites questions about how social media platforms handle offensive content: Are the companies doing enough to try to catch this type of content? How quickly should they be expected to remove it?

“While Google, YouTube, Facebook, and Twitter all say that they’re cooperating and acting in the best interest of citizens to remove this content, they’re actually not because they’re allowing these videos to reappear all the time,” said Lucinda Creighton, a senior adviser at the Counter Extremism Project, an international policy organization.

Facebook’s artificial intelligence tools and human moderators were apparently unable to detect the live stream of the shooting. The company says it was alerted to it by New Zealand police.

“The tech companies basically don’t see this as a priority, they wring their hands, they say this is terrible,” Creighton said. “But what they’re not doing is preventing this from reappearing.”

The spread of the video could inspire copycats, said CNN law enforcement analyst Steve Moore, a retired supervisory special agent for the FBI.

“What I would tell the public is this: Do you want to help terrorists? Because if you do, sharing this video is exactly how you do it,” Moore said.

“Do not share the video or you are part of this,” he added.

Copyright 2021 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

Trademark and Copyright 2021 Cable News Network, Inc., a Time Warner Company. All rights reserved.
