Christchurch massacre: Facebook says it removed 1.5 million videos within 24 hours of attack

CHRISTCHURCH, NEW ZEALAND - MARCH 17: A police officer carrying an automatic rifle guards the area near Al Noor mosque on March 17, 2019 in Christchurch, New Zealand. 50 people are confirmed dead, with 36 injured still in hospital following shooting attacks on two mosques in Christchurch on Friday, 15 March. 41 of the victims were killed at Al Noor mosque on Deans Avenue and seven died at Linwood mosque. Another victim died later in Christchurch hospital. A 28-year-old Australian-born man, Brenton Tarrant, appeared in Christchurch District Court on Saturday charged with murder. The attack is the worst mass shooting in New Zealand's history. (Photo by Carl Court/Getty Images)

Facebook said that it removed 1.5 million videos of footage from the shooting rampage at two mosques in Christchurch within 24 hours of the attack, underscoring the massive game of whack-a-mole social media giants have to play with even the most high-profile problematic content on their platforms.

In a statement, Mia Garlick, spokeswoman for Facebook New Zealand, said that the company continues to “work around the clock to remove violating content from our site, using a combination of technology and people.” Of the 1.5 million videos of the massacre, which was filmed with a body-worn camera in the style of a first-person video game, 1.2 million were blocked at upload.

Facebook’s statement came after New Zealand Prime Minister Jacinda Ardern said in a Sunday news conference that there were “further questions to be answered” by Facebook and other social media sites over their response to the events.

Ardern said that her country had done as much as it could to “remove or seek to have removed some of the footage” circulated in the aftermath of the attack, but that ultimately it has been “up to those platforms.”

When the horror began Friday morning in New Zealand, alleged shooter Brenton Tarrant’s Facebook followers were the first to know. He live-streamed his assault, from the time he started driving over to Al Noor Mosque to the moments when he fired his first shots.

Many hours later, and long after he and other suspects had been arrested, others were still uploading the video to YouTube and other online video platforms. A Washington Post search of keywords related to the event, such as “New Zealand,” surfaced a long list of videos, many of which were lengthy and uncensored views of the massacre.

And though Facebook, Instagram and Twitter have all removed Tarrant’s accounts, dozens of archived versions remain available, along with the links and videos he shared.

Facebook says that it is using audio technology to detect more versions of the video, allowing it to catch more footage even if there isn’t an exact match to the full version streamed by Tarrant.

On Sunday, the New Zealand government informed online platforms that sharing any version of the footage, even edited, non-graphic versions, is a violation of the law. Facebook says that since the attack, its teams have also been working to remove content in support of the massacre and other hateful posts.

The restrictions have also applied to news media. Local media reported that Sky News Australia was pulled off New Zealand broadcaster Sky TV for airing “distressing footage.”

Ardern acknowledged that the problem of hate speech and the difficulty of controlling the proliferation of violent videos was a global problem.

“But it doesn’t mean we can’t play an active role in seeing it resolved,” she added.
