Facebook has said it removed 1.5 million videos of the New Zealand mosque terror attack in the first 24 hours.
The company and other social media platforms including YouTube, Twitter, Instagram and Reddit have been criticised for being slow to take down the videos.
Internet users shared footage from the gunman’s 17-minute-long livestream of the attack, which he originally posted on Facebook.
I understand the frustration behind this disgusting terrorist attack but sharing the livestream uploaded by the gunman is potentially insensitive to the families and loved ones of those who lost their lives. Please respect those boundaries & consider how harmful these images are
— Jaimee (@jcran28) March 15, 2019
Facebook shared a tweet shortly after news of the attack broke, stating it removed the original livestream and deleted the suspect’s account.
Mia Garlick, from Facebook New Zealand, said in a statement:
Our hearts go out to the victims, their families and the community affected by the horrendous shootings in New Zealand.
Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video. We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.
— Facebook Newsroom (@fbnewsroom) March 15, 2019
The company added that it has removed, and is continuing to remove, edited versions of the footage out of respect for those affected.
Garlick said:
We continue to work around the clock to remove violating content using a combination of technology and people.
In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload.
Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content.
— Facebook Newsroom (@fbnewsroom) March 17, 2019
The way social media was used during the attack and in the aftermath has raised the question of how to combat the spread of violent hate online.
Although Facebook, Twitter and other companies have taken numerous steps over the past couple of years to tackle the issue, the events in New Zealand have demonstrated there is still a long way to go.
At a press conference, New Zealand’s Prime Minister Jacinda Ardern said Facebook’s chief operating officer Sheryl Sandberg had reached out to her.
As reported by Reuters, Ardern said she would be discussing the livestream with the company:
Certainly, I have had contact from Sheryl Sandberg. I haven’t spoken to her directly but she has reached out, an acknowledgment of what has occurred here in New Zealand.
This is an issue that I will look to be discussing directly with Facebook.
It is very clear changes need to be made to stop a livestream like it happening again.