
Facebook explores live video restrictions after New Zealand mosque shootings

The social network might bar users from live-streaming video if they've violated Facebook's rules before.

Queenie Wong, Former Senior Writer
An armed police officer is seen in front of Al Noor mosque during Friday prayers on March 22, 2019, in Christchurch, New Zealand. (Getty Images)

Facebook might limit who can stream live video, a decision it's contemplating after a gunman used the social network to broadcast a deadly shooting at a New Zealand mosque. 

The company's chief operating officer, Sheryl Sandberg, said in a blog post Friday that Facebook is "exploring restrictions on who can go Live depending on factors such as prior Community Standard violations."

Facebook, which has rules barring terrorists from its service, faces mounting pressure to combat hate speech on its platform. The company's live video feature has been used in the past to broadcast suicides, murders and other violence.

Live video's dark side came back into the spotlight after March 15, when a gunman killed 50 people at two New Zealand mosques. Facebook pulled down the video of the shooting that the gunman had posted, but by then it had already spread to other social media sites and message boards.

Facebook found more than 900 different videos that showed parts of the attack, Sandberg said. The tech giant is also trying to improve its technology to flag edited versions of videos and images depicting violence and to prevent users from re-sharing them. Facebook, which relies on its 2.3 billion users to flag violent content, also changed its review process to respond to these videos more quickly.

"While the original New Zealand attack video was shared Live, we know that this video spread mainly through people re-sharing it and re-editing it to make it harder for our systems to block it," Sandberg said.

Facebook has also been taking other steps to combat hate speech. This week, the company announced it was banning white nationalist and white separatist content from its platform.