Though Facebook's AI-powered censors managed to "mistakenly" flag Zero Hedge as a repeat violator of the social network's "community standards", the company is still working out the kinks in its ability to immediately identify and remove livestreams depicting horrific acts of terror and violence, like the video published Friday by the Christchurch shooter.
In a blog post mea culpa published Thursday, Facebook's VP of Integrity Guy Rosen explained why the company failed to immediately remove the horrifying livestream of the attacks, which the shooter - a 28-year-old Australian who also published a manifesto laying out his violent, Islamophobic ideology - posted to Facebook Live, and which was viewed 4,000 times before being taken down.
According to Rosen, one reason the video lingered for so long on the platform - Facebook didn't remove it until police responding to the incident reached out to the company, despite the video being reported multiple times - was that it wasn't prioritized for immediate review by the company's staff. As it stands, Facebook only prioritizes reported livestreams tagged as suicide or self-harm for immediate review.
To rectify this, the company is "re-examining" its reporting logic and will likely expand the report categories prioritized for immediate review.
In Friday’s case, the first user report came in 29 minutes after the broadcast began, 12 minutes after the live broadcast ended. In this report, and a number of subsequent reports, the video was reported for reasons other than suicide and as such it was handled according to different procedures. As a learning from this, we are re-examining our reporting logic and experiences for both live and recently live videos in order to expand the categories that would get to accelerated review.
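In plain terms, Rosen is describing a triage rule: a user report on a live or recently live video only jumps the queue for immediate human review if it carries one of a small set of categories. The sketch below is purely illustrative - the category names, function, and queue labels are hypothetical and are not Facebook's actual system - but it shows the kind of logic the company says it is re-examining.

```python
# Hypothetical sketch of the report-triage idea Rosen describes.
# Category names, the route_report() function, and queue labels are
# illustrative assumptions, not Facebook's real implementation.

ACCELERATED_CATEGORIES = {"suicide", "self_harm"}  # the pre-Christchurch rule, per Rosen
EXPANDED_CATEGORIES = ACCELERATED_CATEGORIES | {"violence", "terrorism"}  # one possible expansion

def route_report(report_category: str, is_live_or_recently_live: bool,
                 accelerated: set = EXPANDED_CATEGORIES) -> str:
    """Return which review queue a user report would land in."""
    if is_live_or_recently_live and report_category in accelerated:
        return "immediate_review"
    return "standard_review"

# A report tagged "violence" on a recently live video would be accelerated
# under the expanded rule, but not under the original one:
print(route_report("violence", True))                                   # immediate_review
print(route_report("violence", True, accelerated=ACCELERATED_CATEGORIES))  # standard_review
```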
Or as one Twitter wit summed it up:
Facebook, 2006: Hey look at this cool new website, keep in touch with your buddies and see what's going o-

Facebook, 2019: Yeah we're gonna need a Murder category for video https://t.co/vtP8TnapZR

— Mike Bird (@Birdyword) March 21, 2019
Also, for anybody wondering why Facebook refuses to implement a time delay on Facebook Live content, Rosen has an answer:
One of the reasons Facebook cites for not putting a time delay on Facebook Live?

Police and ambulance will take longer to respond to an emergency being broadcast ON ITS PLATFORM. pic.twitter.com/nevQyicBoU

— Mark Di Stefano (@MarkDiStef) March 21, 2019
Rosen also explained how the video managed to circulate even after the original was removed by Facebook. Apparently, a group of "bad actors" working to spread the imagery across the web captured the video and shared it to 8chan and various video-sharing platforms.
It was then repackaged by multiple users with slight visual variations (some users took film...