Facebook Admits ‘We Need To Do Better’ To Stop Killers From Broadcasting Murders On The Site

“Facebook Killer” Steve Stephens remains at large as the subject of a nationwide manhunt after his phone pinged 100 miles away from Cleveland, where he randomly murdered an elderly man and uploaded footage of the killing to Facebook. Authorities warn that Stephens is armed and dangerous, and in the meantime, much of the scrutiny has fallen on the social media platform itself.

Stephens did not live stream his crime (unlike several other recent cases) but uploaded the footage after the fact. He first posted a video about wanting to commit murder before uploading the homicide video itself. Facebook doesn’t have the resources to vet every video uploaded to the site, so they rely on user reports. They condemned Stephens’ actions and disabled his account, but the company is still facing criticism that it didn’t act quickly enough. And they’re admitting that it’s time to get (more) serious about stopping the broadcast of horrific crimes.

Facebook Global Operations VP Justin Osofsky penned a community post to admit, “We know we need to do better.” He explains the steps that led to the removal of the homicide video, a process slowed by a lack of user reports, although there was still a gap between reporting and removal:

[W]e are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible. In this case, we did not receive a report about the first video, and we only received a report about the second video — containing the shooting — more than an hour and 45 minutes after it was posted. We received reports about the third video, containing the man’s live confession, only after it had ended.

We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better.

CNN Money followed up with an anonymous source familiar with Facebook’s operations, who says that thousands of reviewers work for the platform and that flagged content is typically reviewed “within 24 hours,” with an algorithm prioritizing which reports get handled first. In Stephens’ case, that process didn’t lead to an immediate shutdown of his account, which may be partly because the company relies on outside contractors for some of its reviewing.

Still, Facebook has committed to acting much more quickly on such videos in the future. Sadly, there will be more instances of people posting footage of their crimes, but perhaps the platform can shut things down faster next time.

(Via Facebook & CNN)