After the election, Facebook and Google promised to fix their algorithms to prevent fake news from spreading. But after the Las Vegas shootings — the first real test of these new algorithms — both sites came up painfully short.
As Engadget detailed, thousands of users complained to both Facebook and Google about stories from questionable news outlets and 4chan sitting atop both sites for hours. One story from Russian propaganda outlet Sputnik, for example, claimed mass murderer Stephen Paddock was a member of ISIS, uncritically repeating an unlikely claim made by the terrorist organization.
Both Facebook and Google have apologized and said they'll do better. But that doesn't explain why untrustworthy sources are tracked by these algorithms in the first place. Differing perspectives on the news are, of course, always going to happen, but these aren't "different perspectives." Sputnik is under investigation by the FBI as a propaganda machine, and even if it weren't, one would think that pushing unsourced claims as news would be enough in and of itself to get an outlet booted from any news-tracking algorithm.
Facebook and Google are private companies, and they have a host of options for dealing with these sites, from obliterating their page rank to simply blocking them altogether. Facebook has even allegedly developed such tools for China. But Facebook and Google would rather blame their algorithms and insist it's up to users to spot fake news.
That's simply not good enough. In Silicon Valley, there's often an insistence that a company merely builds a platform and isn't responsible for how that platform is used. And in some cases, there's a degree of moral murk nobody particularly wants to wander into: the line between fake news and simply shoddy reporting can be thin. But once an outlet starts insisting Hillary Clinton is running a child sex slavery ring under a pizza joint, we're well past that line. Facebook and Google could do more, but that would mean taking responsibility. Until we hold them accountable, the blame will keep being pinned on "their algorithms," not on the people who code them.