In Trying To Fix Its Trending Topics, Facebook Reveals Some Flaws


Facebook users recently noticed a sudden change on the site. Where the trending topics once featured small blurbs explaining why each topic was trending, along with carefully curated links, now only the bare trending term remains. Without warning, Facebook has changed how 30% of Americans get their news, and unintentionally revealed how little it knows about its own users in the process.

On Friday, a few days after the change became visible, Facebook announced changes that, in its words, “will make the product more automated.” Algorithms, not humans, would now find and source articles and terms. Despite the measured tone of the press release, this appears to have been an abrupt change, as the team writing those blurbs was fired with an hour’s notice.

This is the culmination of months of controversy surrounding Facebook’s trending topics. In May, former employees accused the site of liberal bias, and ever since it has been tinkering with the trending formula: skewing the news posts users see toward what it believes their political opinions to be, and prioritizing shares and posts from friends and family over the pages of news organizations and opinion sites. Handing the feature over to algorithms appears to be what the site hopes will be the final stroke. The problem, of course, is that it has uncovered new issues for Facebook.

News As “Automated Product”

Letting robots decide what’s news has already become something of a disaster for Facebook. For example, clicking on the Justice League trending topic should take you to news that the film has revealed a new major villain. Instead, the algorithm insists the top headline is last week’s news that Doug Liman would direct Justice League Dark, before directly contradicting itself within its own feed. Nor is this the only problem. The algorithm has only been running for a few days, and already it has been fooled by a false news story about Megyn Kelly’s firing from Fox News. And this was, perhaps, more predictable than Facebook cares to admit.

Last week, the New York Times explained how to uncover what Facebook believes about each user’s political opinions. If you go to your ad preferences, you can see not just what it thinks of your politics, but what it thinks you are interested in across the board. To give you an idea of just how far off the mark this can be, consider what Facebook presented as the “hobbies and interests” of an anonymous Uproxx writer.

This appears to be based largely on the posts the user shares and what they write about. But Facebook’s advertising algorithm, the backbone of its financial security, appears unable to tell the difference between a cult TV show and the insect it shares a name with. In other words, Facebook has just revealed that its vaunted knowledge of everything you do and share is crippled by the fact that it has no idea how to actually sort and apply that data. And this is a bigger problem than you might realize.
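To illustrate the kind of failure at play, here is a minimal, hypothetical sketch of a keyword-based interest tagger with no word-sense disambiguation. This is not how Facebook’s system actually works, and the category names and the “Firefly” example are invented purely to show how a single word shared by a cult show and an insect gets filed under both:

```python
# Hypothetical sketch: naive keyword matching with no word-sense disambiguation.
# Any post containing a shared keyword is filed under every category that lists it,
# regardless of context. Categories and keywords here are illustrative only.
INTEREST_KEYWORDS = {
    "Firefly (TV series)": ["firefly", "serenity"],
    "Fireflies (insects)": ["firefly", "bioluminescence"],
}

def tag_interests(post: str) -> set[str]:
    """Return every interest category whose keywords appear anywhere in the post."""
    text = post.lower()
    return {
        category
        for category, keywords in INTEREST_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    }

# A post about the TV show also gets tagged as an interest in insects.
print(tag_interests("Rewatching Firefly tonight, still the best pilot episode ever."))
# {'Firefly (TV series)', 'Fireflies (insects)'}
```

The point of the sketch is that matching words without understanding their context produces confident-looking but meaningless interest data, which is exactly the kind of error the ad preferences page exposes.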

Algorithms Vs. Humans

As we mentioned, 30% of Americans get their news from Facebook, an attention-getting number that’s changing how news is presented to us and how we consume it. True, it doesn’t matter much when an algorithm mistakes last week’s superhero movie news for this week’s, but as we’ve noted before, algorithms are only as good as the data that goes into them, and they reflect the biases and foibles of the human beings who program them. Then consider that we’re in an election year, with all the gossip and rumormongering that goes along with such an event.

To be fair to Facebook, it is in uncharted waters. Nothing like Facebook — a mixture of town hall, news site, and entertainment complex — has ever existed in human history, at least on the scale the site commands. But it’s clear that unless its algorithms improve quickly, humans should be in charge of the news you see.
