Facebook is, among other things, the biggest repository of our Likes, and our dislikes, on the planet. For many, it's the primary means of curating interests, friendships, and even activities. Every day, with what we click, what we share, and what we hide and unfollow, we tell Zuckerberg & Co. what we want to see. And the company has just announced that it's taking major steps to show us more of what we want to see (changes to the News Feed algorithm are so common that a press release seems almost unnecessary at this point).
Which raises the question: is a feed that serves us only what we want to see actually a good thing?
How A Facebook News Feed Is Built
The new changes boil down to Facebook giving your friends and family more weight in your News Feed. The more you Like and interact with a person, the more of their posts Facebook will use to populate your feed, and the higher up those posts will appear. Official pages, meanwhile, will get less consideration.
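To make the idea concrete, here is a toy sketch of that kind of affinity-weighted ranking. This is purely illustrative: the scoring scheme, the weights, and the page penalty are invented for this example, and Facebook's actual algorithm is far more complex and not public.

```python
# Illustrative sketch only -- NOT Facebook's actual algorithm.
# Weights and the page penalty here are invented for the example.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_official_page: bool
    text: str

def rank_feed(posts, interactions, page_penalty=0.5):
    """Order posts by how often the viewer interacts with each author.

    `interactions` maps author -> count of the viewer's Likes/comments.
    Posts from official pages are down-weighted by `page_penalty`.
    """
    def score(post):
        affinity = interactions.get(post.author, 0)
        return affinity * (page_penalty if post.is_official_page else 1.0)
    # Highest-scoring posts land at the top of the feed.
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("NewsCo", True, "Breaking story"),
    Post("alice", False, "Vacation photos"),
    Post("bob", False, "New job!"),
]
interactions = {"alice": 12, "bob": 3, "NewsCo": 20}
feed = rank_feed(posts, interactions)
# alice (12) outranks NewsCo (20 * 0.5 = 10), which outranks bob (3)
```

Even in this simplified form, the key dynamic is visible: a page you engage with heavily can still rank below a friend you engage with less, purely because of how the weights are chosen, and the person scrolling the feed never sees those weights.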
Facebook's official post on the matter makes it difficult not to read this change as a direct reaction to the Trending News controversy of a few months ago, in which Facebook was accused by some employees of omitting stories of interest to conservative readers from the site's Trending section, and of omitting any news about Facebook itself altogether.
This caused a firestorm, not least because a shocking number of people aren't aware that Facebook curates their News Feed at all, let alone their Trending stories. A University of Michigan study of forty Facebook users found that many assumed their friends and family were hiding stories or deliberately ignoring them. Forty users is far too small a sample to extrapolate from, given that Facebook has over a billion users across borders, cultures, and belief systems, but it's fairly safe to assume these forty aren't alone in wondering why they see so little from their friends in their News Feeds.
It's especially a problem because not only do users have no knowledge of what the algorithms privilege and what they don't, they also have no control over the site itself. Facebook can't even pretend to be a democracy, and it's famously cagey about what it's doing. Without that transparency, there's simply no way to know what's going on with your News Feed, or who's deciding what you see.
The counter-argument is generally that algorithms can't be biased because they're just equations. Yet the use of algorithms to conceal information, or to present a specific take on a subject, predates the internet. In the 1960s, American Airlines debuted SABRE, an interconnected computer system that let agents sell seats not just on American Airlines routes, but on any flight connected to the network. What quickly became clear was that the system put American Airlines flights at the top of the pile, regardless of whether they were the cheapest or the best choice. This set off an antitrust fight in which, among other things, the man who led the charge to build the system admitted that its job was to sell American Airlines flights, not to find the best flights.
Does Facebook favor similar “company first” policies? We have no idea.
To be fair, the site has reversed course in a few respects. For example, news about Facebook now tends to appear in the Trending section, even when changes to the site, or the actions of founder Mark Zuckerberg, are poorly received. But the whole debate missed a rather important point: people of disparate mindsets tend to see two very different feeds when they log in. The Wall Street Journal recently built a tool called Blue Feed, Red Feed to show how very left-wing and very right-wing Facebook users see the same topics, and the differences are so glaring that they'll shock even the most jaded observer.