Should Facebook’s News Feed Only Tell Us What We Want To Hear?


Facebook is — among other things — the biggest repository of our Likes, and our dislikes, on the planet. For many, it's the primary means of curating interests, friendships, and even activities. Every day, with what we click, what we share, and what we hide and unfollow, we tell Zuckerberg & Co. what we want to see. And the company has just announced that it's taking major steps to show us more of what we want to see (changes to the News Feed algorithm are so common that a press release seems unnecessary at this point).

That raises the question: is feeding us only what we want to see a good thing?

How A Facebook News Feed Is Built

The new changes more or less boil down to Facebook giving your friends and family more weight in your News Feed. The more you Like and interact with a person, the more of their posts Facebook will use to populate your feed, and the higher up those posts will appear. Official pages will get less consideration.
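To make that mechanic concrete, here's a minimal, purely hypothetical Python sketch of affinity-weighted ranking. Every name in it (the interaction_counts table, the PAGE_PENALTY discount, the score function) is invented for illustration; Facebook has never published its ranking code, and the real system weighs far more signals than a single interaction count.

    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        is_page: bool      # True for an official page, False for a friend or family member
        freshness: float   # 1.0 for a brand-new post, decaying toward 0.0 as it ages

    # How often you've Liked, commented on, or otherwise interacted with each author.
    interaction_counts = {"sister": 42, "old_classmate": 3, "NewsOutlet": 80}

    PAGE_PENALTY = 0.3  # invented discount applied to official pages

    def score(post: Post) -> float:
        affinity = interaction_counts.get(post.author, 0)
        weight = affinity * (PAGE_PENALTY if post.is_page else 1.0)
        return weight * post.freshness

    feed = [
        Post("sister", is_page=False, freshness=0.8),
        Post("old_classmate", is_page=False, freshness=1.0),
        Post("NewsOutlet", is_page=True, freshness=1.0),
    ]

    # Higher scores float to the top of the feed.
    for post in sorted(feed, key=score, reverse=True):
        print(post.author, round(score(post), 1))

Even in this toy version, the person you interact with most rises to the top while the official page gets discounted, which is the gist of the change.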

Facebook's official post on the matter makes it difficult not to interpret this change as a direct reaction to the Trending News scuffle of a few months ago. Former members of the Trending team had accused Facebook of suppressing stories of interest to conservative readers in the Trending section of the site, and of keeping any news about Facebook itself out of it altogether.

This caused a firestorm — not least because a shocking number of people aren't fully aware that Facebook curates their News Feed at all, let alone their Trending stories. A University of Michigan study of forty Facebook users found that many of them assumed their friends and family were hiding stories or deliberately ignoring them. The study is too small to generalize from, as Facebook has over a billion users across borders, cultures, and belief systems, but it's fairly safe to assume these forty users aren't alone in wondering why they don't see more from their friends in their News Feeds.

It's especially a problem because not only do we have no idea what the algorithms privilege and what they don't, but users also have no control over the site itself. Facebook can't even pretend to be a democracy, and it's famously cagey about what it's doing. Without that transparency, there's simply no way to know what's going on with your News Feed, or who's deciding what you see.

The counter-argument is generally that algorithms can't be biased, because they're just equations. Yet the use of algorithms to conceal information or push a specific take on a subject dates back to well before there was an internet. In the 1960s, American Airlines debuted SABRE, a computerized reservation system that was eventually wired into travel agents' terminals, letting them sell seats not just on American Airlines routes, but on any flight connected to the network. What quickly became clear was that the system put American Airlines flights at the top of the screen, regardless of whether they were the cheapest or the best choice. The resulting antitrust scrutiny, among other things, had American's own president admit that the system's job was to sell American Airlines flights, not find the best flights.
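To see how easily a ranking rule can encode that kind of preference, here's a hypothetical sketch of display bias in a flight-listing sort; it isn't SABRE's actual logic, just an illustration of the principle. The HOUSE_CARRIER constant and the display_order key are invented for the example.

    # Hypothetical illustration of display bias: the sort key favors the "house"
    # carrier before it ever considers price, so the cheapest option need not come first.

    flights = [
        {"carrier": "Other Air",    "price": 99,  "route": "JFK-ORD"},
        {"carrier": "American",     "price": 149, "route": "JFK-ORD"},
        {"carrier": "Budget Wings", "price": 89,  "route": "JFK-ORD"},
    ]

    HOUSE_CARRIER = "American"  # the carrier the system's owner wants to sell

    def display_order(flight):
        # False sorts before True, so house-carrier flights come first; price only breaks ties.
        return (flight["carrier"] != HOUSE_CARRIER, flight["price"])

    for flight in sorted(flights, key=display_order):
        print(flight["carrier"], flight["price"])
    # The house carrier tops the screen even though two rivals are cheaper.

The bias lives entirely in the sort key; the underlying data is untouched, which is part of what makes this kind of skew so easy to miss.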

Does Facebook favor similar “company first” policies? We have no idea.

To be fair, the site has reversed course in a few respects. News about Facebook now shows up in the Trending section, for example, even when changes to the site, or the actions of founder Mark Zuckerberg, are poorly received. But the whole debate missed a rather important point: people of disparate mindsets tend to see two very different feeds when they log in. The Wall Street Journal recently built a tool called Blue Feed, Red Feed to show how very left-wing and very right-wing Facebook users see the same topics, and the differences are glaring enough to shock even the most jaded observer.

Facebook Is Not Objective, And Doesn’t Have To Be

It's easy to forget, because we don't give it money, that Facebook is a business, and we're the product. The more time we spend on our Walls, reading our feeds, and sending messages to our friends, the more data Facebook can collect about us and the more targeted ads the company can sell. So its motive is ultimately simple: keep us reading and generate ad revenue. It might want to seem egalitarian, but that illusion is part of the brand. In truth, the algorithm is tilted toward what we want to hear and what will keep us looking. It allows us to stay insulated and validated — inside what some social theorists call a filter bubble.

The downside is that, as the algorithms improve, we'll become more and more clueless about the perspectives of other people, and more and more hostile to the idea that there's any view of the world other than our own (we've all witnessed friends announcing, "It's my wall, don't argue with me!"). This myopia is dangerous. Bernie Sanders called out his own supporters for exactly this behavior, and we've seen how something as innocuous as video games can turn into a culture war with deeply entrenched sides and real consequences.

The filter bubble theory depends on the idea of human beings as sheep, and Facebook as their shepherd. That may seem true, but in reality, human beings are too complex and messy to be sheep, and Facebook's algorithms are too simplistic to herd us.

Facebook Isn’t Mind Control

Take the problem of Facebook as intellectual shepherd first. We've been talking about the company's algorithm as if it were a precise and brilliant tool that figures out exactly who you are and sells things to you. The reality is much different: it's a work in progress, and it often gets things terribly wrong. Part of the reason Facebook is giving your friends and family so much weight is that it's a fairly safe guess you'd rather see your sister's new baby than the latest news about Donald Trump or Justin Bieber. But even that depends on your personal Facebook philosophy: do you use the platform to stay connected to the people you love, or to engage with cultural events?

Facebook's algorithm can only make decisions based on what you tell it. It can spot trends, sure, and make inferences. If you follow a page or say you like a certain band, that's useful information. But people are messy, and algorithms can't clean them up. Worse, at least from an algorithmic perspective, people change over time, sometimes dramatically. And unless you tell Facebook absolutely everything, which you likely don't, it's working, at best, from a limited picture of who you are and what you want.

The more toxic part of the filter bubble equation, though, is the idea of people as mindless sheep. That's an idea with a lot of currency across the social and political spectrum (is there any side of any argument that hasn't employed "wake up, sheeple!"?). It's easier for us to think of the other side as a bunch of bleating farm animals while we're the ones with a masterful grasp of logical thinking, but the reality is that every single one of us has biases rooted deep in our brains.

For example, when we make a decision, we tend to rely on the first pieces of information we get on the topic, whether it's a brand of fruit juice or a presidential candidate, a cognitive bias called anchoring. But what if that information is proven wrong? We cling to it anyway. Our brains tend to treat our opinions as if they were physical objects, not abstractions, and proving those opinions incorrect only makes us cling to them tighter, something cognitive scientists call the backfire effect. Think you can train yourself out of this behavior? Not likely. Even scientists are so bad at being objective in their own work that there's a separate cognitive bias named for it.

We are staggeringly blind to our own biases, no matter how innocuous they may be, and our brains tend to reinforce the idea that we made good decisions, no matter how objectively bad those decisions were for us. We can't even weigh two options objectively: when we compare them, we tend to see them as more different than they really are.

All of which is to say that the idea of an equation properly navigating the maze of the human mind is, to some degree, science fiction. We might fall prey to groupthink, biases, and routinized thinking like sheep, but we're still too unruly to let a shepherd run the show. Facebook's job is to figure out what more than a billion human beings — each a giant pile of biases, experiences, and emotions — actually want to see at any given moment. A little button on a webpage can no more express or sum up a human than a toddler could perform heart surgery, no matter how many times we hammer at it or how many emoji Facebook crams into it.

The Opinions Are Still There, If We Want To Find Them

If we aren't going to be sheep, we can't shirk the very human quality of accountability. The simple truth is that Facebook not only can't be our intellectual babysitter, it shouldn't be. An algorithm can no more evaluate our beliefs and opinions and find our intellectual blind spots than it can reveal our emotional complexity through a collection of "likes." The only way to uncover the gaps in our reasoning is, essentially, to fall into them and treat the fall as a growing experience, not somebody else's fault. That's tough to do. Our brains aren't really equipped for it. But it's all we've got.

We sit in filter bubbles on social media because we choose to live in them. Facebook isn't trapping us, controlling us, or making us think things. It's simply letting us revel in our own thought processes, validating them with like-minded people and news outlets.

We’re making choices, and if we don’t like them, we do have the ability to make different ones.
