Facebook Allowed Us To Live In Echo Chambers This Election, It’s Up To Us To Break Out


When people feel stunned by a result, especially one that defied the forecasts, it's natural to look for explanations. In the days following Donald Trump's presidential election win, pundits have raced to point fingers at Hillary Clinton, voter turnout, rural America, and, perhaps most surprisingly, at Facebook. Some claim the election swung on the site's "trending" section, which allowed news stories of questionable veracity to spread at lightning speed.

But was Facebook flawed? Or are we just blaming a mirror because we don’t like what we see? Where, precisely, does the obligation of an algorithm end and personal responsibility begin?

Facebook As A Mirror

Really, there's no better summary of the problem at hand than John Oliver's look at 2016, which in three minutes sums up the entire argument that Facebook threw the election to Trump by telling people only what they want to hear.

The "filter bubble" theory that Oliver centers this discussion around claims that algorithms like Facebook's trending page surround us with ideas that line up with beliefs we've already expressed (or shared). Over time, as Facebook and sites like Google build a history of what you click and what you care about, your Facebook feed and your search results begin to skew toward those preferences rather than offering a neutral presentation. The Wall Street Journal's Red Feed, Blue Feed project, which puts conservative- and liberal-leaning feeds side by side, reveals a striking contrast in how Facebook presents the world based on how it perceives your political views.

Mark Zuckerberg, of course, has made it quite clear that he doesn't subscribe to the "Facebook helped Trumpism" theory. The New York Times, however, has reported that his own employees don't necessarily agree with him, especially since Facebook's response to the problem was allegedly muted by fear of conservative backlash. Some believe that, by catering to what its audience wants to hear instead of what it should hear, Facebook might have influenced the election.

Some employees are worried about the spread of racist and so-called alt-right memes across the network, according to interviews with 10 current and former Facebook employees. Others are asking whether they contributed to a “filter bubble” among users who largely interact with people who share the same beliefs.

To some extent, there is cause for concern. Facebook played host to a cottage industry of pro-Trump, proto-nationalist propaganda (though both sides shared false memes; remember the fake "Republicans are the dumbest group of voters" quote?).

What's clearly missing is a sense of personal responsibility and accountability. As we've discussed before, people are not sheep, and, even if they are, that doesn't obligate Facebook to be the shepherd. The same tools that build our filter bubbles can pop them… if we're willing to search for fresh viewpoints (or act as our own fact checkers). But who among us does that? How often do we say, "I might not have the whole story here; let me check in with the other side"?

Of course Facebook is changing how we consume news. It's estimated that nearly a third of all Americans get their news from the site, which spent the summer embroiled in a scandal over its Trending Topics section. Its attempts to make that feature more neutral have largely backfired, something the company is still struggling with as it tries to work out its role in publishing.

And it's only going to become more of a problem as Facebook continues to insert itself into publishing. It just bought the popular tool CrowdTangle, which tells publishers which of their competitors' posts are drawing attention on Facebook and elsewhere. But Facebook doesn't write the posts, and it doesn't force us to read them. We do that, and we need to take responsibility for how we view the world.

Building Echo Chambers

Like it or not, what we read is largely a mirror. All Facebook really allows us to do is fiddle with that mirror to get a more flattering, or perhaps more comforting, reflection of ourselves. It allows us to say, "This is my best angle," and then view ourselves exclusively from that vantage point. This impulse is no newer to humanity than the idea of publishing or the concept of spreading news.

We're all familiar with the phrase "A lie can travel around the world while the truth is still putting on its boots," usually attributed to Winston Churchill. Except, as if to underscore the point, Churchill never said it. Neither did Mark Twain. Jonathan Swift wrote down a version of it, but he likely borrowed it from somewhere else. Before we had an internet, before radio, before television, we were passing around unattributed quotes as memes and telling ourselves what we wanted to hear, regardless of veracity.

This is a real issue in psychology. Researchers have found that humans tend to anchor on the first piece of information they hear when forming an opinion. We then seek out information that confirms that opinion regardless of its truth-value, and we remember our judgments as sounder than they were. In many ways, who many Americans were voting for was locked in the first time CNN aired a Clinton speech or the day The Apprentice hit the air. It's not clear how much, if at all, seeking out differing perspectives would have changed that.

Technology's role in all this is to make self-congratulatory thinking (See, other people agree with me! I'm so smart!) easier, but, again, we pick the articles we click and share. We build the mirror, we choose the like-minded friends who share similar content, and our clicks shape the media we get; then we blame everyone else when we see parts of ourselves that we don't like.

Are we, as consumers of media, ready to admit that all media relies on our consumption, and that we often consume in ways that make us feel good about opinions we already hold?

Breaking Out Of Echo Chambers

The reality is that leaving the safe confines of the Facebook echo chamber starts with conversation. No one can make somebody else think the way they do or force others to agree with their opinions. But listening is a surefire way to encounter new viewpoints (unfortunately, Facebook is a platform built more for talking at people than with them). At the same time, calls for "empathy" toward people who feel differently can ring a little hollow. No one has to justify or nod along with an opinion they find repugnant.

As Facebook’s former product designer Bobby Goodlatte said on election night:

A bias towards truth isn’t an impossible goal. Wikipedia, for instance, still bends towards the truth despite a massive audience. But it’s now clear that democracy suffers if our news environment incentivizes bullshit.

In a way, Facebook has forced us to confront a divided America that years of social justice campaigning couldn’t. The site shoves the beliefs of friends and family right up under our noses. It’s much harder to write off a relative’s racism as the result of a Thanksgiving bender when they say the same things on their feed every day.

At the root of all this is a fundamental flaw of humanity: our collective desire to avoid confronting the bad and listen only to what makes us feel better, a flaw that algorithms and curated news feeds are only amplifying. Is it comforting? Sure. But it's not the truth.
