Facebook Allowed Us To Live In Echo Chambers This Election, It’s Up To Us To Break Out

11.14.16


When people feel stunned by a result, especially one that defied the statistical forecasts, it’s natural to look for explanations. In the days following Donald Trump’s presidential election win, pundits have raced to point fingers at Hillary Clinton, voter turnout, rural America, and, perhaps most surprisingly, at Facebook. Some claim the election swung on the site’s “trending” section, which allowed news stories of questionable veracity to spread at lightning speed.

But was Facebook flawed? Or are we just blaming a mirror because we don’t like what we see? Where, precisely, does the obligation of an algorithm end and personal responsibility begin?

Facebook As A Mirror

Really, there’s no better summary of the problem at hand than John Oliver’s look at 2016, which in three minutes lays out the entire argument that Facebook threw the election to Trump by telling people only what they want to hear.


The “filter bubble” theory at the center of Oliver’s discussion holds that algorithms like Facebook’s trending page surround us with ideas that align with our previously stated (and shared) beliefs. Over time, as Facebook and sites like Google build a history of what you click and what you care about, your feed and your search results begin to skew toward those interests rather than offering a neutral presentation. The Wall Street Journal’s Red Feed, Blue Feed project, which puts conservative- and liberal-leaning feeds side by side, reveals a striking contrast in how Facebook presents the world based on how it perceives your political views.
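To make the mechanism concrete, here is a minimal sketch of how ranking stories by past engagement produces that skew. This is a toy illustration with made-up topic tags, not Facebook’s actual ranking system, which is proprietary and far more complex:

```python
from collections import Counter

def rank_feed(stories, click_history):
    """Order stories by how often the reader has clicked that topic before.

    stories: list of (headline, topic) tuples
    click_history: list of topics the reader previously engaged with
    """
    # Count prior engagement per topic; topics never clicked score zero.
    engagement = Counter(click_history)
    return sorted(stories, key=lambda s: engagement[s[1]], reverse=True)

# Hypothetical reader who mostly clicks one side's political stories
history = ["politics-left", "politics-left", "politics-left", "sports"]
stories = [
    ("Candidate A surges in the polls", "politics-left"),
    ("Candidate B surges in the polls", "politics-right"),
    ("Local team wins the title", "sports"),
]

for headline, topic in rank_feed(stories, history):
    print(headline)
# Stories matching past clicks float to the top, the opposing view sinks,
# and each new click reinforces the skew on the next refresh.
```

Even this crude version shows the feedback loop: the more you click one kind of story, the more of it you see, and the less chance a contrary story has of reaching you at all.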
