During the 2008 election, a 'vicious rumor' was spread that Barack Obama was a Muslim. That's old news. What you may not know is that the number of Americans who hold that belief nearly doubled after the election. More surprisingly, that increase happened mostly among people who are college-educated. Why would supposedly smart people believe something so ludicrous?
The answer is what Eli Pariser calls a 'Filter Bubble' (also the title of his book).
Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated. -New Republic
The filter bubble is why Netflix and Amazon know what to recommend to you. It's why Facebook seems to always show you updates that reinforce your existing viewpoints about issues like #BlackLivesMatter, Syria, Colin Kaepernick, or the Kardashians. It's why YouTube shows you ads for Muslim matrimonial sites after you watch an Islamic video, or your Netflix recommendations get messed up after your kids watch cartoons. It's why the trending topics that show on your Facebook feed can differ from your spouse's and create an uncomfortable conversation. And it's why the founder of Facebook famously said,
A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa. -Mark Zuckerberg
In short, it explains why people develop more and more extreme opinions online, and no one seems to change their minds about any issue no matter how many articles, videos, memes, or clever status updates you share.
What Exactly Is a Filter Bubble?
It's essentially an algorithm that creates a profile of who you are based on your online activity. Companies like Google and Facebook then use that profile to serve up a personalized newsfeed, search results, advertisements, and other content.
They are prediction engines, constantly creating and refining a theory of who you are and what you'll want to do next. Together, these engines create a unique universe of information for each of us - what I've come to call a filter bubble - which fundamentally alters the way we encounter ideas and information. .... Your identity shapes your media, and your media then shapes what you believe and what you care about. You click on a link, which signals an interest in something, which means you're more likely to see articles about that topic in the future, which in turn prime the topic for you. You become trapped in a you loop, and if your identity is misrepresented, strange patterns begin to emerge, like reverb from an amplifier. -Eli Pariser
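Pariser's "prediction engine" can be sketched as a tiny profile-plus-ranker. The snippet below is a hypothetical toy (the class, scores, and stories are all invented for illustration; no real platform works exactly this way): every click strengthens a topic affinity, and the feed is simply sorted by that affinity, so what you already clicked floats to the top.

```python
# Toy sketch of a personalization engine: a "profile" is just a map of
# topic -> affinity score, grown by clicks and used to rank the feed.
# All names here are illustrative, not any real platform's API.

from collections import defaultdict

class ToyFilterBubble:
    def __init__(self):
        self.affinity = defaultdict(float)  # topic -> learned interest

    def record_click(self, topic):
        # Every click strengthens the profile's belief in that interest.
        self.affinity[topic] += 1.0

    def rank_feed(self, stories):
        # stories: list of (headline, topic) pairs. Sort by learned
        # affinity, so familiar topics rise and unfamiliar ones sink.
        return sorted(stories, key=lambda s: self.affinity[s[1]], reverse=True)

engine = ToyFilterBubble()
engine.record_click("politics")
engine.record_click("politics")
engine.record_click("sports")

feed = engine.rank_feed([
    ("Local election results", "politics"),
    ("New physics discovery", "science"),
    ("Match highlights", "sports"),
])
print([headline for headline, _ in feed])
# The politics story comes first; science, never clicked, comes last.
```

Notice that nothing here is sinister in intent: the ranker is just optimizing for what you engaged with before. The bubble is an emergent side effect.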
Pariser explains this in more detail in his famous TED Talk.
The allure of the internet was that it removed the gatekeepers. Suddenly we could all be content creators and share our views. This should, theoretically, create a more empathetic and understanding society. You're no longer relying on a newspaper editor or a news producer to shape your opinions for you. In fact, this is the crux of democracy.
Democracy requires citizens to see things from one another's point of view, but instead we're more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead, we're being offered parallel but separate universes. -Eli Pariser
The myth is that the gatekeepers are gone. The reality is that they've simply been replaced by invisible ones.
It's not hard to see the consequences across issues like privacy, public health (e.g., researching through a filter bubble whether or not to vaccinate your kids), politics, personal finance, social issues, and the pursuit of religious knowledge. In this post, we'll look at some of the broader effects that contribute to those issues.
What Shapes Your Filter Bubble?
One of the greatest criticisms of these filtering algorithms is that there is no way to go retrieve your own profile. In other words, you don't know what identity they have formed about you. There are signals, though, that indicate what shapes your online profile.
These algorithms have lately been a source of controversy. This quote from Gizmodo earlier this year highlights the human element and one of the underlying problems.
In other instances, curators would inject a story—even if it wasn’t being widely discussed on Facebook—because it was deemed important for making the network look like a place where people talked about hard news. “People stopped caring about Syria,” one former curator said. “[And] if it wasn’t trending on Facebook, it would make Facebook look bad.” That same curator said the Black Lives Matter movement was also injected into Facebook’s trending news module. “Facebook got a lot of pressure about not having a trending topic for Black Lives Matter,” the individual said. “They realized it was a problem, and they boosted it in the ordering. They gave it preference over other topics. When we injected it, everyone started saying, ‘Yeah, now I’m seeing it as number one’.” This particular injection is especially noteworthy because the #BlackLivesMatter movement originated on Facebook, and the ensuing media coverage of the movement often noted its powerful social media presence. -Gizmodo
The data points that shape your profile number in the hundreds: your location, what kind of computer, web browser, and phone you use, what you search for, what you click, what you watch, what you highlight on your Kindle, who your friends are, and so on.
Here's what it looks like in action. The image below shows posts about Barack Obama and it highlights what liberal and conservative outlets are showing. You can generate similar comparisons for topics like guns, abortion, ISIS, and Donald Drumpf by visiting the Wall Street Journal's Red Feed Blue Feed.
Here's another example of the juxtaposition of two different filter bubbles.
You might be wondering where the middle ground is in all of this. The answer: the middle ground usually disappears.
Filter Bubbles and Friendships
The Prophet Muhammad (saw) said that a person can be judged by the religion of their closest friend. Online, that concept takes on a whole new level of meaning beyond just keeping good company.
We put our opinions on social media with the intent to engage (the magic word for all online interactions). Ideally, we should be sharing our opinions and understanding others' viewpoints. Some evidence suggests that most people do not change their views because of what they read on social media. Others take it a step further, unfriending people because of their views. Many think social media isn't the appropriate place to talk about these issues at all, and to put it succinctly, there's just a lot of judgment being thrown back and forth.
For somebody to get up there and run for president and say some of the things that Donald Trump has said, and to not only get media coverage but have people be enthusiastic about it, you couldn’t even imagine before. But we’re in such a divisive society now that people jump onboard these two extremes.
I remember when I was younger and worked with people in the Congress and Senate, I worked on both sides of the aisle with people I thought would make a difference—and always kept it private. But I’ll tell you, they used to get together like 15 years ago and fight like hell on the floor of the Senate and then they’d go have a beer together. They were still friends and felt their principles. Now, if you have lunch or even talk to anyone on the other side, you’re evil. How do you resolve anything when we’re that polarized? -Tony Robbins
The filter bubble creates an intellectual safe space where we retreat from ideas and perspectives at odds with our own. This diminishes our ability to see things from someone else's point of view - i.e. it kills our ability to empathize.
The news feed is designed, in Facebook’s public messaging, to “show people the stories most relevant to them” and ranks stories “so that what’s most important to each person shows up highest in their news feeds.” It is a framework built around personal connections and sharing, where value is both expressed and conferred through the concept of engagement. Of course, engagement, in one form or another, is what media businesses have always sought, and provocation has always sold news. But now the incentives are literalized in buttons and written into software. -New York Times
Part of the issue with social media is the focus on now. We scroll through our feeds quickly liking and commenting on the things we, well, like. This systematically makes us more and more entrenched into our existing viewpoints, and shapes what gets served up to us on our next visit. This is why pages focusing on spreading viral content have millions of fans. And it is why the news is no longer the news.
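That feedback loop can be made concrete with a deterministic toy model (purely illustrative, not any platform's actual ranking code): start with a slight lean, always serve the higher-scoring topic, and let the user click what they're shown.

```python
# Toy model of the "you loop": a slight initial lean, reinforced on
# every visit, compounds into near-total entrenchment.
# Purely illustrative - not any real platform's ranking code.

clicks = {"left": 6, "right": 4}  # a slight initial lean

for visit in range(500):
    # The feed leads with whichever topic the profile scores higher...
    top = max(clicks, key=clicks.get)
    # ...and the user clicks the top story, reinforcing the profile.
    clicks[top] += 1

share = clicks["left"] / (clicks["left"] + clicks["right"])
print(f"Share of 'left' clicks after 500 visits: {share:.0%}")  # climbs past 99%
```

A 60/40 lean becomes a feed that is effectively 99% one-sided, not because anyone chose that, but because each visit's output becomes the next visit's input.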
A great example of this is the Brexit vote. Many of the interviews the day after the vote showed people in shock. They simply never believed this could happen. And why would they, when every time they opened their phones, it seemed like everyone was against it? Check out this tweet from someone in the "pro-remain" camp.
This is the problem when our activism is reduced to re-sharing witty memes.
From a user’s point of view, every share, like or comment is both an act of speech and an accretive piece of a public identity. Maybe some people want to be identified among their networks as news junkies, news curators or as some sort of objective and well-informed reader. Many more people simply want to share specific beliefs, to tell people what they think or, just as important, what they don’t. A newspaper-style story or a dry, matter-of-fact headline is adequate for this purpose. But even better is a headline, or meme, that skips straight to an ideological conclusion or rebuts an argument. -New York Times
Filter Bubbles and The News
The echo chamber is reinforced not just by your friends and connections, but by mass media in general. Just as individuals do what they're incentivized to do, so do businesses (shocking). In the case of a news business, the incentive is to make money, not to inform the public. Taking the Brexit example above, it's much worse than an echo chamber; it is an alternate reality.
Because personalized filters usually have no Zoom Out function, it's easy to lose your bearings, to believe the world is a narrow island when in fact it's an immense, varied continent. -Eli Pariser
Issues important to you might not even register on anyone else's radar. The filter bubble makes it so they never have to see the issue in their feed. Take this from a different angle: what motivation would there be for a news organization to interrupt a Congresswoman speaking about personal privacy with news about Justin Bieber?
This creates a cycle in which our filter bubbles make news organizations cover issues a certain way. That rhetoric and coverage then begin to inform the political process, and in short, you end up with what we have now. The news will provide whatever the people want to consume.
"If traffic ends up guiding coverage," The Washington Post's ombudsman writes, "will The Post choose not to pursue some important stories because they're 'dull?'" Will an article about, say, child poverty ever seem hugely personally relevant to many of us...? -Eli Pariser
There are a lot of things we want to consume, and a lot of things we should consume. That's the difference between what we binge watch on Netflix versus the documentaries that have been sitting in our queue for months on end. Our bias to the present influences our actions. Important issues will catch a rush of quick publicity, like #Kony2012 or #BringBackOurGirls, and then quickly fade away.
Nuanced and deep thought cannot thrive in this environment. In his book The News: A User's Manual, Alain de Botton writes,
The financial needs of news companies mean that they cannot afford to advance ideas which wouldn't very quickly be able to find favour with enormous numbers of people. .... What levels of agreement, what suppression of idiosyncrasy and useful weirdness, will be required to render material sufficiently palatable to so many...
And when complex issues are covered, they are done so in a shallow manner.
The habit of randomly dipping readers into a brief moment in a lengthy narrative, then rapidly pulling them out again, while failing to provide any explanation of the wider context in which events have been unfolding, is precisely what occurs in the telling of many of the most important stories that run through our societies. -Alain de Botton
For the news to help us tackle these issues, it has to guide us to the problems and find ways to develop common ground for tackling them. The filter bubble, with its polarizing effect, instead incites rage. We jump from crisis to crisis. We mimic the soundbites of the talking heads on TV without any principle.
We are in danger of getting so distracted by the ever-changing agenda of the news that we wind up unable to develop political positions of any kind. We may lose track of which of the many outrages really matters to us and what it was that we felt we cared so passionately about only hours ago. At the very moment when our societies have reached a stage of unparalleled complexity, we have impatiently come to expect all substantial issues to be capable of drastic compression. -Alain de Botton
To make money, you need to get people's attention. To get their attention, you have to simplify things into basic components. When complex issues are dumbed down to the lowest common denominator (i.e., whatever draws the most traffic), people begin to expect solutions at a congruent level of simplicity. Then, when major problems cannot be solved, or others refuse to see things their way, frustration sets in. Some people take out this aggression by trolling and lobbing shame grenades.
Others respond to this frustration by shunning the news and such issues altogether. Their intellect, thought, creativity, and energy go into other pursuits such as entertainment, sports, and video games. It's simply easier to play fantasy football, watch the games, and track the stats than it is to immerse yourself in understanding something like why we have issues of systemic racism and poverty, or the roots of the Palestinian-Israeli conflict.
By design, it is difficult to grasp these subjects.
...confusing, boring, distracting the majority away from politics by presenting events in such a disorganized, fractured and intermittent way that a majority of the audience is unable to hold on to the thread of the most important issues for any length of time.
A contemporary dictator wishing to establish power would not need to do anything so obviously sinister as banning the news: he or she would only have to see to it that news organizations broadcast a flow of random-sounding bulletins, in great numbers but with little explanation of context, within an agenda that kept changing, without giving any sense of the ongoing relevance of an issue that had seemed pressing only a short while before, the whole interspersed with constant updates about the colourful antics of murderers and film stars. .... The status quo could confidently remain forever undisturbed by a flood of, rather than a ban on, news. -Alain de Botton
We change our profile pictures on Facebook to highlight the colors of a flag every few months to make it look like we're woke. It's letting ourselves get taken for a ride by someone else's dictates and, in the end, accomplishing nothing at all. It is a retreat into a carefully crafted online universe made just for you, one that defines your own reality without any context of the larger picture.
It is an axiom of political science in the United States that the only way to neutralize the influence of the newspapers is to multiply their number. -Alexis de Tocqueville
Filter Bubbles and Learning
The biggest trap of the filter bubble is that the further we get into one, the more we think we are learning in depth. We develop a sense of naive realism: we think all the information is available to us, and therefore the conclusions we draw are automatically the most informed ones.
It's like someone saying that just because they have access to all the hadith of the Prophet (s) via the internet, that they have a more informed understanding of the sunnah than scholars from the past. Access to information doesn't create understanding or insight.
Personalized filters can upset this cognitive balance between strengthening our existing ideas and acquiring new ones. First, the filter bubble surrounds us with ideas with which we're already familiar (and already agree), making us overconfident in our mental frameworks. Second, it removes from our environment some of the key prompts that make us want to learn. -Eli Pariser
The Red Feed Blue Feed example above highlights this. Consuming information built on a premise we agree with is easy and enjoyable. But consuming information that challenges us to think in new ways, Pariser notes, is frustrating and difficult. The partisan divide grows deeper and deeper. The irony is that educated people tend to consume more news in an effort to stay informed. Thus, they become mis-educated, which is why belief in the rumor about Obama grew among the college-educated. The same is true of religious partisanship. We often consume information that comes only from a certain school of thought, or only from certain speakers.
We not only form our opinions inside the filter bubble, but we become invested in them. Take sports, for example. On a close call, people can watch a replay in slow motion 100 times and still reach different conclusions about the right call. Everyone has a bias to make the call go in favor of the team they support, and even more so when they are emotionally invested in that team.
The more we form our opinions out of these bubbles and biases, the more invested we become in them, and the harder they are to change.
Experts have a lot invested in the theories they've developed to explain the world. And after a few years of working on them, they tend to see them everywhere. -Eli Pariser
A good example of this is financial analysts failing to see the housing crash coming. Or career Islamophobes, who have literally no incentive to change their minds. Why would they sit down and try to talk and empathize with a Muslim when their filter bubble only exposes them to people who are getting more and more extreme in their hate?
When I was in high school, I took part in speech and debate. One of the greatest learning experiences was that each year, we were given a topic and had to learn both sides of it. This meant that you affirmed the topic in one round and went against it in another. The 'case' you ran in support of the topic was often the same for almost an entire year. Yet, if another team ran that same case, you had to be ready to tear it apart. You were forced to learn both sides of the issue in depth.
Learning occurs when we are presented with an information gap. We have to come across something we don't know or understand. It could be engaging a co-worker or colleague on a topic and sitting and hearing what they have to say, rather than shunning the conversation and seeking refuge with like-minded friends on Facebook.
To truly learn, you need what Pariser calls a 'radical encounter.' It's the same way we wish Islamophobes would sit and talk with a Muslim and get to know us. We fail to realize, though, that we rarely do the same from our end: trying to empathize with people we disagree with or don't like. If we don't have the motivation, why do we expect it from others?
Personalization is about building an environment that consists entirely of the adjacent unknown - the sports trivia or political punctuation marks that don't really shake our schemata but feel like new information. The personalized environment is very good at answering the questions we have but not suggesting questions or problems that are out of our sight altogether. -Eli Pariser
This isn't to say we should always be seeking out the contrarian opinion to everything, but we do need a healthy dose of alternative information to better ground ourselves.
How To Fix the Filter Bubble
The most essential step is simply identifying that you have your own filter bubble.
There are some tactical steps, like what developer BJ May suggests:
- Find highly active accounts run by people who are wildly dissimilar from me, or who have had wildly dissimilar life experiences. These people must be talking frequently about the issues I hope to understand.
- I will follow one of these people every day for thirty days, and I will keep following each of them for no less than thirty days, regardless of how much I dislike what they say.
- I will not engage with the owners of any of these accounts. I will not debate them, I will not argue, I will not interact in any way apart from just reading.
- I will engage in self-study when I encounter terms or concepts that are foreign to me.
There are also some bigger-picture things that need to be done that aren't so systematic that you can put them in a checklist. We all need to seek out conversations with people who differ from us: different upbringings, backgrounds, ethnicities, and so on. Those conversations need to be intentional, with the goal of getting to know and understand someone. You can't empathize with someone if you don't understand their story.
Start making more intentional choices about what to consume. This doesn't mean that you suddenly start watching Fox News for an hour a day, but it might mean diversifying the outlets you follow online to such an extent that there is enough there to challenge you and make you think.
Lastly, we need to stop and reflect. We don't just need to diversify our consumption; we need to lessen it as well, so we can have more time for introspection.
It is never easy to be introspective. There are countless difficult truths lurking within us that investigation threatens to dislodge. It is when we are incubating particularly awkward but potentially vital ideas that we tend to feel most desperate to avoid looking inside. And that is when the news grabs us.
We should be aware of how jealous an adversary of inner examination it is - and how much further it wishes to go in this direction. Its purveyors want to put screens on our seat-backs, receivers in our watches and phones in our minds, so as to ensure that we will always be connected, always aware of what is happening; never alone.
But we will have nothing substantial to offer anyone else so long as we have not first mastered the art of being patient midwives to our own thoughts. -Alain de Botton
- Book: The Filter Bubble by Eli Pariser
- Book: The News: A User's Manual by Alain de Botton
- Inside Facebook’s (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political-Media Machine (NY Times)
- 26 Tweets That Broke My Filter Bubble
- 3 Prophetic Friendship Principles for the Social Media Age
- Social Media Activism: A Real Thing, Or A Trick We Play On Ourselves?
- #BringBackOurSanity Guide to Internet Debates
- Dua: The Greatest Casualty of a Socially Networked Life
- The Game Being Played Around You
- Bread & Circuses