How Invisible Filter Bubbles Shape Your Social, Political, and Religious Views #FiqhOfSocialMedia


During the 2008 election, a 'vicious rumor' was spread that Barack Obama was a Muslim. That's old news. What you may not know is that the number of Americans who hold that belief nearly doubled after the election. More surprisingly, that increase happened mostly among people who are college-educated. Why would supposedly smart people believe something so ludicrous?

The answer is what Eli Pariser calls a 'Filter Bubble' (also the title of his book).

Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated. -New Republic

The filter bubble is why Netflix and Amazon know what to recommend to you. It's why Facebook always seems to show you updates that reinforce your existing viewpoints about issues like #BlackLivesMatter, Syria, Colin Kaepernick, or the Kardashians. It's why YouTube shows you ads for Muslim matrimonial sites after you watch an Islamic video, or your Netflix recommendations get messed up after your kids watch cartoons. It's why the trending topics that show up in your Facebook feed can differ from your spouse's and create an uncomfortable conversation. And it's why the founder of Facebook famously said,

A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa. -Mark Zuckerberg

In short, it explains why people develop more and more extreme opinions online, and no one seems to change their minds about any issue no matter how many articles, videos, memes, or clever status updates you share.

What Exactly Is a Filter Bubble?

It's essentially an algorithm that creates a profile of who you are based on your online activity. Companies like Google and Facebook then use that profile to serve up a personalized newsfeed, search results, advertisements, and other content.
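As a toy illustration of that idea, here is a minimal, hypothetical sketch of profile-based personalization. The topic names and scoring are invented for illustration and are not any real company's system; the point is only the shape of the loop, where every click strengthens a profile and the profile then ranks the next feed:

```python
from collections import Counter

profile = Counter()  # topic -> interest weight, built up from activity

def record_click(topics):
    """Each click strengthens the profile for that item's topics."""
    for topic in topics:
        profile[topic] += 1

def rank_feed(items):
    """Items whose topics match the profile float to the top."""
    def score(item):
        return sum(profile[t] for t in item["topics"])
    return sorted(items, key=score, reverse=True)

# Two clicks on politics, one on sports...
record_click(["politics"])
record_click(["politics"])
record_click(["sports"])

# ...and the next feed is already reordered to match.
feed = rank_feed([
    {"title": "Election analysis", "topics": ["politics"]},
    {"title": "Cooking tips",      "topics": ["food"]},
    {"title": "Game recap",        "topics": ["sports"]},
])
print([item["title"] for item in feed])
# The politics story ranks first; the food story, which the profile
# knows nothing about, sinks to the bottom.
```

Even in this tiny sketch you can see the trap: topics you never clicked on have a score of zero, so the feed quietly stops surfacing them at all.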

They are prediction engines, constantly creating and refining a theory of who you are and what you'll want to do next. Together, these engines create a unique universe of information for each of us - what I've come to call a filter bubble - which fundamentally alters the way we encounter ideas and information. .... Your identity shapes your media, and your media then shapes what you believe and what you care about. You click on a link, which signals an interest in something, which means you're more likely to see articles about that topic in the future, which in turn prime the topic for you. You become trapped in a you loop, and if your identity is misrepresented, strange patterns begin to emerge, like reverb from an amplifier. -Eli Pariser

Pariser explains this in more detail in his famous Ted Talk.

The allure of the internet was that it removed the gatekeepers. Suddenly we could all be content creators and share our views. This should, theoretically, create a more empathetic and understanding society. You're no longer relying on a newspaper editor or a news producer to shape your opinions for you. In fact, this is the crux of democracy.

Democracy requires citizens to see things from one another's point of view, but instead we're more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead, we're being offered parallel but separate universes. -Eli Pariser

The myth is that the gatekeepers are gone. The reality is that they've simply been replaced by invisible ones.

It's not hard to see the numerous consequences, spanning issues like privacy, public health (e.g. researching whether or not to vaccinate your kids through a filter bubble), politics, financial problems, social issues, and seeking religious knowledge. In this post, we'll look at some of the broader effects that contribute to those issues.

What Shapes Your Filter Bubble?

One of the greatest criticisms of these filtering algorithms is that you cannot go somewhere and retrieve your own profile. In other words, you don't know what identity they have formed about you. There are signals, though, that indicate what shapes your online profile.


These algorithms have been the source of controversy as of late.

This quote from Gizmodo from earlier this year highlights the human element and one of the underlying problems.

In other instances, curators would inject a story—even if it wasn’t being widely discussed on Facebook—because it was deemed important for making the network look like a place where people talked about hard news. “People stopped caring about Syria,” one former curator said. “[And] if it wasn’t trending on Facebook, it would make Facebook look bad.” That same curator said the Black Lives Matter movement was also injected into Facebook’s trending news module. “Facebook got a lot of pressure about not having a trending topic for Black Lives Matter,” the individual said. “They realized it was a problem, and they boosted it in the ordering. They gave it preference over other topics. When we injected it, everyone started saying, ‘Yeah, now I’m seeing it as number one’.” This particular injection is especially noteworthy because the #BlackLivesMatter movement originated on Facebook, and the ensuing media coverage of the movement often noted its powerful social media presence. -Gizmodo

The data points that shape your profile number in the hundreds: your location, what kind of computer, web browser, and phone you use, what you search, what you click, what you watch, what you highlight on your Kindle, who your friends are, and so on.

Here's what it looks like in action. The image below shows posts about Barack Obama, highlighting what liberal and conservative outlets are showing. You can generate similar comparisons for topics like guns, abortion, ISIS, and Donald Trump by visiting the Wall Street Journal's Red Feed Blue Feed.



Here's another example of the juxtaposition of two different filter bubbles.


You might be wondering where the middle ground is in all of this. The answer: the middle ground usually disappears.

Filter Bubbles and Friendships

The Prophet Muhammad (saw) said a person can be judged by the religion of their closest friend. That concept takes on a whole new level of meaning beyond just keeping good company online.

We put our opinions on social media with the intent to engage (the magic word for all online interactions). Ideally, we should be sharing our opinions and understanding others' viewpoints. Some evidence suggests, however, that most people do not change their views because of what they read on social media. Others take it a step further, unfriending people because of their views. Many think social media isn't the appropriate place to talk about these issues at all, and to put it succinctly, there's just a lot of judgment being thrown back and forth.

For somebody to get up there and run for president and say some of the things that Donald Trump has said, and to not only get media coverage but have people be enthusiastic about it, you couldn’t even imagine before. But we’re in such a divisive society now that people jump onboard these two extremes.

I remember when I was younger and worked with people in the Congress and Senate, I worked on both sides of the aisle with people I thought would make a difference—and always kept it private. But I’ll tell you, they used to get together like 15 years ago and fight like hell on the floor of the Senate and then they’d go have a beer together. They were still friends and felt their principles. Now, if you have lunch or even talk to anyone on the other side, you’re evil. How do you resolve anything when we’re that polarized? -Tony Robbins

The filter bubble creates an intellectual safe space where we retreat from ideas and perspectives at odds with our own. This diminishes our ability to see things from someone else's point of view - i.e. it kills our ability to empathize.

The news feed is designed, in Facebook’s public messaging, to “show people the stories most relevant to them” and ranks stories “so that what’s most important to each person shows up highest in their news feeds.” It is a framework built around personal connections and sharing, where value is both expressed and conferred through the concept of engagement. Of course, engagement, in one form or another, is what media businesses have always sought, and provocation has always sold news. But now the incentives are literalized in buttons and written into software. -New York Times

Part of the issue with social media is the focus on now. We scroll through our feeds quickly liking and commenting on the things we, well, like. This systematically makes us more and more entrenched into our existing viewpoints, and shapes what gets served up to us on our next visit. This is why pages focusing on spreading viral content have millions of fans. And it is why the news is no longer the news.
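That entrenchment is a feedback loop you can sketch in a few lines. In this toy simulation (all weights invented for illustration), the feed always leads with the user's current favorite topic and each engagement strengthens it, so a tiny initial preference snowballs while everything else stagnates:

```python
# Toy "you loop": the feed leads with what you engaged with before,
# you engage again, and the gap widens. All numbers are illustrative.
weights = {"politics": 1.2, "sports": 1.0, "culture": 1.0}

for visit in range(20):
    top = max(weights, key=weights.get)   # feed leads with your favorite
    weights[top] += 0.5                   # you like/comment; it strengthens

print(weights)
# politics has run away with the feed; sports and culture,
# starting only slightly behind, never got another chance
```

A 0.2 difference in starting interest is all it takes; after twenty visits the other topics are effectively invisible.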

A great example of this is the Brexit vote. Many of the interviews the day after the vote showed people in shock. They simply never believed it could happen. And why would they, when every time they opened their phones, it seemed like everyone was against it? Check out this tweet from someone in the "pro-remain" camp.

This is the problem when our activism is reduced to re-sharing witty memes.

From a user’s point of view, every share, like or comment is both an act of speech and an accretive piece of a public identity. Maybe some people want to be identified among their networks as news junkies, news curators or as some sort of objective and well-informed reader. Many more people simply want to share specific beliefs, to tell people what they think or, just as important, what they don’t. A newspaper-style story or a dry, matter-of-fact headline is adequate for this purpose. But even better is a headline, or meme, that skips straight to an ideological conclusion or rebuts an argument. -New York Times

Filter Bubbles and The News

The echo chamber is reinforced not just by your friends and connections, but by mass media in general. Just as individuals do what they're incentivized to do, so do businesses (shocking). In the case of a media business, the incentive is to make money - not to inform the public. Returning to the Brexit example above, it's much worse than an echo chamber - it is an alternate reality.

Because personalized filters usually have no Zoom Out function, it's easy to lose your bearings, to believe the world is a narrow island when in fact it's an immense, varied continent. -Eli Pariser

Issues important to you might not even register on anyone else's radar; the filter bubble ensures they never have to see those issues in their feed. Take this from a different angle: what motivation would a news organization have to interrupt a Congresswoman speaking about personal privacy for news about Justin Bieber?


This creates a cycle in which our filter bubbles make news organizations cover issues a certain way. That rhetoric and coverage then begin to inform the political process, and in short, you end up with what we have now. The news will provide whatever the people want to consume.

"If traffic ends up guiding coverage," The Washington Post's ombudsman writes, "will The Post choose not to pursue some important stories because they're 'dull?'" Will an article about, say, child poverty ever seem hugely personally relevant to many of us...? -Eli Pariser

There are a lot of things we want to consume, and a lot of things we should consume. That's the difference between what we binge watch on Netflix versus the documentaries that have been sitting in our queue for months on end. Our bias to the present influences our actions. Important issues will catch a rush of quick publicity, like #Kony2012 or #BringBackOurGirls, and then quickly fade away.

Nuanced and deep thought cannot thrive in this environment. In his book The News: A User's Manual, Alain de Botton writes,

The financial needs of news companies mean that they cannot afford to advance ideas which wouldn't very quickly be able to find favour with enormous numbers of people. .... What levels of agreement, what suppression of idiosyncrasy and useful weirdness, will be required to render material sufficiently palatable to so many...

And when complex issues are covered, they are done so in a shallow manner.

The habit of randomly dipping readers into a brief moment in a lengthy narrative, then rapidly pulling them out again, while failing to provide any explanation of the wider context in which events have been unfolding, is precisely what occurs in the telling of many of the most important stories that run through our societies. -Alain de Botton

For the news to help us tackle these issues, it has to help guide us to the problems, and find ways to develop a common ground to tackle them. The filter bubble, by creating that polarizing effect, instead incites rage. We jump from crisis to crisis. We mimic the same soundbites as the talking heads on TV without any principle.

We are in danger of getting so distracted by the ever-changing agenda of the news that we wind up unable to develop political positions of any kind. We may lose track of which of the many outrages really matters to us and what it was that we felt we cared so passionately about only hours ago. At the very moment when our societies have reached a stage of unparalleled complexity, we have impatiently come to expect all substantial issues to be capable of drastic compression. -Alain de Botton

To make money, you need to get people's attention. To get their attention, you have to simplify things into basic components. By taking complex issues and dumbing them down to the lowest common denominator (i.e. whatever draws the most traffic), people begin to expect solutions at a congruent level of simplicity. Then, when major problems cannot be solved, or others refuse to see things their way, frustration sets in. Some people take out this aggression through trolling and shame grenades.

Others respond to this frustration by shunning the news and such issues altogether. Their intellect, thought, creativity, and energy go into other pursuits such as entertainment, sports, and video games. It's simply easier to play fantasy football, watch the games, and track the stats than it is to immerse yourself in understanding something like why we have issues of systemic racism and poverty, or the roots of the Palestinian-Israeli conflict.

By design, it is difficult to grasp these subjects.

...confusing, boring, distracting the majority away from politics by presenting events in such a disorganized, fractured and intermittent way that a majority of the audience is unable to hold on to the thread of the most important issues for any length of time.

A contemporary dictator wishing to establish power would not need to do anything so obviously sinister as banning the news: he or she would only have to see to it that news organizations broadcast a flow of random-sounding bulletins, in great numbers but with little explanation of context, within an agenda that kept changing, without giving any sense of the ongoing relevance of an issue that had seemed pressing only a short while before, the whole interspersed with constant updates about the colourful antics of murderers and film stars. .... The status quo could confidently remain forever undisturbed by a flood of, rather than a ban on, news. -Alain de Botton

We change our profile pictures on Facebook to the colors of a flag every few months to make it look like we're woke. It's letting yourself get taken for a ride by someone else's dictates while accomplishing nothing at all. It is retreating into a carefully crafted online universe made just for you, one that defines your own reality without any context of the larger picture.

It is an axiom of political science in the United States that the only way to neutralize the influence of the newspapers is to multiply their number. -Alexis de Tocqueville

Filter Bubbles and Learning

The biggest trap of the filter bubble is that the deeper we get into one, the more we think we are learning in depth. In other words, we fall into a kind of naive realism: we think all the information is available to us, and therefore the conclusions we draw are automatically the most informed ones.

It's like someone saying that just because they have access to all the hadith of the Prophet (s) via the internet, that they have a more informed understanding of the sunnah than scholars from the past. Access to information doesn't create understanding or insight.

Personalized filters can upset this cognitive balance between strengthening our existing ideas and acquiring new ones. First, the filter bubble surrounds us with ideas with which we're already familiar (and already agree), making us overconfident in our mental frameworks. Second, it removes from our environment some of the key prompts that make us want to learn. -Eli Pariser

The red feed blue feed example above highlights this. Consuming information built on a premise we agree with is easy and enjoyable. But consuming information that challenges us to think in new ways, Pariser notes, is frustrating and difficult. The partisan divide grows deeper and deeper. The irony is that educated people tend to consume more news in an effort to stay informed. Thus, they become mis-educated - explaining why belief in the Obama rumor nearly doubled among the college-educated. The same is true of religious partisanship: we often consume information that comes only from a certain school of thought, or only from certain speakers.

We not only form our opinions from the filter bubble, but we become invested in them. Take sports, for example. On a close call, people can watch a replay in slow motion a hundred times and still reach different conclusions about the right call. Everyone has a bias toward making the call go in favor of the team they support - even more so when they are emotionally invested in that team.

The more we formulate our opinions out of these bubbles and biases, the more invested we become in them. That makes it that much harder to change.

Experts have a lot invested in the theories they've developed to explain the world. And after a few years of working on them, they tend to see them everywhere. -Eli Pariser

A good example of this is stock analysts not being able to identify the oncoming housing crash. Or career Islamophobes who have literally no incentive to change their mind. Why would they sit down and try to talk and empathize with a Muslim when their filter bubble only exposes them to people who are getting more and more extreme in their hate?

When I was in high school, I took part in speech and debate. One of the greatest learning experiences was that each year, we were given a topic and had to learn both sides of it. You affirmed the topic in one round and argued against it in the next. The 'case' you ran in support of the topic often stayed the same for almost an entire year - yet if another team ran that same case, you had to be ready to tear it apart. You were forced to learn both sides of the issue in depth.

Learning occurs when we are presented with an information gap - we have to come across something we don't know or understand. It could be engaging a co-worker or colleague on a topic and sitting to hear what they have to say, rather than shunning the conversation and seeking refuge with like-minded friends on Facebook.

To truly learn, you need what Pariser calls a 'radical encounter.' It's the same way we wish Islamophobes would sit and talk with a Muslim and get to know us. We fail to realize, though, that we rarely do the same from our end and try to empathize with people we disagree with or don't like. If we don't have the motivation, why do we expect it from others?

Personalization is about building an environment that consists entirely of the adjacent unknown - the sports trivia or political punctuation marks that don't really shake our schemata but feel like new information. The personalized environment is very good at answering the questions we have but not suggesting questions or problems that are out of our sight altogether. -Eli Pariser

This isn't to say we should always be seeking out the contrarian opinion to everything, but we do need a healthy dose of alternative information to better ground ourselves.

How To Fix the Filter Bubble

The most essential step is simply identifying that you have your own filter bubble.

There are some tactical steps, like what developer BJ May suggests:

  • Find highly active accounts run by people who are wildly dissimilar from me, or who have had wildly dissimilar life experiences. These people must be talking frequently about the issues I hope to understand.
  • I will follow one of these people every day for thirty days, and I will keep following each of them for no less than thirty days, regardless of how much I dislike what they say.
  • I will not engage with the owners of any of these accounts. I will not debate them, I will not argue, I will not interact in any way apart from just reading.
  • I will engage in self-study when I encounter terms or concepts that are foreign to me.

There are also some bigger-picture things that need to be done that aren't so systematic that you can put them in a checklist. We all need to seek out conversations with people who differ from us - different upbringings, backgrounds, ethnicities, and so on. Those conversations need to be intentional, with the aim of getting to know and understand someone. You can't empathize with someone if you don't understand their story.

Start making more intentional choices about what to consume. This doesn't mean that you suddenly start watching Fox News for an hour a day, but it might mean diversifying the outlets you follow online to such an extent that there is enough there to challenge you and make you think.

Lastly, we need to stop and reflect. We don't need to just diversify our consumption, but we need to lessen it as well so we can have more time for introspection.

It is never easy to be introspective. There are countless difficult truths lurking within us that investigation threatens to dislodge. It is when we are incubating particularly awkward but potentially vital ideas that we tend to feel most desperate to avoid looking inside. And that is when the news grabs us.

We should be aware of how jealous an adversary of inner examination it is - and how much further it wishes to go in this direction. Its purveyors want to put screens on our seat-backs, receivers in our watches and phones in our minds, so as to ensure that we will always be connected, always aware of what is happening; never alone.

But we will have nothing substantial to offer anyone else so long as we have not first mastered the art of being patient midwives to our own thoughts.  -Alain de Botton

Recommended Reading

The System I Use to Combat FOMO Without Thinking About It


FOMO [Fear of Missing Out] has been one of my biggest ailments in using not just social media - but technology in general. I have to read every single email. When they shut down Google Reader, I spent days figuring out which RSS reader to use  - and then another day subscribing to all the blogs I wanted to follow. Taking a break from work to check Twitter is a common occurrence. Seeing something interesting and getting sucked into a rabbit hole of googling things related to it and watching YouTube videos is also embarrassingly common.

It becomes like your phone. You are doing something useful or productive, but then your time gets stolen by notifications. Someone emails you, then someone texts you, then someone Facebook messages you - it never ends. You get busy being busy and accomplish nothing. Reading things online is the same. There's lots of seemingly valuable and interesting content - but reading that might distract from something else you should be reading that is a better use of time.

So how do you fix it?

Most people try to rely on willpower. I'm going to convince myself not to click on that link and read it. The more click-bait Buzzfeed links you see, the harder this becomes. You start to think you really are missing out on some secret tip, and you won't be able to sleep at night because you'll be up wondering exactly which Friends character you are.

If willpower won't work, you need a system. A system of catching interesting links, saving them, and reading them at an appropriate time. Here is how I handle my incoming stream of information, and how I keep FOMO from running my life.


Email

When there is a lengthy email that you really want to read, but it keeps getting buried in your inbox, simply boomerang it (using a plugin like Boomerang for Gmail). This tells the email to go away and come back after a week or two. When it comes back, you have a clearer head and can decide whether you're really going to read it. 80% of the time I just archive it and no longer feel bad about it.

Social Media

Tons of cool stuff comes across from Facebook and Twitter. I use Pocket to track all of it. Your iPhone lets you set up Pocket as something you can officially 'share' to. So anytime there is a link in Twitter, I just automatically send the link to Pocket [use Tweetbot to make this super simple].

For links on Facebook, I simply copy them to the clipboard, and open the Pocket app and it will save the link.

#ProTip - I use Pocket to keep track of YouTube videos I want to watch as well.


Blogs

I use Feedly to follow blogs of interest. I check it probably every two weeks or so. If I see an article that looks interesting, I immediately send it to Pocket.

The benefit of checking it infrequently is that it lets you see which blogs are producing a lot of material you don't read. Then you can safely unsubscribe from them.

Pocket Itself

I'll open up Pocket about 2-3 times a month. By that point I have usually saved upwards of 50 articles, videos, and other cool things that I wanted to read and investigate (and get sucked into a rabbit hole of more information). With Pocket looking so overwhelming, I quickly run through it and assess: what here is actually important? And what am I actually going to read?

This lets me archive out the vast majority of articles, and I read only a select few.

After a month of using this system, I had successfully overcome my FOMO, kept my productive time focused where it needed to be, and still got to check out some of those cool, interesting things. It's just that those cool, interesting things no longer interrupted the important ones.

The Game Being Played Around You (And that Qur'an Burning Fiasco)


There are two things I came across recently that completely rocked my world in terms of how I understand my interaction with the internet and social media in general.

"Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, 'Why is this so important?' And Zuckerberg said, 'A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.' And I want to talk about what a Web based on that idea of relevance might look like."

That's the opening to this powerful, 9 minute, Ted Talk about 'online filter bubbles' by Eli Pariser.

I've done some freelance social media consulting on the side, and one of the most frustrating things has been figuring out Facebook. I don't mean the basics, but I mean understanding what Facebook does behind the scenes that makes your post be seen by a fan.

See, on Twitter, when you follow someone, you automatically see everything they post. That's intuitively how it would work. Facebook was kind of the same way. If you "liked" a page, or "friended" someone - you would expect to see their updates. After all, you voluntarily chose to follow them.

But Facebook decided to implement an algorithm to decide for you whether a post was important or not. A number of factors go into this: how many other people liked, commented on, or shared it, and how many of those people are friends with you. The only way around this is for the poster to pay Facebook to 'boost' the post (one obvious motivation for mucking with the system so much).
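Facebook's early version of this ranking was publicly described as "EdgeRank," roughly affinity × interaction weight × time decay. Here is a hedged sketch of that shape - the coefficients and half-life are invented for illustration, and the real system uses hundreds of non-public signals:

```python
# EdgeRank-style score: affinity * interaction weight * time decay.
# The numbers here are made up; only the shape of the formula is real.
TYPE_WEIGHTS = {"share": 3.0, "comment": 2.0, "like": 1.0}

def edge_score(affinity, interaction, age_hours, half_life=24.0):
    decay = 0.5 ** (age_hours / half_life)   # older stories fade out
    return affinity * TYPE_WEIGHTS[interaction] * decay

# A close friend's fresh comment easily outranks a stranger's stale like,
# which is why your feed fills up with people you already engage with.
close_friend = edge_score(affinity=0.9, interaction="comment", age_hours=2)
stranger = edge_score(affinity=0.1, interaction="like", age_hours=48)
print(close_friend > stranger)  # True
```

Notice the design consequence: because affinity multiplies everything else, the people and pages you never interact with trend toward a score of zero, no matter what they post.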

Google does something similar. If you and a friend both search the same query, chances are you will receive completely different results. It tries to be smart, factoring in things like your location, whether you're on a mobile device, and so on.

Naturally, as you start to consume one type of content, or prefer one viewpoint over another, these algorithms start serving up content that you agree with.

"So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, 'It will be very hard for people to watch or consume something that has not in some sense been tailored for them.' .... In a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important..."

This point made me think, but it wasn't until I read Trust Me I'm Lying by Ryan Holiday that it really hit home at just how deep the manipulation game around us is. Ryan has worked with a number of best-selling authors (like Tim Ferriss), and was the director of marketing for clothing brand American Apparel.

In this book he gives a behind the scenes look at the economics of the blogging industry. This is important because he illustrates how the financial bottom line causes reporting that is unethical, exaggerated, unverified, and sometimes simply fabricated.

One of the examples he used to illustrate this was something most (if not all) of us are familiar with. That crazy guy, Terry Jones, who wanted to do the Qur'an burning. Jones put up billboards in front of his church, and got covered by a small local paper. This was picked up by a small website (Religion News Service). Yahoo then linked to the short article, and in turn a number of other blogs started picking it up. As the story quickly moved up the chain, CNN decided it needed to jump in and interview Jones. The media finally came to its senses, realizing that airing such a video would have potential consequences and decided not to air the video. Jones was under pressure and backed down. Crisis averted.

A few months later, he decided to try again. The media threatened a blackout, but he went ahead with the Qur'an burning. About 20 people attended, and no one covered the story - except one freelance reporter from Agence France Presse. Since AFP is syndicated on Google and Yahoo, the story started to gain publicity. As more and more blogs linked to it, the story got too big to ignore. Now everyone had to comment on it. Within days, 27 people were killed in the ensuing riots in Afghanistan - a very real, deadly consequence of Journalism 2.0.

If you've ever been on a news website, you've no doubt been drawn to their 'most emailed' articles list. If you've ever visited Buzzfeed or Upworthy, you notice an overt formula at play with the headlines and article structure. You've seen lists of things split up into annoying slideshows. These are not by accident. Trying to go viral (and get views, traffic, and revenue) is the name of the game. It's the only way to stay alive.

When that is the motivator, people no longer have an incentive to present something valuable, authentic, or even challenging. Instead you get fake news. The best way to explain it is by watching this clip (I hate South Park, but it is unbelievably illustrative of what actually goes on). In a world of shock-and-awe viral content, we stop seeking honesty and reality. It's simply too boring.

Holiday goes on to show how a lot of these root elements end up causing things like snarkiness, character assassinations (ever seen that happen to an Islamic scholar online?), and online vigilantism and mob justice.

There used to be a time when we trusted journalism. If something was reported on CNN, or in the New York Times or Time magazine, we assumed there was journalistic integrity behind it. There was fact-checking, source-checking, and editorial oversight. We can no longer make these assumptions. In fact, the yellow journalism of old has simply been reincarnated as the online information diet we each consume daily.

We're sometimes a bit too trusting of everything we see online. We're quick to read and share things without really verifying what happened - but by then it's too late in this ultra-fast information age.

A little bit of pause, caution, and healthy skepticism can go a long way as we peruse online content.

Imagine seeing debates on Facebook, and then, as you 'like' the viewpoints you agree with, Facebook stops showing you the opposing viewpoints - whether the debate is Islamic, political, or even about sports.

Watch the Ted talk and let me know your thoughts. If you can read the book, I highly recommend it (I consider it a 5 star book).

What do we need to teach, particularly from a faith perspective, about understanding this online beast? What issues related to this do you want to understand in more detail, and what type of material would you find helpful? 

You can subscribe to the email list to send me your thoughts or Tweet me.