The Sunny Side of the Facebook Bubble


By Werner van Rossum

Ever since the outcome of the United States’ 2016 presidential election, my attention has been drawn to a concept introduced by news programs and other political media: the filter bubble. Because of the filter bubble, we become blind to many aspects of society as it actually is, and so we can suddenly find ourselves confronted with the real world: Trump won the election, Brexit happened, right-wing populism is on the rise across Europe, and so on. I first heard the term myself while watching The Daily Show, though that was probably not its first mention; it has become a very popular topic of analysis and debate over the last few months.

Filter-bubble blindness is the result of two things. First, an algorithm personalizes everyone’s Facebook feed and Google search results on the basis of their prior Internet activity, likes, and clicks. Second, Facebook is rapidly becoming the most popular place for people to consume news: an estimated 62% of Americans get their information about the world from their news feed. Combined, these two characteristics mean that the filter algorithm reinforces every Facebook user’s pre-existing beliefs, creating a heavily biased view of the world as a whole: thanks to the creations of Mark Zuckerberg and friends, our worldview becomes one-dimensional and polarized. Worse, the effect seems virtually inescapable for any active Internet user. That is a problem for any political endeavor, which will be misinformed and one-sided, making political debate polarized and antagonistic instead of cooperative.

I have been looking for a way to view this whole thing differently, and today I would like to present the result: I argue that the filter bubble can actually be very helpful as a detection tool for our own preconceptions. Given that preconceptions and colored worldviews are inescapable side effects of being human, we should embrace the Facebook bubble as a “cartoon version” of our own preconceptions: like a cartoon, it amplifies our most prominent features and tendencies to an extreme. Dealt with properly, I think it could help us be more nuanced, more self-reflective, and more politically communicative.

You’re in the bubble, whether you like it or not

The word “bubble” has come to describe the effect of the Facebook algorithm: a computer program decides which posts make it to your personal news feed. Once you click on something about cats, the algorithm will make sure you receive more posts about cats, as well as other posts from the same source the cats came from. At the same time, you will see less of everything else.

Applied to politics, once you start to read and like news sources that have a particular agenda, you will receive more news following the same agenda, and the multiplicity of sources will make it seem as if “this is what the world looks like.”
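The mechanism described above can be illustrated with a toy sketch. This is emphatically not Facebook’s actual algorithm (which is proprietary and far more complex); it is a minimal, hypothetical model in which every click boosts a topic’s weight, and candidate posts are ranked by those accumulated weights, so early interests crowd out everything else:

```python
from collections import defaultdict

def record_click(weights, topic, boost=1.0):
    """Increase a topic's weight each time the user engages with it."""
    weights[topic] += boost

def rank_feed(weights, posts):
    """Order candidate posts by the user's accumulated topic weights."""
    return sorted(posts, key=lambda post: weights[post["topic"]], reverse=True)

weights = defaultdict(float)
posts = [
    {"title": "Cat video", "topic": "cats"},
    {"title": "Election analysis", "topic": "politics"},
    {"title": "Gardening tips", "topic": "gardening"},
]

# Two clicks on cat content are enough to push cats to the top of the feed.
record_click(weights, "cats")
record_click(weights, "cats")
feed = rank_feed(weights, posts)
print([post["title"] for post in feed])  # "Cat video" now leads the feed
```

Even in this crude model, the feedback loop is visible: whatever you clicked first is shown more, which invites more clicks on it, which buries everything else further.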


Experiments have been carried out with liberals and conservatives, for example by swapping their news feeds, and the participants could not believe the difference between their worlds. Instead of just having an opinion, everyone now has an opinion that is reinforced by all the news sources they access and all their friends’ posts. On Facebook, you are basically in a self-confirming feedback loop of your own beliefs.

However, the word “bubble” is not exclusively Internet-related: it can describe any insulated system of beliefs, and the phenomenon is not confined to Facebook. The inescapability of a viewpoint that largely determines what you see, what you notice, and what you ignore is a fundamental point of postmodernist philosophy. Nietzsche was on to something when he famously wrote: “There are no facts, only interpretations.” More recently, psychologists have described confirmation bias: we are more inclined to believe the sorts of things that validate what we already want to believe. This bias is, of course, one of the reasons the Facebook news feed is so effective. None of this means our worldviews cannot change. It just means that we live among other people and are not as autonomous and rational in our beliefs as we might think we are (especially if we are Enlightenment-inspired, analytical philosophers).

The usefulness of our Facebook feeds

But hey: have you noticed how one-dimensional, over-the-top, simple, repetitive, and basically dumb the algorithm determining your news feed is? It seems to act directly on the first association, recommending articles about the thing I have just read an article about; it shows me every single one of my friends’ posts containing the word “gender” because I hovered over one of those articles slightly longer; it suddenly chooses to show me only pages about home electronics, simply because I once googled a vacuum cleaner.

On political subjects it seems to provide a cartoon portrait of my own preconceptions: in the same way a cartoon artist would disproportionately enlarge my nose or chin and reduce the size of my eyes, the Facebook feed amplifies my moderate convictions to grotesque extremes I no longer identify with.

And in the same way that I would have to respond to the cartoon artist by admitting that, indeed, my eyes are slightly smaller than is common, the Facebook feed indicates my tendencies in terms of conviction, “where my beliefs are going to lead,” taken to extremes. There is, of course, a difference: I will have to live with the face I was born with, but beliefs I can change consciously, and tendencies I can work on.

Viewed this way, my Facebook feed can help me direct attention to certain aspects of myself. It does not necessarily help me in a direct sense, but it forces me to self-reflect and make up my mind: these feminist posts are starting to sound quite aggressive, and do I really have 9/11-truther-like tendencies, given that this conspiracy-theory article just popped up?

In this sense, Facebook provides us with a detection mechanism, a constant and tangible reminder of our own preconceptions. The algorithms at work are not subtle or ‘intelligent’ in the sense that you fail to notice them. In fact, their effects are very obvious, and so they draw our attention to what we like and think, often in a more exaggerated form than we perceive in ourselves. This dissonance between our ‘real’ conceptions and the cartoonish Facebook version of them can be very helpful in making our own inclinations and preconceptions visible and tangible, and thereby making them objects of analysis and change. It facilitates self-criticism in a new way.



We have to be willing to confront ourselves, too

I started this article promising that the Facebook news feed could help us be more nuanced, more self-reflective, and more politically communicative. Of course, it will not do so by itself: the blindness described in all those explainer videos about the bubble is real, and Trump did become president of the United States. We will have to work for it. In my opinion, it is important to always be aware of two things:

1. When we are on Facebook, WE are on Facebook: what we see is not the world, but ourselves.

2. We will always, inevitably, be biased, no matter how hard we try. In fact, the conviction that one is not biased is probably the worst bias of all.

In short: we will have to be eager to grow, learn, listen, and take positive action while at the same time willing to confront ourselves with our own biases and misinformed convictions. Of course this is not an easy task. But judging from the fact that this article made it to your news feed, I think we might have a shot.

