Let’s play a game! Everyone likes movies, so that will be the topic, and we shall name it… Three movies, one common factor. The rules are simple: I give you three quotes from three famous classics, and your task is to name one thing they all have in common. Okay? Ready, set, go!

“Luke, I am your father”

“Run, you fools”

“Mirror, Mirror on the wall, who’s the fairest of them all?”
So what’s your answer? I’m sure there are countless common factors; imagination is the only limit. However, there is one in particular that I had in mind: all the above movie quotes are incorrect. Darth Vader never says “Luke, I am your father”; Gandalf never says “Run, you fools”; and the Evil Queen never says “Mirror, Mirror on the wall, who’s the fairest of them all?”. The actual quotes are: “No, I am your father“; “Fly, you fools“; and “Magic mirror on the wall, who’s the fairest of them all?“.
They are all widely known, but somehow most of us got them wrong. This specific phenomenon is known as The Mandela Effect — a false memory shared among multiple people. However, my plan is not to discuss the Mandela Effect itself, but rather how today’s information society may contribute to misrepresentations. How is it possible that so many of us share the same incorrect representation of reality?
Quality?
As touched upon earlier, our attention and ability to digest information are limited, but our access and exposure to information are unlimited. Even the number of information sources is too great for us to cope with. In other words, in order to not drown in the sea of information, we need to filter the sources from which we tap our knowledge. To do this, one might think that the quality of information should matter most. But it turns out that one of the most influential factors determining our sources is accessibility. And which platforms of information are the most accessible? Yes, that’s right: Twitter, Facebook, YouTube, etc. Worldwide, almost monopoly-like adoption makes these platforms extremely accessible. In addition, advanced technology — AI algorithms designed to accurately predict which content is best to serve which user — makes information from these sources super-easy to consume and digest. But as discussed earlier, since the ultimate goal is to attract our attention, the algorithms prioritize controversial, emotion-triggering, extreme content over quality content[1].
You become what you consume
We all have a worldview — inner mental representations of the outer physical world we live in. Every single experience we have gone through, all the things heard, read, seen, and felt, has contributed to the current state of our personal worldview. And it is with the help of these inner representations that we navigate the world every day. When advising a friend, expressing our opinions, even when considering which golf club is most suitable for our next stroke — in everything we do, our inner representations come into play.
Now consider how large a chunk of people’s lives is experienced online nowadays. We digest the majority of our daily content online — content, remember, that tends to be served from the controversial menu. By definition, controversial opinions are held by a minority of the population, so normally, one ought to encounter such content less frequently than opinions that reflect the general consensus. However, that will not be the case if most of the information we consume daily comes from the controversial menu.
Imagine the following: You are walking across the public square, and right in the middle of the crowd someone with, let us say, extreme political inclinations is standing with a megaphone in hand, shouting out their radical opinions. In such a situation, one can easily tell that these ideas are shared by only a small proportion of the population — since it is normally only a few people among the masses who apparently support these views. In addition, as soon as you have passed the public square, you are no longer exposed to these controversial opinions.
In contrast, when you are online, your personal feed may continuously surround you with content and opinions of certain topics and alignments. It is then natural to start believing that a lot of people care about those topics and support those alignments, since you are exposed to such a high proportion of that kind of content. In reality, however, it might be that only a small percentage of the whole population actually supports those views.
In a world with personal filter bubbles of information, it is no longer easy to know what is real and what is not; what people actually believe or not; and to what extent people believe something or not — because the distribution of the content encountered online does not match the real-world distribution. If our everyday content misrepresents the world, it follows that our worldviews gradually become just as misrepresentational.
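This mismatch between the feed distribution and the real-world distribution can be illustrated with a toy simulation. All numbers below are hypothetical — a 5% real-world share of an extreme view and a 10x engagement boost from the ranking algorithm — chosen only to show the mechanism:

```python
# Hypothetical population: only 5% hold the extreme view.
POPULATION_SHARE_EXTREME = 0.05
# Assumed engagement boost: extreme posts are upweighted 10x by the feed.
ENGAGEMENT_BOOST = 10.0

def feed_share_extreme(population_share: float, boost: float) -> float:
    """Fraction of feed items that are extreme when the ranking
    algorithm upweights extreme content by `boost`."""
    extreme_weight = population_share * boost
    normal_weight = (1 - population_share) * 1.0
    return extreme_weight / (extreme_weight + normal_weight)

share = feed_share_extreme(POPULATION_SHARE_EXTREME, ENGAGEMENT_BOOST)
print(f"Real-world share of extreme view: {POPULATION_SHARE_EXTREME:.0%}")
print(f"Share of feed that is extreme:    {share:.0%}")  # prints 34%
```

Under these made-up numbers, a view held by one person in twenty ends up filling roughly a third of the feed — and the reader, lacking the real-world base rate, has no easy way to tell the difference.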
In the physical world, the squawking extremists with megaphones are perceived as anomalies; in the online world, they might be perceived as the norm, because the algorithms prioritize the extreme. One could say that the online platforms themselves have become gigantic megaphones — for misrepresentations. Misrepresentation causes no problems when we think Vader says “Luke, I am your father” instead of “No, I am your father”; but there is something deeply uncanny when people start to believe the earth is flat, vaccination causes autism, or that 5G leads to cancer.

So, What’s The Gist!?
Today we have too many sources of information to handle, which leads us to rely on the most accessible ones. However, many of those platforms tend to serve us content mostly from the extreme section of the menu. If the information we consume daily is skewed from reality, our representation of reality gradually becomes just as skewed as the information we consume.