Aside from user abuse and discrimination, misinformation on social media is among the greatest concerns for most platforms right now. From election interference and anti-vaccination propaganda to online threats leading to physical violence, the consequences of misinformation are real and are becoming more widespread and severe each day.
While people have different opinions on various topics, exposure to misinformation and the erosion of trust in media have led people to form beliefs that strengthen in conviction when argued against, rather than change in light of new information. Believing in climate change is not the same as believing in the tooth fairy, yet when certain topics enter public debate, a person’s assumptions, beliefs, and personal experience are somehow projected to discredit or contradict scientific evidence and robust data sets. Observations have to be tested for validity and consistency so that they do not perpetuate a type I error, a false positive.
Facebook’s approach to addressing misinformation is not to remove the content but to counter it with factual reports or opposing views so that each person’s timeline is balanced. Though the approach makes sense, it has not been successful. The problem with social media is that content is generally isolated to a person’s existing network, and you are more likely to share ideologies with your friends and family. There isn’t much opportunity for new information to enter your network. Even when new information does appear, posts still have to compete for attention based on your behaviour, meaning you will see content similar to the posts you’ve previously interacted with. The platform’s algorithms curate and incentivize content based on your preferences and interactions. Given these factors, most of us are passively consuming content in an echo chamber rather than purposefully seeking out different perspectives.
What Facebook may also have overlooked in its strategy is how people perceive content and information. Even when presented with opposing views or balanced content, people tend to prefer what is familiar. Even when something is true, it may not be believable for a number of reasons, including cognitive dissonance, the fallacy of false authority, or herd-like behaviour. The design and presentation of misinformation may not have been fully considered either. A key feature in the spread and adoption of propaganda is that it is often designed to be more persuasive than scientific or reliable media, typically by limiting word counts to suit shorter attention spans and writing at a lower reading level for easier consumption.
The larger issue is to understand how people came to embrace fringe views in the first place, and to work toward a system that encourages people not only to seek out new information but also to change their views because of it.
Fringe theories, sometimes synonymous with pseudo-scholarship, tend to be misrepresented as equal to mainstream theories. Of course, creative ideas are what inspire innovation and discovery; however, they are subject to intense scrutiny and review by experts in the field. When such ideas are rejected and people continue to rationalize their beliefs anyway, the result can be a blurring of fact and opinion.
This issue, then, may reflect a greater sociopolitical trend known as the Overton window, or window of discourse. The idea is that there is a spectrum of ideas the public finds acceptable, with each idea falling somewhere between unthinkable and public policy. For those wanting to exploit the window, introducing and reinforcing a radical idea may desensitize people to it until the most acceptable version of that idea is eventually adopted. By first introducing a more extreme idea and then appearing to compromise, your next proposal seems more acceptable by comparison and is more likely to be tolerated or accepted.
In addressing misinformation, we must consider how today’s actions will shape the future. While not everyone will be a citizen analyst, we can encourage people to think logically and to maintain one another’s dignity when engaging in public debate. Individuals have access to information, but they are unlikely to encounter and interact with it on social media if they are passively consuming content curated by algorithms.