How to recognize when you’re in an echo chamber.
Being imprisoned in your comfort zone doesn’t sound unpleasant, but even a luxury prison holds you captive. Could you recognize if you were a victim of ideological imprisonment? How can we be confident we aren’t trapped in a comfortable social media bubble filled with like-minded opinions?
There have been many studies of how social networks form. One of the most interesting, the Social Ecology of Similarity, looked at different academic institutions in Kansas.
The University of Kansas is a large campus with a diverse population of 30,000 students from around the world. The study compared it to smaller institutions, most with around 1,000 students from less diverse backgrounds.
The researchers investigated the effects these different campus conditions had on people’s social relationships. Intuitively, the expected result seems obvious: the University of Kansas offers far more opportunity to meet a diverse range of people, and we would expect this to be reflected in more diverse friendships.
But the results showed the opposite. The study noted:
“When people have a choice, they choose relationships with people who are similar to them. Because people from the larger university will be able to choose among greater variety, they will also be able to match their interests and activities to partners more closely than individuals in the smaller colleges. This leads to a straightforward but ironic hypothesis: Greater human diversity within an environment will lead to less personal diversity within relationships.”
The result is paradoxical: more diversity in the larger network causes less diversity in the local networks. On the larger campus there are more potential friends. With a wider variety of people to choose from, it’s easier to fine-tune our relationships and find people who are more like us. On a smaller campus, it’s more difficult to find someone similar because there are fewer people to choose from.
And the same principle applies on the internet. We have access to a global diversity of opinions and ideas. But the diversity and size of the network have the paradoxical effect of producing less diversity in the opinions we are exposed to. The variety of choice means we can fine-tune our ideological sorting, limiting what we hear to like-minded views.
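The mechanism is easy to see in a toy simulation. The sketch below is purely illustrative and not part of the original study: each person gets a random "interests" vector, friends are chosen as the closest matches in the pool, and the function name and parameters are invented for the example. As the pool grows, the best matches get closer, which is the "more choice, less personal diversity" effect.

```python
import random

def similarity_of_chosen_friends(pool_size, n_friends=5, trials=200):
    """Average similarity between a person and their closest matches
    when friends are picked from a pool of a given size (toy model)."""
    total = 0.0
    for _ in range(trials):
        me = [random.random() for _ in range(5)]                      # my "interests"
        pool = [[random.random() for _ in range(5)] for _ in range(pool_size)]
        # similarity = negative squared distance (higher means more alike)
        sims = sorted(
            (-sum((a - b) ** 2 for a, b in zip(me, p)) for p in pool),
            reverse=True,
        )
        total += sum(sims[:n_friends]) / n_friends                    # my closest matches
    return total / trials

# A small "campus" versus a large one: the larger pool lets us pick
# friends who match us more closely, i.e. less diverse friendships.
print(similarity_of_chosen_friends(pool_size=100))     # smaller campus
print(similarity_of_chosen_friends(pool_size=10_000))  # larger campus
```

Running it shows the average similarity of the chosen friends rising with the size of the pool, even though everyone's interests are drawn from the same distribution.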
We get most of our information from social media and from our preferred blogs and website subscriptions. We surround ourselves with palatable sources of information. And those social media feeds are powered by algorithms that present information based on what we want to see. This means that if we’re not mindful of the sources of our information and the filters we’ve imposed, we can end up in an ideological bubble of our own creation.
The bubble forms because our filters only let through information we find agreeable and pleasing. These information filters are not always a bad thing: there is a huge amount of information on the internet, we can’t sort through it all, and we need to filter it somehow. But how can we make sure we aren’t insulating ourselves from disagreeable information and taking that filtered view as further confirmation of our own opinions?
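As a rough illustration of how such a filter tightens, here is a minimal sketch of a feed ranked purely by agreement with what the reader has already liked. It is a hypothetical toy, not how any real platform’s recommendation algorithm works, and the names (`rank_feed`, the topic labels) are invented for the example.

```python
def rank_feed(items, liked_topics):
    """Order feed items by overlap with topics the reader already liked.
    A naive stand-in for a recommendation algorithm, for illustration only."""
    def agreement(item):
        return len(set(item["topics"]) & set(liked_topics))
    return sorted(items, key=agreement, reverse=True)

feed = [
    {"title": "Piece that echoes my view",   "topics": ["view_a", "view_b"]},
    {"title": "Piece from the other side",   "topics": ["view_c"]},
    {"title": "More of what I already read", "topics": ["view_a"]},
]

for item in rank_feed(feed, liked_topics=["view_a", "view_b"]):
    print(item["title"])
# The disagreeable item sinks to the bottom; if only the top of the feed
# is ever read, the "liked" topics narrow a little further each round.
```

Nothing here blocks the opposing view outright; it simply keeps drifting out of sight, which is exactly the quiet way a bubble forms.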
The philosopher C. Thi Nguyen has made a useful distinction between information bubbles and echo chambers. In an information bubble, people aren’t exposed to opposing views; they simply don’t hear them. This can have the effect of increasing their confidence in the truth of their own view.
One of the most valuable tools of rationality is to find out whether other people agree or disagree, and why. If our reasoning is mistaken, others can point it out. If we don’t expose ourselves to those opposing views, our mistakes may go unnoticed.
However, these information bubbles are easily popped. All it takes is exposing people to the views they haven’t heard. But echo chambers are something different.
Nguyen says,
“Echo chambers are a far more pernicious and robust phenomenon. Echo chambers isolate their members, not by stopping them hearing opposing voices, but by changing which voices they trust. They are isolated, not by selective exposure, but by changes in whom they accept as authorities, experts and trusted sources. They hear, but dismiss, outside voices.”
In their book Echo Chamber, Kathleen Hall Jamieson and Joseph Cappella study the logic of echo chambers. They examine political polarization by focusing on the American political commentator Rush Limbaugh. Limbaugh doesn’t try to stop his listeners from hearing opposing views; instead, he attacks the integrity of anyone who holds them. He makes constant attacks on the ‘mainstream media’. Outsiders aren’t simply wrong; they are malicious and manipulative, working to destroy Limbaugh.
This tactic of attacking opposing views as untrustworthy gives echo chambers a different kind of filter from information bubbles. An information bubble filters out opposing views so they aren’t heard; an echo chamber tells us opposing views can’t be trusted.
This manipulation of whom we trust distorts the way we form beliefs. It’s no longer as simple as exposing people to opposing views, because exposure actually reinforces their existing views. The more the mainstream media says Limbaugh is wrong, the more his predictions of malice are confirmed. Once an echo chamber starts to grip a person, its mechanisms reinforce themselves.
As Nguyen puts it:
“What’s happening is a kind of intellectual judo, in which the power and enthusiasm of contrary voices are turned against those contrary voices through a carefully rigged internal structure of belief. Limbaugh’s followers read — but do not accept — mainstream and liberal news sources.”
Everyone is dependent on outside opinions; it’s unrealistic to think we can investigate everything ourselves. We rely on the opinions of experts for almost everything. We need to trust doctors, lawyers, scientists and so many other experts because we can’t check all the facts ourselves. But if opposing views are discredited as malicious, the effect is greater polarization and more claims of fake news and conspiracy theories.
While we can’t avoid the numerous pitfalls in gaining knowledge and finding truth, we can protect ourselves by being aware of the traps. We should try to source our information from experts, hear opinions from both sides of a debate, and focus on the reasons and evidence rather than the motivations of the people who hold those opinions. Their motivation has no bearing on the truth of their opinions.
And if we want to make sure we aren’t imprisoned in an ideological echo chamber, we can remember Nguyen’s basic check:
“Does a community’s belief system actively undermine the trustworthiness of any outsiders who don’t subscribe to its central dogmas? Then it’s probably an echo chamber”.