The other day a friend of mine watched an online documentary about the wet markets in Wuhan, and decided not to share it with others. She thought it might contribute to racism against Chinese people, because it suggested that their cultural practices were to blame for COVID-19. Although the documentary seemed objective, she said, why share something that can do no good but might cause harm?
Her attitude does not seem unreasonable to me. The problem is that it is very hard to see the subtle way that decisions like these add up, and lead not just to biased views of the world, but to polarization and division.
Bias

Many people understand the word bias to mean something along the lines of prejudice. But bias simply means a tendency to go one way or another. Your music selection is biased, unless perhaps you listen to a completely randomized selection across all genres. A test is biased if it tends to over- or under-estimate whatever it is measuring. And information is biased if it includes the facts that tend to support one conclusion, while omitting facts that support another.
It’s impossible to be completely unbiased with the information we share. In fact, I’m not sure it’s even possible to define what “unbiased” information is exactly. But I think it is possible to be unbiased with respect to any particular issue. An attorney who tries to get their evidence admitted while attempting to suppress the evidence of the other side is biased. A judge who tries to allow all relevant evidence is unbiased.
When discussing beliefs that are important to us, we often act like lawyers, not judges. A bias towards our side seems natural, and even noble, since the other side is biased too. To cancel out the bias of their side, we need to make the strongest possible case for ours. Then hopefully the jury comes away with a complete and unbiased view of all the facts.
So we try to fight bias with bias.
Bubbles

But like many things we do with the best of intentions, this has unintended consequences. We are generally not in the same situation as lawyers in a courtroom. The biased information we share in support of our point of view tends to reach a limited social circle that tends to share our point of view. These same people employ the same biases in deciding what information to share with us.
As a result we tend to reinforce each other’s biases, and ensure each other’s ignorance. Instead of being like opposing lawyers each presenting one side of a case to a jury, we are like lawyers for the same side sitting in a room reminding each other why we are right. None of us takes the trouble to discover what evidence opposing counsel has, for fear of changing each other’s minds, or appearing sympathetic to the other side.
I asked my friend if she would have shared the documentary about wet markets if the subject had been people in, say, the American South. Yes, she said: in such a case, greater awareness might actually do some good by encouraging action to stop these practices. So she has, in effect, created a rule that says: if the information might make people look bad, then 1) share it if those people are American, 2) don’t share it if they are Chinese.
However well intentioned, this is a bias. And because of this bias, doubtless applied in countless other similar situations by like-minded friends, her social circle tends to be ignorant about negative things happening in China, and aware of negative things happening in the USA. They have inflated a filter bubble around themselves.
Meanwhile, the inverse is happening in conservative social circles.
Division

People form their opinions based on what they know about the world. So groups of people with access to different information will tend to form different world-views. Eventually, large world-view differences make it impossible for each side to fathom how people on the other side could possibly see things differently.
In another recent example involving China vs. the USA, while the left was abuzz with talk about thousands of “kids in cages” on the US border, the right was abuzz with talk of the Uyghur concentration camps in China. And both sides were shocked at the ignorance and indifference of the other side to such significant contemporary human drama.
More generally, liberals tend to promote social change but loathe bigotry. So their information tends to be biased towards facts critical of their own country, race, and culture, and against criticism of others. The tendency of conservatives to value patriotism and community has the opposite effect. Conservatives tend to love their neighbors and dislike strangers, while liberals tend to love strangers but dislike their neighbors, as it were.
Although everyone has heard of filter bubbles, few people believe that they are actually in one. They generally fail to recognize that their own inability to understand another person is evidence of their own ignorance, and not the other person’s inferiority.
So they explain the attitudes of the other tribe by inventing caricatures. The right condemns liberals for admiring oppressive foreign governments and “hating America”. The left condemns conservatives as a bunch of bigots supporting oppression in our own country.
Neither side knows what people on the other side really believe or why. Our information bubbles lead to ignorance not only about what is happening in the world, but also about what other people know about what is happening in the world. It is meta-ignorance.
I think this explains much of the division and polarization of society. Millions of small, well-intentioned decisions to share or not share certain types of information inflate into filter bubbles. These lead to differences in information, which lead to completely different beliefs, then to a complete inability to understand how decent people could believe differently, then to resentment, fear, hatred, and division.