
Beliefs and Evidence and Bias

Submitted by Joel on Wed, 11/21/2018 - 14:13

Recently, a friend of mine shared a story arguing that believing without evidence is immoral. I think the overall concept is fine, but people lose perspective on why others believe what they believe. Pretty much everyone believes they are justified in their opinions; otherwise they would change them. So, while I agreed with the spirit of the story, I have issues with some of the specifics.

Just a side note here: this is not meant to be a scholarly article or a philosophical rebuttal of the original paper. I'm sharing my thoughts because it's very important right now to be more critical when we read things, but also less arrogant when we disagree. So these are just my rambling thoughts on a subject I've been thinking about.

The thesis of the paper is that believing something without sufficient evidence is immoral in all circumstances. The modern argument is that this is even more true today than when it was written 150 years ago. The person who shared it uses the paper to justify thinking more critically, which I am a big fan of, but the author simply goes too far in defending the more stringent stance of immorality.

When justifying the idea that it is immoral to believe without sufficient evidence, the paper gives three main points. First, we should believe responsibly, which is fair. Second, we don't want to be gullible, which is a good point. Finally, we shouldn't dilute our library of knowledge with untrue things, which also seems fine.

So what's my issue?

My main issue is the idea of 'sufficient evidence'. This is a VERY subjective idea. The article gives good examples of important issues, like climate change and extremism, but when you consider that it suggests that EVERY belief must be justifiable with 'sufficient evidence', it becomes impractical.

Think of all of your beliefs. I'm not talking about some "I think, therefore I am" level of justification. Even assuming we're not just brains in a jar or living in some simulation, think of all the things we believe. When we eat, we believe that each bite is safe. When we drive, we believe our commute will be accident-free. When we vote, we trust (somewhat) that the person we're supporting will do what they say. Do we have evidence for those beliefs? Food is recalled, accidents happen, and politicians lie. So we KNOW we're taking a risk; why, then, is the evidence we have enough for us?

One specific example is gambling. One person might refuse to gamble because "the house always wins", so in the long run you'll lose money, so why do it? This, of course, ignores the entertainment aspect. Could you apply that argument to watching movies? You always lose money at the theater. Sometimes people have different goals. Sometimes evidence, like loss statistics, matters more to one person than to another.
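The "house always wins" claim is really a claim about expected value, and it's easy to check. Here's a minimal sketch using standard American roulette odds (18 winning pockets out of 38 for an even-money bet); the numbers are only there to illustrate the arithmetic, not anything from the original story.

```python
# Expected value of a single bet: the average net result per bet
# if you played a very large number of times.

def expected_value(bet, p_win, payout_multiple):
    """Average net result of one bet over the long run."""
    return p_win * bet * payout_multiple - (1 - p_win) * bet

# Bet $10 on red in American roulette: win $10 with probability
# 18/38, lose the $10 otherwise.
ev = expected_value(bet=10, p_win=18 / 38, payout_multiple=1)
print(round(ev, 2))  # about -0.53: you expect to lose ~53 cents per $10 bet
```

The math backs up the refusal to gamble, but only if minimizing expected loss is the goal; a $0.53 average cost per spin can be a perfectly reasonable price for entertainment, which is exactly the point about people weighing the same evidence differently.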

Trust in the System

For most of these things, we trust in the system. Food is recalled because some organization is (hopefully) looking out for us. We drive all the time and rarely get in accidents. Politicians can be impeached or recalled or just voted out next election. But we also take precautions. We don't blindly trust, either. We wash our produce, we wear our seatbelts, and we demand checks and balances for our elected officials. 

These systems give us trust without the necessity of 'sufficient evidence'. We believe, and we risk our lives and our futures on these beliefs. But, you might say, we DO have evidence! Perhaps, you might argue, we have evidence that food is safe 99.9% of the time, so we prepare for the 0.1% but trust in the 99.9%. But isn't that a justification? You could grow your own food. You could wash it in water you prepared. You could get much closer to 100%. But we don't. Is that justified?

I think it is. For practicality's sake, we need to create heuristics, or biases, in order to be efficient in life. These biases cause us to require different levels of evidence for different beliefs. Enough people buy groceries and drive cars and vote for representatives that we trust that if there's a flaw, there's shared danger and shared responsibility. We do this all the time. We believe in all sorts of things without a scientific degree of evidence.

The Scientific Method

But what about science? That's irrefutable evidence, isn't it? HARDLY. Yes, the scientific method is the best way we have of determining what is closest to the truth, but it's FAR from perfect. Scientific theory is constantly evolving, which is a really good thing! Think back to some of the smartest contributors to science over the centuries and imagine how many of their beliefs have since been shown to be completely wrong. Newton was brilliant and foundational, but we have been updating his theories ever since. He had justification for his beliefs, and yet he was wrong about so many things.

When we move from science to things closer to home, we trust without evidence all the time. We trust in love and the social contract and all sorts of other things. I think love is a great example, personally. In my love life, I have dated a dozen or more ladies very seriously, but I only married one of them. Most of them disappointed me greatly and hurt me in ways that will never completely heal. Yet, without evidence, I tried, and tried again.

We Are All Biased

My point with all of this is that we all have biases in our opinions. All of us. Many scientists, for instance, believe that the simplest explanation is the most likely one. While this serves most people very well, it can be wrong. We trust in these biases, and that trust makes us who we are. We all carry assumptions, like "we're not brains in a jar living in a simulation", that we need in order to keep going.

But there are more biases. Like this one: when a source I trust tells me something that coincides with my opinions, I believe it. This is a big one that we really need to be careful about. There is a lot of pressure in our society today to release a story that might not be justified. Critical thinking is very important, as is skepticism. But can two people use similar approaches to deciding whether a source can be trusted and end up with different results? Of course. We need to understand that people sometimes believe things using different metrics to weigh the evidence, and that doesn't mean they are evil or stupid. And just because someone agrees with us doesn't mean they AREN'T evil or stupid.

My Final Thought

So what's my overall message? Look for sources you can trust by checking their credentials. Don't give your trust easily. Verify. If they betray your trust, be wary of them. Think critically. But remember: just because you come to a different conclusion than someone else doesn't mean they are a bad person. Listen to how they came to their conclusion. Let it test yours. Most of all, give respect to people who listen to you and want to hear how you came to YOUR conclusion. Just because you disagree doesn't mean you can't discuss. The goal should be to learn, even if it's just learning another perspective.

I could go on and on about each of these topics, and I'm guessing that I will in the comments. Please, be kind. And let's discuss!