Intellectual Honesty and Psychological Bias

If you’re like me, and you enjoy frequent conversations with people who disagree with you about, well, pretty much everything, you’ve probably wondered the following things about yourself and the person(s) you were conversing with:

1) “Who’s being intellectually honest here?  Who’s really willing to follow the evidence wherever it leads, no matter what?  Who’s willing to be objective and dispassionate about the issue at hand?”  (Maybe you’ve thought to yourself, “I better make sure that I at least appear more intellectually honest than so-and-so!”)

2) “I bet so-and-so really wants God to exist (or not).  It’s this psychological/emotional impulse that’s driving them to proselytize about their view, and reject everyone else’s.  It’s the only reason why they hold so stubbornly to their view.”

3) “I wonder when so-and-so would actually revise one of their beliefs in light of positive evidence that supports a competing theory.  What if damning rebutting/undercutting defeaters destroyed their own position?  I also wonder what it would take for me to change my position as well.  Where is the line?  What are the criteria?”

These are the kinds of questions that provided the impetus for this essay.  Let’s take them in order.

1 Intellectual Honesty

What exactly is intellectual honesty, anyway?  Does it mean dispassionately looking at the evidence of an argument in order to know that the conclusion(s) follow necessarily from the premises?  Does it mean changing your beliefs every time you hear an argument that contradicts the beliefs you currently hold?  Do you suspend all judgment, indefinitely, while you continually study and wait for more evidence to come in, ad infinitum (thereby becoming a practical agnostic)?  Does it mean never having a stake in the outcome of an intellectual disagreement?  If you already know what makes a valid argument, and you think intellectually honest people need to acknowledge when they see one (even if they don’t like the conclusion), then skip to the section on soundness.  If you already know what soundness is, you still might be interested in that section, because it starts to bleed into cogency.  I’m also sure there are plenty of you who don’t want to read this article at all, but would rather go watch paint dry.  If so, you’ve probably made the right decision.

1.1 Validity in Argumentation

Let’s try to define intellectual honesty.  Minimally, it’s being able to acknowledge the validity of an argument regardless of one’s opinion about the conclusion.  For example, take a simple conditional argument that uses modus ponens (Latin for, roughly, “the mode that affirms”):

(1) If it’s Monday, then it’s the only day we eat pancakes for breakfast

(2) It’s Monday

(3) Therefore, we’ll be eating pancakes for breakfast!

It seems to me to be self-evident that intellectual honesty demands of all rational people that they acknowledge that if (1) and (2) are true, then (3) follows necessarily.  That means it’s impossible for the premises to be true and the conclusion to be false.  This is the making of a valid argument.
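For the mechanically minded, the “impossible for the premises to be true and the conclusion false” test can be run exhaustively with a truth table.  Here’s a small Python sketch (the helper names `implies` and `is_valid` are my own, just for illustration) that searches every truth-value assignment for a counterexample:

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

def is_valid(premises, conclusion):
    """An argument is valid iff no assignment of truth values makes
    every premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counterexample row
    return True

# Modus ponens: premises "if P then Q" and "P"; conclusion "Q".
modus_ponens_valid = is_valid(
    premises=[lambda p, q: implies(p, q), lambda p, q: p],
    conclusion=lambda p, q: q,
)
print(modus_ponens_valid)  # True: no counterexample row exists
```

Note that the check never asks whether the premises are actually true; it only asks whether truth could flow from premises to conclusion, which is exactly the validity question.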

You don’t have to like an argument’s conclusion to know that it’s valid.  Let’s change the second premise so that the argument now looks as follows:

(1) If it’s Monday, then it’s the only day we eat pancakes for breakfast

(2’) It’s Tuesday

Then the conclusion would read:

(3’) Therefore, we’re not going to be eating pancakes for breakfast

We won’t be having pancakes for breakfast (since it’s not Monday), but the argument is still valid.  All it takes for an argument to be valid is that when you assume the premises to be true, the conclusion must follow from them.  Thus in our case, our revised little argument’s conclusion rightly followed the change in its key premise.  We’d be in trouble if our conclusion stated that we are eating pancakes even though it’s not Monday.  We’d then have an invalid argument.

Let’s take a more controversial example, like the moral argument for God’s existence:

(1) If objective moral values and duties exist, then God exists

(2) Objective moral values and duties exist

(3) Therefore, God exists

We can check to see if the argument is valid or not, independently of knowing whether the premises or the conclusion are actually true.  We can just assume the premises are true and see whether the conclusion could still be false.  If it could, then we know the argument is invalid (hint: it’s valid.  The controversial bit is whether or not it’s sound).

Here’s an example of an invalid argument:

(1) My Christian friend John is a hypocrite

(2) Hypocrites have false belief systems

(3) Therefore, Christianity is false

This is an invalid argument that commits an informal fallacy called the ad hominem fallacy.  It’s when one claims an opponent’s position is false because of the opponent’s character, not because of independent arguments, reasons, or evidence that would count against the opponent’s position.  This one is really easy to slip into and is what turns rational arguments into childish quarrels.

Here’s one more:

(1) If it’s raining outside, then the streets are wet

(2) The streets are wet

(3) Therefore it’s raining outside

Can you spot the fallacy?  This is the fallacy of affirming the consequent.  The consequent, “the streets are wet,” could be made true by many things besides the antecedent (the antecedent is the “if” part of a conditional and the consequent is the “then” part; in the above argument the antecedent is “it’s raining outside” and the consequent is “the streets are wet”).  Maybe the street cleaner came by and made the streets wet, or someone had a water balloon fight out there.  Perhaps the neighbor down the street is washing his car and the runoff is now making it to your part of the street, etc.[i]
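The same truth-table method exposes the fallacy: one row where both premises are true and the conclusion is false is enough to sink the argument.  A quick Python sketch (variable names are mine) finds exactly that row, the “street cleaner” scenario where the streets are wet but it isn’t raining:

```python
from itertools import product

def implies(p, q):
    # Material conditional: false only when p is true and q is false.
    return (not p) or q

# Affirming the consequent: premises "if P then Q" and "Q"; conclusion "P".
# Collect every counterexample row: premises all true, conclusion false.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and q and not p
]
print(counterexamples)  # [(False, True)]: not raining, yet the streets are wet
```

Because at least one counterexample row exists, the form is invalid no matter what P and Q stand for.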

1.2 Soundness in Argumentation

Well, enough about validity.  What about soundness?  What does it mean for an argument to be sound?  An argument is sound just in case it is valid and has all true premises.

Going back to the moral argument for God’s existence, the first and second premises have to be true in order for the argument to be sound.  If one of the premises is false, then the argument, while valid, is not sound (and therefore, a failure).  Thus, if one wants to destroy the truth of the moral argument, they must attack either premise (1) or (2).  They might try and offer an alternative foundation to anchor objective morality and thus compete with (1).  Or they might balk at the notion that objective moral values and duties exist, and thus reject (2).  Either way, if these objections are true, then the premises for the moral argument are false, and it’s unsound.

Thus far we’ve explored an argument’s validity and its relationship to intellectual honesty, but what about soundness and intellectual honesty?  Well, this is where it gets interesting.  What if someone thinks that an argument is sound, and yet still has trouble accepting the conclusion?  A few things might happen here:

1) They might reject the conclusion out of stubbornness

2) They might “have to think about it” before granting the truth of the conclusion

3) They might waive it off as unimportant or as simply “intellectual games” or “stimulating conversation” and then thank you for the fun

4) They might go to great lengths to avoid the conclusion by adopting positions that are rationally defensible but conflict with positions they held before the argument was presented (e.g. someone might reject the beginning of the universe in the cosmological argument in order to defend the universe as eternal, even though this person was previously known as a cosmology geek who held that the Standard Model of the Big Bang was the most plausible position in light of almost a century of confirming evidence).

I’m sure there are plenty of other scenarios one could cook up, but these will do for our purposes.  So what about the first one?  Is refusing to accept a true conclusion out of stubbornness intellectually dishonest?  Well, it might seem so.  But what if you tell this to your interlocutor?  What if you simply say, “I just don’t want to believe that”?  A statement like that seems pretty honest to me!  I’ve said those lines in an argument before.  So it seems that someone can be intellectually honest and simply refuse to believe the conclusion of an argument, as long as they make this known.  If they pretend to genuinely consider the conclusion even though they know, in the privacy of their own mind, that they’ll never believe, then it would be just to label them intellectually dishonest.

What about the second one?  If you need more time to think something over, is this a sign of intellectual dishonesty?  Not at all… probably.  This one is tricky, because it is legitimate to need time to mull things over.  But how long is a reasonable time to think something over?  If you know an argument is valid, and you only need time to mull over the truth of the premises to judge its soundness, then it seems as though this could be an indeterminate time frame.  The only person who is going to know for sure is the one mulling over the argument.

What if they say it’s really unimportant or something of the sort?  Well, this could be true.  There are plenty of people out there who live sensate lives and have no cognizance of the richness a robust intellectual life can bring.  It’s their choice if they can’t or refuse to see the value in asking life’s hardest questions.  It’s not my job to try and force people to be interested.  However, showing people how the answers to life’s tough questions bear directly on their worldview could kindle a fire.

The last thing is tricky.  If you know someone personally and they flip-flop on a substantive philosophical thesis because of a threatening argument, you could have cause for concern.  The problem with trying to point this out is that it becomes a bald assertion on your part.  And competing assertions do not make a sound or valid argument.  In fact, competing assertions make no argument at all.

1.3 Cogency

We now need to consider cogency.  A cogent argument is one that is convincing, one that almost compels you to believe it.  I’m sure many examples come to mind when you think of this concept.  For me, a simple one would be the classic:

(1) All men are mortal

(2) Socrates is a man

(3) Therefore Socrates is mortal

The argument is obviously valid.  The premises seem self-evidently true, so the argument is sound.  And there’s a certain luminosity, or “shininess,” to it that makes it almost irresistible.  Think of some of the more cogent arguments you’ve heard in the past; I’m sure some arguments you find cogent, others might not.  This means that there is a certain subjectivity involved in cogency.

This subjectivity largely arises out of our background knowledge and beliefs that we bring to the table when we assess arguments.  When I, as a Christian, assess the moral argument for God’s existence, I bring to the table the belief that God exists, and – assuming that the argument is sound – find the argument extremely cogent.  However, atheists might see the argument, grant that it’s valid (some may even grant that it’s sound), and yet feel as though it lacks the “punch” required to push them over the edge into belief.  That is, they would say it lacks cogency.

Let’s remember why we jumped into a discussion about cogency.  In the last section I mentioned the possibility of needing some time to think an argument over before coming to a decision.  The amount of time one needs is directly related to whether or not they think the argument is cogent.  And whether or not they think the argument is cogent is largely going to be based on their background knowledge and beliefs that they bring with them to the table in assessing the arguments as well as one final thing: psychological bias.

2 Psychological Bias

I want the God of Christianity to exist.  I want Christianity to be true.  I think Christianity offers the most fulfilling answers to life’s questions, both emotional and intellectual.  I also think following Christ affords the most abundant and exciting life imaginable.  But wait, there’s more!  I want libertarian free-will to be true, I want Molinism to be true, and I want some kind of substance dualism to be true.  I want love to mean more than just dopamine popping off in my brain and I want sex to mean more than testosterone soaking it.  I could go on and on.

Some might be horrified after reading the above, because I’ve just “shown my hand”, as it were.  I’ve let everyone in on the big secret that I actually have an emotional stake in the issues at hand, and I want certain conclusions to be true and others to be false.  But what exactly follows from this?  If, in an argument, someone admitted that they wanted their position to be true, would you claim victory for yourself?  Would you mentally pat yourself on the back, congratulating your stalwart intellectual honesty (or at least the appearance of it)?  If you’ve done this in the past, then – and perhaps ironically for you – you’ve lost the intellectual honesty game.  Here’s why.

We’re all psychologically biased.  Some of us grow up wealthy, others poor, some educated, and others not.  We grow up in different countries under different political and religious systems, with different family environments and have different psychological make-ups.  All of this will lead to us wanting certain things to be true and other things to be false.  Since this is a universal condition of any human who has ever lived and ever will, I think it’s about time we all agreed it was OK to share our psychological biases with those with whom we disagree.

That’s probably scary for some people, and I suspect it’s because they either have a deficient view of intellectual honesty, or they are insecure in their ability to defend their own views in the face of criticism.  Fair enough.  But a part of being intellectually honest is being able to admit that you just simply don’t know certain things.  And that’s OK.  We’re all finite beings that lack omniscience.  Why should we pretend otherwise?  It is only the insecure person who has to fear a lack of knowledge.  And those willing to prey on them for such a display are just as insecure, maybe more so.

2.1 Psychological Bias, Cogency and Argumentation

How does psychological bias fit in with cogency?  It means simply this: as a human being, no one can compel you, that is, force you, to believe anything.  All the evidence in the world will not tackle you and bind you.  There are skeptics of the external world, after all.  One can doubt their moral intuitions, their sense perception, even the ownership of their own mental states!  If we’ve learned anything from Descartes, it’s that the only thing we can be absolutely certain about is our own existence.  Psychological bias, then, can be used to destroy the cogency of any argument, regardless of its evidential merits.  One could be presented with all of the evidence that modern science affords us in ascertaining whether or not the earth is round, yet still reject that evidence and become a member of the Flat Earth Society.[ii]

Okay, enough of extremes.  The first thing to take away from this section is this: if your specific aim in debate is something other than getting to the truth (e.g. arguing for a position for mere intellectual pleasure, or to sharpen your debating skills), then intellectual honesty would suggest that you let others know this before you start.  And if you are honestly weighing competing worldviews in pursuit of the truth, it’s OK to let your interlocutors know where you stand on the issues intellectually and psychologically.  “Wait a second,” you might say.  “If I let so-and-so know where I stand on the issues psychologically, I’ll lose my credibility with her!”  If that’s the case, then you need to seriously consider whether or not she is worth conversing with.  If you still think so, then you might want to have a discussion about psychological biases first, and then proceed.

The second takeaway from this section is: one can be extremely psychologically biased and intellectually honest at the same time.  Psychological bias in no way whatsoever has to cloud our rational faculties.  I may be a Christian, and I may want Christianity to be true for many a-rational reasons (i.e. emotional and existential ones), but it simply doesn’t follow that I can’t tell what makes a sound deductive argument, or what triggers certain informal fallacies of logical inference.  For example, there are certain arguments for God’s existence that I just don’t find persuasive, and yet I believe that God exists.  I have friends who disagree with me, and think that those arguments are the best thing since Velcro.  C’est la vie.

It’s only when someone allows their psychological bias to control them that they start making completely wild moves.  For example, they might suddenly “lose” their ability to tell a valid argument from an invalid one, and simply insist that their opponent’s argument is invalid.  The greatest temptation for these people is to slip into the ad hominem fallacy mentioned earlier (that is, they will attribute your worldview to your psychological disposition, or something else about your character, and then declare it false on those grounds alone).  Another tempting fallacy for these individuals is the genetic fallacy: attributing your belief or worldview to the circumstances surrounding its inception.  “America is predominantly Christian, and you grew up in America, so it’s no surprise you’re a Christian; therefore you have no good reason to think Christianity is true.”  This miscarriage of reasoning has been popularized by Richard Dawkins and other New Atheists.[iii]  Well, perhaps they’re right; perhaps the only reason anyone believes the Christian worldview is that it’s what they were taught as children (of course I don’t believe this, nor should anyone else, but let’s grant it for the sake of argument to see how the fallacy plays out).  Even so, this has nothing to do with whether Christianity is actually true.  Fallacies like these start to come out of the woodwork when you’re dealing with someone who can’t keep their psychological biases from seeping into their ability to weigh evidence via the canons of logic.  Avoid such people, or, if you care deeply about them, try to patiently show them that logic shows no favoritism.

3 Conclusion

Intellectual honesty is, minimally, being able to spot a valid argument even if one doesn’t like the conclusion.  It’s also about being able to be honest with ourselves in realizing that we all have beliefs that we want to be true and others we want to be false.  And that’s OK; we aren’t robots lacking emotion.  We have a huge personal stake in some of these arguments and their outcomes, especially when it’s at the level of a worldview.

Cogency is largely subjective and involves our background knowledge.  We all bring beliefs to the table when we discuss philosophical issues.  This knowledge and belief structure will influence how cogent we view the arguments offered to us.  This, in turn, is affected by our psychological biases.

Our psychological biases are OK as long as we don’t allow them to cloud our rational judgment.  One may want certain things to be true (e.g. for the moon to be made of cheesecake).  But one can also analyze sound arguments that contradict their desires.  Just because you want something to be true doesn’t mean that you then lack the ability to be objective about what constitutes a good argument.  It’s only when we hide our biases from ourselves and others that we can get into trouble.  Putting one’s cards on the table at the start of a discussion or debate is a good way of building intellectual credibility with peers.  It’s also a good way to remind ourselves to be intellectually honest in the pursuit of truth.  If one wishes to have conversations/debates with others while ignoring the pursuit of truth, they should make this known in case others are caught unaware.  Some may feel slighted if what they thought was a substantive conversation was simply a game.

We’ve sifted through some issues regarding intellectual honesty and psychological bias, but what then?  When is one expected to actually change their belief in light of these considerations?  This question will be the topic of my next entry on belief revision.

[i] There are plenty of formal fallacies like affirming the consequent, and many more informal fallacies like the ad hoc and ad hominem fallacies, etc.  If you’d like to learn more about these, a quick intro to logic can be found in J.P. Moreland and William Lane Craig’s Philosophical Foundations for a Christian Worldview, ISBN 0830826947.  For a great treatment of Socratic logic with plenty of examples of the different types of fallacies, you may consult Peter Kreeft’s Socratic Logic (3e), ISBN 1587318075.  If you want something more advanced, you might want to consult Warren Goldfarb’s Deductive Logic, ISBN 0872206602 (Goldfarb replaced Quine at Harvard).  A really modern text with lots of goodies is Daniel Bonevac’s Deduction (2e), ISBN 063122713X (Bonevac teaches at the University of Texas at Austin).  These last two are symbolic logic texts for hardcore philosophy.  If you want to be proficient in the more dialectical/conversational logic that you’d use every day, then Peter Kreeft’s book is the best bet.

[ii] Yes, they’re real.  Google them.

[iii] What exactly is “new” about them?  Atheism has been around forever.  No one has been able to answer this for me.
