Pure rationality is a myth we should not aspire to

“I think it would be very foolish not to take the irrational seriously.” – Jeanette Winterson


‘Be rational,’ people say, as if –

  1. It’s (fully) possible
  2. Its alternative, irrationality, is unhealthy.

In reality –

  1. We all behave on a continuum from rational to irrational
  2. Those who put irrationality down are just as susceptible to it as those they criticise
  3. Knowing we are irrational will not stop us being irrational
  4. Rationality is not good or bad, nor is irrationality.

None of this absolves us from responsibility for our decisions or suggests we can’t improve awareness or emotional IQ. But it does challenge the idea that rationality is an endpoint or that rational thinking leads to rational or desirable behaviour.

Why the focus on rationality?

There are many reasons we elevate rationality, including –

  1. We’ve bought into the myth of the rational individual
  2. We worship data
  3. We believe that, once we have the data, knowing is half the battle when it comes to understanding human behaviour.

While there’s some truth to these, there are also large gaps that need to be better understood.

1. The rational individual

The idea that we are individuals who know what we want and act to get it is so deeply ingrained that few of us stop to reflect that it is just that, an idea, rooted in philosophical concepts that emerged in the 1700s.

It assumes a person can know their own mind and make choices based on a combination of biological drives and individual reflection.

However, it appears that –

  1. We do not know our own minds
  2. Thinking may be something that happens to us rather than something we do
  3. Our decisions are affected by many variables, including the people and situations around us – both those we know about and those we don’t.

Can we know our minds?

Yes, we can know our minds to some extent but we do not fully understand why we do what we do.

For example, the well-known Müller-Lyer illusion shows that even once we know the horizontal lines below are of equal length, we still see one as shorter than the other.

[Figure: the Müller-Lyer illusion]


This is sometimes called amplitude or end-point bias, and it’s only one of hundreds of cognitive biases – mental shortcuts that the brain uses to make judgements.

Most of the time, we don’t even know these biases are operating. They are useful because our brains have limited processing capacity and they help us filter useful information from the noise.

But they can also work against us, leading to erroneous conclusions and questionable decisions.

One example is the Gambler’s Fallacy – the belief that a previous event influences an independent future outcome. For example, if you flip heads on a coin ten times in a row, it is not more likely that the next flip will be tails; the outcomes are statistically independent.
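The independence claim is easy to check for yourself. The sketch below (not from the original, just an illustration) simulates many coin flips and asks whether tails becomes more likely after a streak of heads – it doesn’t:

```python
import random

random.seed(42)

# Simulate a long run of fair coin flips.
flips = [random.choice("HT") for _ in range(100_000)]

# Collect the flip that follows every run of three heads.
next_after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i:i + 3] == list("HHH")
]

# If the Gambler's Fallacy were true, this would be well above 0.5.
tails_rate = next_after_streak.count("T") / len(next_after_streak)
print(f"P(tails after HHH) ≈ {tails_rate:.3f}")  # stays close to 0.5
```

Each flip is generated independently, so conditioning on a previous streak changes nothing – exactly what the fallacy denies.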

Likewise, if we invest a lot of money in a project, it’s hard to pull out even if there’s little chance of success. We think that if we just put in a little bit more, we will get the result we want. This is the ‘sunk cost’ fallacy, and a failure to understand it means we can end up throwing good money (or energy, if it’s not a financial issue) after bad.
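The logic of ignoring sunk costs can be reduced to simple arithmetic. A sketch with hypothetical numbers (all figures invented for illustration):

```python
# Deciding whether to finish a struggling project. A rational comparison
# looks only at FUTURE costs and benefits; money already spent is gone
# whether we continue or not.
sunk = 900_000             # already spent (unrecoverable either way)
extra_cost = 200_000       # further spend needed to finish
expected_payoff = 150_000  # realistic value if the project is finished

continue_value = expected_payoff - extra_cost  # -50_000: finishing loses more
stop_value = 0                                 # walking away costs nothing further

decision = "continue" if continue_value > stop_value else "stop"
print(decision)  # "stop" -- the 900,000 already spent is irrelevant
```

The sunk-cost trap is to add `sunk` into the comparison (“we can’t waste the 900,000”), which changes nothing about the future outcome but makes continuing feel justified.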

But biases can be adaptive because they lead to faster decision-making, important when timeliness is more valuable than accuracy.

In some cases, knowing about a bias can help us overcome it and improve decision-making. For example, our innate in-group bias means we like people like us. This preference helps us connect socially, but it can cause us to dismiss or even fear people who are not like us. In recruitment this can result in teams that are very same-same. Introducing training and processes that stop us automatically hiring mini-mes can shift the culture and structure of a workplace.

But being aware of the bias, as in the illusion above, will not always help us override it.

Do we think, or are we thunk?

And it’s not just biases that we have to watch out for.

We assume that we’re in control of what we think but Professor Thomas Metzinger says cutting-edge research shows thinking is not so much something we do but ‘something that happens to us’.

Instead, conscious thinking arises from subpersonal processes, including breathing (go yoga), rather than at the personal, conscious level.

We believe people act mentally and rationally in a goal-directed manner, but for more than two-thirds of our lives we are not in control of our thoughts. Instead, 30–50% of the time we’re busy with task-unrelated thought.

There are many other processes that affect decision-making, including genetics, conditioning, personality, conformity and priming, to name a few. I mention them only to point out how complex the influences on what we do are, without going into them here.

The many variables that influence decision-making

But one of the greatest influences on what we do, and one that deeply challenges the notion of a rational individual acting on their own mind, is the world around us. The environment determines a large part of who we are.

Professor Alex Pentland says what we want, and how we choose to get it, comes not from reflection but from interactions with others, and that what we want most is based on what our peers agree is valuable.

This should not come as a surprise as we’ve understood environmental impacts for many years.

For example, the 1971 Stanford prison experiment assigned participants roles as prisoners and guards in a mock prison. During the experiment, the guards got carried away in their roles and subjected the prisoners to authoritarian measures, including psychological torture, which the prisoners largely accepted.

Like the Milgram experiment, in which ordinary people ended up giving what they believed were dangerous electric shocks to others simply because they were ordered to, the results showed the strong influence of external forces on action – although this may be less about conformity than the trust we place in authority figures.

And while these are important insights, it’s vital to remember that in both experiments there were participants who refused to do what they were told.

We are constantly responding to what people around us are doing. This makes sense; we are social animals, constantly working out whether others have friendly or hostile intentions. In a safe environment we are open and trusting; when things are unsafe, we get out of the way.

Many experiments have shown people will override obvious perceptions to draw stupid conclusions in order to feel part of the group. But modern revisions of these experiments are showing that acquiescence is neither immediate nor certain.

We often reference Asch’s experiments to show how easily people slip into groupthink. For example, confederates in an experiment agreed to call lines the wrong length and were able to convince a participant to go against the evidence of his own eyes. What’s reported less often is that Asch also showed how easy it is to snap people out of groupthink – ironically, perhaps, given the subject of this piece – by behaving sensibly.

David Berreby also says it’s vital to put the idea that people are sheep to bed. We may follow in certain contexts, but not blindly, and we are heavily influenced by the degree of trust we place in a person, including authority figures – hence the results of Milgram’s experiment.

There are obvious implications here for leaders around creating a culture in which true diversity (which includes diversity of thought and not just demographics) is encouraged and in which disagreement is not seen as disloyalty.

2. We worship data

We need to be circumspect about placing too much faith in scientific method and data when it comes to understanding people.

I don’t mean biologically. We can, for example, know with some degree of certainty how a drug will affect the body, although even here there are individual differences in how well a drug works – hence the rise of personalised medicine.

This is not the case with behaviour because much of what drives us is unknown.

That’s a problem when it comes to data, because we need to know what it is we want to measure, so we –

  1. Articulate a question
  2. Design an experiment to test it.

In other words, the idea comes first. Although the data we get can throw up further questions or ideas, the data is the outcome not the question.

Professor Gary Marcus says Big Data is great at identifying correlations not causation and that we should stop pretending it’s magic.

“All the data in the world by itself won’t tell you whether smoking causes lung cancer. To really understand the relation between smoking and cancer, you need to run experiments, and develop mechanistic understandings of things like carcinogens, oncogenes, and DNA replication.”

Even with data relating to simpler issues – like profitability – a good decision is not necessarily a data-driven one.

There is also increasing evidence that algorithms discriminate. Computer scientist Cynthia Dwork says the people who write software incorporate biases that are then reflected in the algorithms they create.

Data also can’t tell us the right thing to do. In the New York Times David Brooks discusses how a banker refused to pull out of Italy despite poor profit data because he was guided by thinking and values around what sits at the heart of businesses – trust.

While data can help us make sense of complexity it cannot tell us the right decision to make in any particular situation.

Likewise while data can give us trend insight or information about our customers, it cannot create the relationship with our customers.

Data can’t provide the answer to lung cancer or whether to close a bank, so let’s use it cautiously when trying to understand human behaviour.

3. Knowing is half the battle

Knowing is supposed to be half the battle, but according to Professors Laurie Santos and Tamar Gendler when it comes to cognitive work, knowing is only a ‘tiny portion’ of what leads to real world decisions.

They claim consciously accessible information is rarely the central factor controlling our behavior.

For example, left-digit bias means we see $19.99 as better than $20.00 even though we know it’s not.

A bias has far-reaching impacts; it can influence who we make friends with or who we employ.

Studies have shown that CVs submitted with the exact same skill set are perceived differently depending on whether the name is European or African, male or female. It’s not because people are deliberately racist or sexist; they are simply not aware of the bias. The workplace impacts for diversity are obvious, but there can also be legal consequences like class actions.

Interestingly, gender discrimination in the sciences, often considered fact-based and objective, is just as strong.

In a controlled trial at Yale University female scientists were also automatically offered lower salaries and both male and female scientists were guilty of showing gender bias. When asked to explain their harsher assessments of women, those responsible were able to come up with rational arguments to explain the decision. The sexism was not deliberate but emerged from subtle, social stereotypes that had been reinforced over time.

Rationality is not implicitly good or bad, nor is irrationality

Examining scientific teams at field sites, Professor Kathryn Clancy found over 60% of respondents had been sexually harassed and 20% sexually assaulted. This was on top of psychological and physical abuse. The majority of perpetrators were senior scientists targeting junior women, mostly graduates. How can this be? Of all professions, scientists are trained in reason and objectivity. Surely their behaviour should reflect that too.

This raises some important questions –

  1. What’s the relationship between thinking and behavior?
  2. Was the decision to be abusive rational?
  3. Was the decision to be abusive irrational?
  4. Given the consequences does it matter whether the motive was rational or irrational, since the outcome was unacceptable?

A final word

While we’re not wholly rational, that doesn’t mean unbridled irrationality is okay; equally, the methodical implementation of a rational but malicious plan can be evil. Associating rationality or irrationality with moral value is a non sequitur.

There are domains in which knowing is not half the battle, but there are many in which it is. We should continue learning about what drives us and how to make better decisions, tackle stereotypes, and create cultures that support healthy disagreement, real diversity and access to opportunity.

But the myth of rationality and its worship should be put to bed.

Dionne Lew


