

Common biases that affect HardTalk™ – A User’s Guide

by HardTalk™ Team

The human brain is more powerful than any computer currently in existence: it is capable of managing around 10^16 processes per second. The comparison is, in itself, problematic, but despite all that ‘power’ our brains still have major limitations. From forgetting where we put our keys to blanking on someone’s name, our brain doesn’t always come through for us. Worse still is our tendency to allow the BrainDrains to work against us, leading us to make less-than-perfect decisions based on erroneous conclusions.

There are literally hundreds of BrainDrain biases we could discuss, but here we’re going to focus on those that might, in some way, have an impact on our ability to have HardTalk.

Before we start, it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). 

Cognitive bias is the tendency to make irrational judgments in consistent patterns. It is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability). Researchers have found that cognitive bias wreaks havoc by forcing people to make poor, irrational judgments often without actual intention:

A Queensland University study found that blonde women earned, on average, 7% higher salaries than redheads and brunettes.

A Duke study found that people with “mature” faces experienced more career success than those with “baby” faces. “Baby” faces were defined as those with small chins, wider cheeks, and bigger eyes. “Mature” faces were those with bigger chins, narrower facial features, and smaller eyes.

A Yale study found that female scientists were not only more likely to hire male scientists, but they also paid them more than they paid female scientists.

It’s highly unlikely that the people in these studies actually wanted to pay blondes more money, enable people with mature faces to succeed at the expense of those with baby faces, or hire male scientists disproportionately and pay them more. And yet it happens.

Biases are believed to have played a role in the financial crisis, as bankers continued to pursue immediate gain whilst ignoring long-term risks and discounting information that didn’t agree with their assumptions. Biases affected several administrations’ ability to prepare for and recover from natural disasters; people in Japan and New York overestimated the degree to which they could control the negative effects of a tsunami and a storm, and underestimated what it would take to do so.

All of these biases, and others, lead many great companies and institutions to make disastrous and dysfunctional decisions. But of course, companies and institutions don’t make decisions – individuals and teams do. So why would we think bias doesn’t affect us? Why would we think we are immune to the human brain’s tendency to look for and use patterns?

Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. So biases can be a help and a hindrance. But can we do anything to avoid the ones that hurt? Honest answer? Not a lot. There is very little evidence that educating people about biases does anything to reduce their influence. The problem is precisely that these biases are not conscious – we are literally unaware of them as they occur. It’s very hard to just consciously “watch out for biases,” because there will never be anything to see. It would be like trying to “watch out” for how much haemoglobin you’re producing. But don’t despair just yet. Collectively, groups and organizations can become aware of bias in ways that individuals cannot. Data can be collected, processes can be changed, and individuals can use these and other tools to reduce our tendency to react to patterns.

If you want to do better, or if you want to make sure HardTalk works for you and your organisation, these are the biases you should be looking out for most:

Confirmation Bias

We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups and news sources that make us feel uncomfortable or insecure about our views – the discomfort the social psychologist Leon Festinger called cognitive dissonance. It’s this preferential mode of behaviour that leads to the confirmation bias – the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions – no matter how valid – that threaten our world view. “I trust only one news channel; it tells the truth about the political party I despise.” And paradoxically, the diverse internet has only made this tendency worse; we are far more likely to stick to preferred sources than to browse for new ones. Consciously or unconsciously surrounding yourself with people who agree with you in the workplace can lead to stagnation of ideas and productivity that will ultimately be detrimental to the business.

In-group Bias

Somewhat similar to the confirmation bias, the in-group bias is a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin – the so-called “love molecule.” This neurotransmitter, while helping us to forge tighter bonds with people in our in-group, performs the exact opposite function for those on the outside – it makes us suspicious, fearful, and even disdainful of others. Ultimately, the in-group bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know. The related out-group bias leads us to perceive people who are different from us in a more negative light: “we can’t trust him – look where he grew up.”

Gambler’s Fallacy Bias

It’s called a fallacy, but it’s more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in the likelihood that the next coin toss will be tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent, and the probability of either outcome is still 50%. If your client has been late with a payment five times in a row, you might think “the odds say they can’t possibly be late again” or “that is an unavoidable pattern”; either way, you are avoiding attempts to actually address the problem.
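
A quick simulation makes the independence point concrete. This is an illustrative sketch (the `tails_after_five_heads` function and the trial count are mine, not from the article): even conditioned on five heads in a row, the sixth flip still comes up tails about half the time.

```python
import random

random.seed(42)  # reproducible runs

def tails_after_five_heads(trials=200_000):
    """Among sequences that start with five heads, how often is flip six tails?"""
    tails, opportunities = 0, 0
    for _ in range(trials):
        flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
        if all(flips[:5]):          # the first five flips were all heads...
            opportunities += 1
            if not flips[5]:        # ...and the sixth came up tails
                tails += 1
    return tails / opportunities

print(round(tails_after_five_heads(), 2))  # ~0.5 – no "correction" toward tails
```

However long the streak, the coin has no memory; the frequency never drifts toward tails.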

Relatedly, there’s also the positive expectation bias – which often fuels gambling addictions. This is the sense that our luck has to change eventually and that good fortune is on the way. It also contributes to the “hot hand” misconception – that a person who has succeeded at a random event is more likely than others to repeat that success. Of course, in reality, there is no such guarantee. Waiting for a change in luck instead of taking action, or assuming that because things have worked out in the past they will do so again, is rarely a recipe for success.

Post-Action Rationalization

Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that’s post-purchase rationalization, or post-action rationalization. It’s a kind of built-in mechanism that makes us feel better after we make crappy decisions (like at the cash register or in a HardTalk scenario). Social psychologists say it stems from the principle of commitment – our psychological desire to stay consistent and avoid a state of cognitive dissonance. When we are worried about a potentially bad decision in the workplace, it is a lot easier to find a way to justify it than to rectify it, but in the long run, overcoming this bias with HardTalk is far more effective. Rationalising your less-than-role-model behaviour in a HardTalk by, for example, blaming the other person may feel good, but having a conversation is more likely to get results.

Neglecting Probability

Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge that the probability of dying in an auto accident is significantly greater than that of getting killed in a plane crash – but our brains won’t release us from this crystal-clear logic (statistically, we have about a 1 in 84 chance of dying in a vehicular accident, compared to a 1 in 5,000 chance of dying in a plane crash [other sources put the odds as high as 1 in 20,000]). It’s the same phenomenon that makes us worry about getting killed in an act of terrorism rather than about something far more probable, like falling down the stairs or accidental poisoning.
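
Taking the article’s quoted odds at face value, the gap is easy to quantify – a sketch, with the ratio following directly from the two odds rather than from any additional data:

```python
# Odds quoted above: 1-in-84 (car accident) vs 1-in-5,000 (plane crash).
p_car, p_plane = 1 / 84, 1 / 5000
ratio = p_car / p_plane
print(round(ratio))  # ≈ 60: dying in a car is roughly 60x more likely
```

Sixty-to-one, yet the plane is the one that makes our palms sweat.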

This is what the legal scholar Cass Sunstein calls probability neglect – our inability to grasp a proper sense of peril and risk – which often leads us to overstate the risks of relatively harmless activities while understating genuinely dangerous ones. In HardTalk it stops us from speaking up when we could and, in fact, should.

Status-Quo Bias

We humans tend to be apprehensive of change, which often leads us to make choices that guarantee things remain the same, or change as little as possible. Needless to say, this has ramifications in everything from politics to economics. We like to stick to our routines, our political parties, and our favourite meals at restaurants. Part of the perniciousness of this bias is the unwarranted assumption that another choice will be inferior or make things worse. The status-quo bias can be summed up with the saying, “If it ain’t broke, don’t fix it” – an adage that fuels our conservative tendencies. The problem is that we get used to things very quickly – we’re adaptable – and so the status quo can quickly become something we live with even if we don’t like it at first. Sticking with a certain supplier because that’s the way it has always been could mean you miss out on savings or new opportunities.

Negativity Bias

People tend to pay more attention to bad news – and it’s not just because we’re morbid. Social scientists theorize that it’s on account of our selective attention and that, given the choice, we perceive negative news as being more important or profound, and more likely to be correct. Evolutionarily, heeding bad news may be more adaptive than ignoring good news (e.g. “saber tooth tigers suck” vs. “this berry tastes good”). Today, we run the risk of dwelling on negativity at the expense of genuinely good news. Steven Pinker, in his book The Better Angels of Our Nature: Why Violence Has Declined, argues that crime, violence, war, and other injustices are steadily declining, yet most people would argue that things are getting worse – a perfect example of the negativity bias at work. It also means that once we start to think ill of someone (because of the fundamental attribution bias – see below), we pay more attention to news that proves their badness, leaving no room for our perception to change in potentially positive ways.

Bandwagon Effect

Though we’re often unconscious of it, we love to go with the flow of the crowd. When the masses start to pick a winner or a favourite, that’s when our individualized brains start to shut down and enter into a kind of “groupthink” or hive-mind mentality. But it doesn’t have to be a large crowd or the whims of an entire nation; it can include small groups, like a family or even a small group of office co-workers. The bandwagon effect is what often causes behaviours, social norms, and memes to propagate among groups of individuals — regardless of the evidence or motives in support. This is why opinion polls are often maligned, as they can steer the perspectives of individuals accordingly. Much of this bias has to do with our built-in desire to fit in and conform, as famously demonstrated by the Asch Conformity Experiments. And it explains why we often avoid HardTalk when we need it the most. 

Projection Bias

As individuals trapped inside our own minds 24/7, it’s often difficult for us to project outside the bounds of our own consciousness and preferences. We tend to assume that most people think just like us – though there may be no justification for it. This cognitive shortcoming often leads to a related effect known as the false consensus bias, where we tend to believe that people not only think like us but also agree with us. It’s a bias where we overestimate how typical and normal we are, and assume that a consensus exists on matters where there may be none. It can also lead the members of a radical or fringe group to assume that more people on the outside agree with them than is the case, and it explains the exaggerated confidence we have when predicting the winner of an election or a sports match. It stops us being curious in HardTalk, which makes listening difficult.

The Current Moment Bias

We humans have a really hard time imagining ourselves in the future and altering our behaviours and expectations accordingly. Most of us would rather experience pleasure in the current moment and leave the pain for later. This is a bias of particular concern to economists (think of our tendency to overspend rather than save) and health practitioners. Indeed, a 1998 study showed that, when making food choices for the coming week, 74% of participants chose fruit. But when the choice was for the current day, 70% chose chocolate. This explains why we are inclined to put off difficult conversations again and again in order to avoid the associated pain.

Belief Bias

Deciding whether an argument is strong or weak on the basis of whether you agree with its conclusion. (“This logic can’t be right; it would lead us to make that investment I don’t like.”) This makes it even more difficult to ListenHard.

Availability Bias

Making a decision based on the information that comes to mind most quickly, rather than on more objective evidence. (“I’m not worried about heart disease, but I live in fear of shark attacks because I saw one on the news.”) This makes us dwell on the worst-case scenario (e.g. losing our job) rather than the most likely one (e.g. momentarily upsetting our boss).

Base Rate Fallacy

When judging how probable something is, ignoring the base rate (the overall rate of occurrence). (“I know that only a small percentage of start-ups succeed, but ours is a sure thing.”) Or, “I know I’ve never seen anyone get fired for disagreeing with the boss politely, but I’m sure I will be!”

Egocentric Bias

Weighing information about yourself disproportionately in making judgments and decisions – for example, about communications strategy (“There’s no need for a discussion of these legal issues; I understood them easily”) or about how to fix a problem (“this would work for me, so it would work for others”). This affects our ability to consider obstacles and FinishHard.

False Consensus Effect

Overestimating the universality of your own beliefs, habits, and opinions (“Of course I hate broccoli; doesn’t everyone?”), and so reducing the need to ListenHard.

Framing Effect

Basing a judgment on whether a decision is presented as a gain or as a loss, rather than on objective criteria. (“I hate this idea now that I see our competitors walking away from it.”)

Fundamental Attribution Bias

Believing that your own errors or failures are due to external circumstances, but others’ errors are due to intrinsic factors like character or personality. (“I made a mistake because I was having a bad day; you made a mistake because you’re not very smart.”) It is a tendency to attribute situational behaviour to a person’s fixed personality. For example, people often attribute poor work performance to laziness when there are so many other possible explanations: the individual in question may be receiving projects they aren’t passionate about, a rocky home life may be carrying over into their work life, or they may be burnt out. This bias stops us from being curious and also allows us to feel good about behaving badly because, after all, they’re bad people.

Halo Effect

Letting someone’s positive qualities in one area influence overall perception of that individual. (“He may not know much about people, but he’s a great engineer and a hard-working guy; let’s put him in charge of the team.”) This can stop us from having HardTalk that’s necessary with, for example, a friend or a star performer.

Illusion of Transparency

Overestimating the degree to which your mental state is accessible to others. (“Everyone in the room could see what I was thinking; I didn’t have to say it.”) In HardTalk this can lead us to assume that our behaviour is outweighed by the fact that we want things to go “well”. In reality, all the other person has to go on is our behaviour, so we need to WorkHard to manage it.

Loss Aversion

Making a risk-averse choice if the expected outcome is positive, but making a risk-seeking choice to avoid negative outcomes. (“We have to take a chance and invest in this, or our competitors will beat us to it.”)

Reactance

Reactance is our tendency to react to rules and regulations by exercising our freedom. A prevalent example is children with overbearing parents: tell a teenager to do what you say because you told them so, and they’re very likely to start breaking your rules. Similarly, employees who feel mistreated or “Big Brothered” by their employers are more likely to take longer breaks, extra sick days, or even steal from their company.

Sunk Costs Fallacy

Having a hard time giving up on something (a strategy, an employee, a process) after investing time, money, or training, even though the investment can’t be recovered. (“I’m not shutting this project down; we’d lose everything we’ve invested in it.”)

Temporal Discounting

Placing less value on rewards as they move further into the future. (“They made a great offer, but they can’t pay me for five weeks, so I’m going with someone else.”) This can stop us from taking actions that may not fix our situation right now but would have a great impact in the future.

Wow, that’s a lot!

According to Heidi Grant Halvorson and David Rock in strategy+business, the first step is to identify the types of bias likely to be prevalent in organisations. They sort the 150 or so known common biases into five categories based on their underlying cognitive nature: similarity, expedience, experience, distance, and safety. (They have named this the SEEDS™ model.)

1) Similarity Biases: “People like me are better than others”

According to some research, this tendency to break the world into an in-group (people like me) and an out-group (others) is so strong that just putting on different-coloured tops makes it kick in, producing greater liking for fellow members of the same team and less liking for members of another team. In fact, when we see in-group faces, there is greater activity in several brain regions involved in emotion and decision making (the amygdala, orbitofrontal cortex, and striatum).

In other words, it’s not that some people are “biased” and “bad” – this is just how our brains work. Our brains like patterns of success and want us to feel successful, so, of course, people like us become the pattern to follow. This is why we tend to perceive that those who are most like us are better than those who are not.

This can be a very dangerous bias in the workplace. We are more likely to hire in-group members – and once we hire them, we’re likely to give them bigger budgets, bigger raises, and more promotions. It can affect many decisions involving people, including which clients to work with, which social networks to join, and which contractors to hire. A purchasing manager might prefer to buy from someone who grew up in his or her hometown, just because it “feels safer.” A board might grant a key role to someone who most looks the part, rather than someone who can do the best job. The bias is unfortunate because research (for example, by Katherine Phillips) has shown that teams and groups made up of people with varying backgrounds and perspectives make consistently better decisions and execute them more effectively. And it’s unfortunate in HardTalk because you might be more inclined to talk with people “like you”, or to give them the benefit of the doubt because they are.

2) Expedience Biases: “If it feels right, it must be true”

These kinds of biases are the mental shortcuts that help us make quick and efficient decisions. They’re what System 1 (see the HardTalk HandBook, or Daniel Kahneman in Thinking, Fast and Slow) relies on to get the right “feel”. And System 1 is where most of us like to be – it’s easier and so more pleasurable.

Expedience biases tend to crop up in decisions that require just the opposite – coming to conclusions based on data, calculation, analysis and evaluation. We see it with the doctor who diagnoses a tummy bug “because there’s been a lot of it about recently” and misses the signs that would have been obvious at a different time. Or with the consultant who has “seen this type of thing a hundred times” and so isn’t listening attentively enough to find out what the client really needs. We also see it a lot with professional-services people who are experts in their field and so often leave money on the table because they haven’t really listened.

And, of course, expedience biases tend to be worse when people are stressed – our brains try to put as much on “autopilot” as possible. It’s our job to move back to System 2.

3) Experience Biases: “My perceptions are accurate”

Our brains have evolved to assume that what they perceive to be true is, in fact, true. We assume that what we see is the truth, the whole truth and nothing but the truth, but, as we saw earlier, unconscious filters mean that we all interpret the truth differently. This kind of bias is very hard to work against: the conviction your brain has created makes it difficult to resist even when you logically know that your perceptions are just a result of your filters.

The problem is that when you are absolutely sure you are right, then the other person must be wrong. Or stupid. Or crazy. Or dishonest. Or something else bad. And we don’t need to care about bad people. We don’t need to listen to bad people. So we don’t need to have HardTalk.

4) Distance Biases: “Near is stronger than far”

Research has shown that one network in the brain registers all kinds of proximity – not just space and time, but also conceptual proximity, such as a project due this week versus a project due in a month. The closer an object, an individual, or an outcome is in space and time, the greater the value assigned to it, and vice versa. Temporal discounting means that when we are offered USD 100 today or USD 150 in three months, we are very likely to choose the smaller but more immediate amount – which, assuming no crazy levels of inflation, makes little sense, as the later option is a guaranteed 50% return in three months – something you’d imagine we’d like! Distance bias also contributes to a tendency towards short-term thinking instead of long-term investment. And it can lead us to neglect people or projects that aren’t in our own backyard – a particular problem when we’re in complex organisations and situations and need everyone working across silos, platforms and geographies.
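
The arithmetic behind that choice is worth spelling out. A minimal sketch (the variable names are mine; the USD 100/150 figures are the ones used above):

```python
# Choosing USD 150 in three months over USD 100 today is, in effect,
# a guaranteed 50% return on the money for that quarter.
now, later = 100, 150
quarterly_return = (later - now) / now        # 0.50, i.e. 50% in 3 months
annualized = (1 + quarterly_return) ** 4 - 1  # compounded over four quarters
print(f"{quarterly_return:.0%} per quarter, ~{annualized:.0%} annualized")
```

Compounded, that quarterly rate works out to roughly a 400% annual return – a deal almost no investment offers, yet temporal discounting nudges us to turn it down.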

5) Safety Biases: “Bad is stronger than good”

The fact that we react more strongly to losses than to gains is a safety bias. This loss aversion makes sense if we consider that we evolved this way: our great, great, great (etc.) grandparents, and all the ones who came after them, were the ones who responded quickly to a threat by running away, rather than the ones who thought, “hmmm, interesting rattling noise coming from under that stone. I’m sure it’ll be something delicious we can eat, and I’ll be a hero.”

Safety biases can influence any decision about the probability of risk or return, or the allocation of resources, including money, time, and people. These biases affect financial decisions, investment decisions, resource allocation, strategy development, and planning for strategy execution. Examples include not being able to let go of a business unit because of the resources already invested in it, and not being willing to innovate in a new direction because it would compete with the company’s existing business.

Safety biases can change our decisions if those decisions are about risk/return or the allocation of any resources. In other words, the work of leaders and many managers, not to mention empowered individual contributors. 

Crucially, they can also stop us from having HardTalk, as we focus on what could go wrong rather than what could go right. Perhaps you’re thinking that this isn’t true for you – that you’re interested in winning – and you’re right. We all want to win, but most of us – probably including you – are more influenced by wanting to avoid losing. This is why framing works. If we present an opportunity as a gain, we focus people (and ourselves) on the risks involved. However, if the same decision is framed as a way to avoid a loss, people are more likely to ignore or justify the risk – even if the information is the same.

One last thing…

Bias blind spot. If you begin to feel that you’ve mastered your biases, keep in mind that you’re most likely experiencing the bias blind spot. This is the tendency to see biases in other people but not in yourself.

Awareness is key

Awareness is the best way to beat these biases, so pay careful attention to how they influence you. Take a look back at recent actions and decisions and try to determine whether any of your biases contributed. If you’d like to learn more about HardTalk – the science and art of mastering the difficult conversations that make a difference to results, for people and organisations – including how to manage our brain’s tendency towards bias, you can go to www.hardtalk.info, watch the HardTalk video at https://www.youtube.com/watch?v=aMcockHARI4, or get in touch at letstalk@hardtalk.info.

References:

Some of the above is taken from George Dvorsky’s article, which can be found at http://io9.com/5974468/the-most-common-cognitive-biases-that-prevent-you-from-being-rational, and from work done by Heidi Grant Halvorson and David Rock: https://www.strategy-business.com/article/00345?gko=d11ee