How to avoid cognitive biases when you get paid to think

One of the major findings of the last 50 years of psychology has been what people had suspected all along: human thinking and judgment often isn’t rational. By this I mean that, given a situation where someone has to make a decision, she will often take the decision that “leaps” to her immediately rather than a decision that incorporates the structure of the problem, the data immediately available, and the data that should be collected. In many cases, intuition and reasoning arrive at the same decision, so it isn’t an issue. But, stating the obvious, in many other cases intuition leads to a decision that looks worse in retrospect. How often have you said to yourself: why did I NOT think of it earlier?

This flaw in our decision making is worrisome because our intuition can be “hacked” by others to make us behave in ways that aren’t in our favor. This motivated framing and packaging of information happens all the time in the real world, from Tinder to LinkedIn profiles to what’s written on a tub of yogurt. What’s left out is often more important than what’s communicated. As a quick illustration, if there were two yogurt brands on the shelf, one saying “80% fat-free” and the other saying “20% fat added”, which one would you pick? When a sales manager proposes a month-end discount to boost sales and tells you that “revenue today is better than revenue tomorrow”, do the second- and third-order consequences of discounts on your company’s reputation jump to your mind? What if the sales manager had framed his proposal as: “giving discounts will destroy our reputation as a company that’s confident about its products, but we will absolutely increase revenue. Are we OK with that?”

Both “B” and “13” are the same shape. Other people can “hack” your decisions by presenting them to you in a certain package.

Such framing effects are extremely common in daily life and business. The issue isn’t that intuition is bad; it serves us extremely well in everyday situations. When the stakes are low, you wouldn’t want to “reason” about which ice cream to eat (I do that, but then I’m finicky about ice creams). My concern is that we only know the stakes in retrospect. Oh-shit moments happen when we use our intuition on seemingly low-stakes decisions that turn out to be consequential.

This is a dangerous Catch-22: to decide to reason carefully, you often need to know the stakes (is it worth so much “thinking”?), but you only get a good estimate of the stakes when you reason (our intuitive estimate of the stakes could be wrong, and we wouldn’t even know). It’s like a rider deciding that because it’s a short ride to the office, not wearing a helmet is OK.

As the CEO of a 200+ person company (Wingify), this topic is important to me because a seemingly inconsequential decision that I make through intuition could have major second-order effects that didn’t occur to me. With each such mistake, my intuition gets better (so does yours), but I would prefer to minimize such mistakes. What if a second-order effect becomes an existential threat to the company? As an example, I once made the seemingly inconsequential decision of removing sugary drinks (Coke and Pepsi) from the company’s canteen. To my surprise, people took it to be a cost-cutting measure rather than a health-related one. I thought I was making a simple decision for the benefit of the team, but it never occurred to me that the same action could lead to multiple different (and all valid) interpretations.

Now I know that “experience” in business is an accumulation of mistakes that a person has used to expand his or her “circle of available information”. With reflection on mistakes, what leaps to the mind expands, so experienced people make better decisions. Without reflection, you’ll keep repeating the same mistakes. Each mistake is an opportunity to train your intuition. (This is one of the reasons I started Inverted Passion. Writing forces me to reflect on my experiences.)

Avoiding cognitive biases

Kahneman (author of the bestseller Thinking, Fast and Slow) is the de facto name that leaps to mind when talking about cognitive biases. (I cited his paper on Prospect Theory in a previous article on startup failures.) In another paper, “Maps of Bounded Rationality: Psychology for Behavioral Economics”, he presents research-backed methods of avoiding cognitive biases. (The paper is extensive and contains many excellent ideas beyond avoiding cognitive biases. I highly recommend reading it.)

Evidence from research on avoiding cognitive biases

Hack #1: Avoid making decisions under time pressure

Our ability to detect and correct an error in judgment significantly worsens when we’re deciding under time pressure. Only in very rare cases is there no choice but to decide immediately; even in a road accident, it would be wise to think for a moment before reacting. At work, this limitation of time on decision making is usually self-inflicted. Vacuous statements such as “we must have a decision by the end of this 30-minute meeting” or “speed is all that matters” actively work against good decision making. I know these statements are vacuous because I used to use them, until I started reflecting and asking why a meeting should end with a decision when none is apparent. There’s no wisdom in taking a fast decision that proves to be inaccurate.

In his 2016 letter to shareholders, Jeff Bezos offered advice contrary to what I mention:

Some decisions are consequential and irreversible or nearly irreversible – one-way doors – and these decisions must be made methodically, carefully, slowly, with great deliberation and consultation. If you walk through and don’t like what you see on the other side, you can’t get back to where you were before. We can call these Type 1 decisions. But most decisions aren’t like that – they are changeable, reversible – they’re two-way doors. If you’ve made a suboptimal Type 2 decision, you don’t have to live with the consequences for that long. You can reopen the door and go back through. Type 2 decisions can and should be made quickly by high judgment individuals or small groups.

Of course, this works for him. He has assimilated hundreds of mental models into his intuition over the last 20+ years of running Amazon, so what leaps to him intuitively will be very different from (and probably a superset of) what leaps to you. (A meta-lesson here: always think about the contexts in which a piece of business advice will fail. Yes, the irony of this advice isn’t lost on me.)

At work, because the consequences of your decisions are unknown upfront, a good “hack” is to avoid making decisions under time pressure whenever you are in a role that impacts many other people. A CEO’s actions have a multiplicative effect on employees, customers, and shareholders. A software engineer’s actions impact all customers (security loopholes are usually introduced by seemingly innocent changes). A salesperson, on the other hand, can do limited damage (provided contracts are vetted by a paranoid lawyer).

Hack #2: Avoid making decisions when you are cognitively involved in a different task

Research shows that performance on puzzle-style tasks drops if participants are asked to keep some numbers in their head while solving the puzzle. This problem compounds at work because the people with the highest capacity to cause damage through a bad decision (CEOs, VPs, top managers) are usually the ones who get pulled into the most meetings. If you have ever been in a meeting with an agenda that matters to you (and not one you attended just because your boss asked you to), you know how mentally taxing it can be. Inevitably, the hangover of an unresolved problem from one meeting follows you into the next, and you do worse in both.

For anyone who earns a living applying their brain to problems, cognitive bandwidth is the limiting factor for value creation. What matters isn’t how many emails you reply to or how many lines of code you write; what matters is the quality of your thinking and decisions.

A hack that I use at work is blocking a chunk of time for myself in the calendar just to think. I don’t check email or attend any meetings during that time. Paul Graham calls this the maker’s schedule, and it has worked wonderfully for me (I was inspired by Cal Newport’s Deep Work to adopt this technique). On top of this, I actively try to attend only one important meeting per day. I’m not a fan of a jam-packed schedule because I know I won’t be able to add the value that’s expected of me after the first meeting (and nobody gives me brownie points for just showing up and doing nothing in a meeting).

Hack #3: Don’t make decisions in the evening if you are a “morning person” (and vice versa)

Different people feel fresh at different times of the day. Some are night owls; I’m a morning person and know that I can’t function effectively in the evening. Know when you feel most productive and block that time for yourself. This is why my maker hours run from 10:30am to 2pm, and I avoid working after 7pm. One of the rookie mistakes I made as a young CEO was scheduling day-long meetings with an extensive agenda. It was a big mistake because all the brainstorming happened in the morning, when I (and many other people) felt energetic, but by the time it came to making decisions it would be evening. I didn’t realize that this meant taking decisions with a tired mind (and those aren’t necessarily good ones). If only I had questioned the good ol’ advice on meetings and understood when it works and when it doesn’t.

Hack #4: Watch out if you are happy about a decision

This one is counter-intuitive, but research indicates that happy people make worse decisions. The reason isn’t that happy people have reduced cognitive capacity; it’s that they are less able to detect errors in their intuition (and because the errors go undetected, deliberate reasoning never gets a chance to correct them). This happens simply because we avoid sadness: in decisions whose success matters to us, we latch on to any positive evidence that leaps to us, while the negative evidence doesn’t even get registered. Entrepreneurs need to be especially careful here because they so desperately want to succeed that they overinterpret evidence in their favor. If you send an email to 100 people and one responds positively, your attention automatically focuses on the good things the respondent says and not on the fact that 99 people chose not to reply. The absence of data is a data point to interpret.



Being skeptical about decisions that seem obvious helps us search for ways something could go wrong (the ways it could succeed come automatically if we’re invested in the decision). It’s an extremely difficult hack to apply because it is our ego that gets attacked by this style of thinking. What helps me is to list everything about a decision that’s making me happy and then actively force myself to write down all the reasons the same thing could fail and make me unhappy. (Just thinking doesn’t cut it, as I start justifying why the failures wouldn’t happen; for me, I have to write.) After writing the pros and cons, I don’t act immediately but wait until I’ve been cognitively involved with something else. When I revisit, I see the same list with a fresh pair of eyes and am able to interpret it more objectively than before (like a disinterested party).

This hack also works for new ideas that I come across. If I get excited about an idea, I Google “[idea] vs” or “[idea] criticism”, and Google autocompletes with opposing perspectives that my excited self wouldn’t otherwise get to know. (See, for example, “blockchain criticism” or “universal basic income criticism”.)

Watch out for your happiness, especially when you desperately want something to succeed. Who said thinking better was easy, anyway?

Hack #5: Train yourself to think statistically

Statistical thinking is taxing, so it doesn’t come to us intuitively. Humans tend to overweight small probabilities and underweight large ones, and to round very low probabilities down to impossibility and very high ones up to certainty. Even doctors fall prey to non-statistical thinking. Consider this simple situation (one that can very well happen in the real world).

Approximately 1% of women aged 40-50 have breast cancer. 80% of mammograms detect breast cancer when it is there (and therefore 20% miss it). 9.6% of mammograms detect breast cancer when it’s not there (and therefore 90.4% correctly return a negative result).

You take a mammogram and it comes out as positive. What is the probability that you have cancer?

Before reading on, say out loud what your gut feeling is. Done? Now close your eyes, reason for a bit, and then decide. What’s your answer now?

The average answer from physicians, the people responsible for guiding patients, is close to 75%. What’s your answer? (The actual probability that you have cancer given a positive test is less than 10%, roughly 7.8% with these numbers. Surprised?)
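To see why, here is a minimal sketch of the Bayes’ theorem calculation in plain Python, using only the numbers stated above:

    # Numbers from the mammogram example above
    p_cancer = 0.01              # prior: 1% of women aged 40-50 have breast cancer
    p_pos_given_cancer = 0.80    # sensitivity: test is positive when cancer is there
    p_pos_given_healthy = 0.096  # false positive rate: test is positive when cancer isn't there

    # Bayes' theorem: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
    p_positive = (p_pos_given_cancer * p_cancer
                  + p_pos_given_healthy * (1 - p_cancer))
    p_cancer_given_positive = p_pos_given_cancer * p_cancer / p_positive

    print(round(p_cancer_given_positive, 3))  # ~0.078, i.e. roughly 7.8%

Intuitively: out of 1,000 women, about 10 have cancer and 8 of them test positive, while roughly 95 of the 990 healthy women also test positive, so a positive result is far more likely to be a false alarm than a true detection.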

I recommend getting familiar with the Bayesian style of thinking, as it will help you make better decisions in a variety of situations:

  • Sales of a product are increasing; should you hire more salespeople? (Here, knowledge of random fluctuations and of the difference between correlation and causation helps.)
  • The average customer satisfaction score came out to be 80%. Is that cause for celebration? (Here, what helps is knowing that averages destroy information about the distribution.)
  • 90% of people who start a business fail within 2-3 years. Should you take the risk? (Here, what helps is taking 90% as a Bayesian prior, then incorporating evidence about the factors behind failure and deciding which of those factors work for you and which against you; a rough sketch of this kind of updating follows this list.)
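To make the last point concrete, here is a minimal sketch of Bayesian updating with likelihood ratios. The specific factors and numbers below are made-up assumptions purely for illustration, not research-backed estimates:

    # Base rate: 90% of new businesses fail within 2-3 years
    prior_fail = 0.90
    prior_odds_fail = prior_fail / (1 - prior_fail)  # 9-to-1 odds of failure

    # Hypothetical likelihood ratios: how much more likely each observation is among
    # businesses that fail vs. those that survive. Values below 1 favor survival.
    # These numbers are illustrative assumptions, not data, and multiplying them
    # assumes the pieces of evidence are roughly independent.
    evidence = {
        "founders have deep domain experience": 0.6,
        "already have paying customers": 0.5,
        "12+ months of runway in the bank": 0.7,
    }

    posterior_odds_fail = prior_odds_fail
    for observation, likelihood_ratio in evidence.items():
        posterior_odds_fail *= likelihood_ratio

    posterior_fail = posterior_odds_fail / (1 + posterior_odds_fail)
    print(round(posterior_fail, 2))  # ~0.65 with these made-up numbers

The point isn’t the specific number; it’s that the 90% base rate is a starting point that evidence about your particular situation should move up or down, not a verdict.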

How do YOU avoid cognitive biases?

I’d love to know your hacks for avoiding cognitive biases.

Tweet your response to me as a reply to this thread and I’ll retweet the most intriguing responses. In the same thread, you can also check out and comment on what others proposed.
