“Thinking, Fast and Slow” at Work


Not all workplace performance issues have to do with human motivation, behavior, thinking, and decision making, but plenty of them do.

As a result, if you’re in any way interested in workplace performance, it’s helpful to know more about what motivates people (see this article on workers and motivation), how people behave, how they think, and how they make the decisions they do. This is true if you’re in HR, it’s true if you’re in learning and development, it’s true if you’re in operations, it’s true if you’re in health and safety–it’s true no matter what you do at work.

And that’s why it’s helpful to study fields concerned with human thought, behavior, and decisions in addition to what you may think of as your core field. Psychology, sure, but even something like anthropology can be very helpful.

And that’s also why we’re interested in behavioral economics. What is behavioral economics, you ask? It’s a blending of economics and psychology that considers why people make the decisions they make (which are often not in their best interests). You may have caught our earlier article discussing Dan Ariely’s book The Upside of Irrationality, or perhaps you caught our more recent article based on a book by the folks at Freakonomics. These are both works of behavioral economics.

But as popular as something like Freakonomics is, the real big kahuna, the grand poobah of behavioral economics, is arguably Daniel Kahneman. He won the Nobel Prize in Economic Sciences, after all.

And in this article, we're going to take a quick look at Kahneman's classic book Thinking, Fast and Slow for insights into why people think and decide the way they do, so you can apply those insights to create a more productive, efficient workplace.

Two Systems for Thinking: One Fast, One Slow

Kahneman says we can think of the mind as if it has two separate systems (or modes) of thinking, which he calls System 1 and System 2.

Here’s how he explains the two systems:

System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.

System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

Source: Kahneman, Thinking, Fast and Slow, Chapter 1, “The Characters of the Story”

He goes on to say that we like to believe that we primarily think in a System 2 way–we’re rational, we reason, and we use those abilities to make wise, informed decisions that are under our own control and are in our best interests.

But even though that’s a nice story, one we love to tell ourselves, it’s not really all that true.

System 1: The Fast System for Thinking

The fast system for thinking, System 1, is essentially automatic and requires little or no effort. System 1 includes skills we're born with, such as perceiving the world and recognizing objects around us, as well as skills we learn that become automatic through repetition and practice (such as completing the phrases "peanut butter and _____" or "2 plus 2 = _____").

There are a lot of benefits to System 1 thinking. It allows us to react quickly in certain situations–running away when we see a large predator walking through tall grass, for example. System 1 is also useful because it doesn’t require our attention and effort–and that is often a good thing, since our attention is limited and handling mental tasks with System 1 reduces the risks of overwhelming our working memory and experiencing cognitive overload.

System 1 often uses heuristics (which Kahneman defines as "roughly, a rule of thumb") to make decisions quickly without requiring attention and effort. And in many cases, these heuristics lead to good decisions.

The problem is that these heuristics can also lead to mistakes. In many cases, heuristics give rise to cognitive biases, and as a result of those biases, we make inaccurate or outright wrong decisions, often without being aware of our errors. One example Kahneman gives: when faced with a decision such as electing a president, instead of thoughtfully analyzing the candidate's positions, we might default to a snap judgment based on how the candidate looks.

System 2: The Slow System for Thinking

Kahneman describes the slow system for thinking, System 2, this way:

System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

System 2 is the thought process that requires attention, concentration, reflection, and hard work.


Our Lazy Brains

Kahneman goes on to explain that brains are essentially “lazy,” and that the brain’s default mode is to apply System 1 decision making.

There are many benefits to using System 1 as the default thinking and decision-making process. For example, because System 2 requires attention and concentration, relying on it constantly would prevent us from paying attention to anything else, leaving us oblivious and even vulnerable.

However, using System 1 as the default isn’t always a great thing (especially since we don’t realize we do it and since we tend to live under the fiction of being highly reasoned, reflective, critical, analytical thinkers). System 1 often causes us to slip into cognitive biases, meaning we make poor decisions (but make them quickly).

Here’s how Kahneman puts it:

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specific circumstances. As we shall see, it sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics. One further limit of System 1 is that it cannot be turned off. If you are shown a word on the screen in a language you know, you will read it–unless your attention is totally focused elsewhere.

Applying Lessons from Thinking, Fast and Slow at Work

It's helpful to be aware of these lessons from Kahneman's Thinking, Fast and Slow because that awareness can sometimes help us avoid mistakes we'd otherwise make.

For example, let's revisit the long quote that ends the previous section, paying particular attention to three points:

  • “System 1 has biases, however, systematic errors that it is prone to make in specific circumstances.” This could apply in many work scenarios, but let’s consider a safety manager performing an incident investigation. While performing the investigation, and while feeling very reasoned and analytical, the safety professional may still fall prey to any number of common cognitive biases. Hindsight bias, for example, is one reason why safety professionals so often determine that the root cause of a safety incident was “human error” and/or a lack of situational awareness. Quick hint: We talk about this issue more in our Introduction to New Safety recorded webinar and give a possible alternative in our article on using learning teams for safety incident investigations.
  • “…it [System 1] sometimes answers easier questions than the one it was asked…” This could apply to a learning professional who’s been tasked with determining the effectiveness of a training program. The learning professional might attempt to do this simply by determining how many employees attended a training session and/or what percentage of employees completed the training session. But of course, this “butts-in-seats” training data doesn’t tell us if employees learned from the training, if they developed skills, and/or if the training truly helped move the organization toward a business goal. Likewise, a safety professional may struggle to determine a meaningful measure for safety in an organization–you’re probably familiar with the controversy about measuring safety by using incident rates.
  • “…it [System 1] has little understanding of logic and statistics.” I’ll leave you to think of situations at work when people didn’t use logic while attempting to solve a problem or create a plan. That said, I’ll give you a tip: it’s easier to spot in others, but you’re probably doing it, too (and they can spot it in you). That’s why group discussions in which ideas are exchanged and you get exposed to different views can be meaningful. As for System 1’s difficulty with statistics, in the coming era of big data and data analytics, we’re all going to need to be aware of this problem and become better, more careful, more logical, and more reflective consumers of data and analytics.

Making the Best of System 1 With a Nudge

Many times at work, it’s best to build in opportunities that force us to be more reflective and analytical, moving us from System 1 to System 2.

However, we can’t spend our whole lives in System 2. It is too hard and it takes too much time.

So in some cases, you can take advantage of something like nudge theory to harness System 1 for the forces of good. This SlideShare by Arun Pradhan, about using nudge theory to go beyond training for performance improvement at work, has some great ideas along those lines.

Conclusion: Put Lessons from Thinking, Fast and Slow to Work at Your Workplace

I hope you’ve found this introduction to Daniel Kahneman’s classic book Thinking, Fast and Slow to be interesting. If you did, I definitely recommend you read the full book.

Please use the comments section below to post your own thoughts or to ask any questions you may have.

And don’t forget to download our free infographic of the Plan-Do-Check-Act (PDCA) cycle, commonly used for quality control, project planning, and continuous improvement.
Jeffrey Dalto

Jeffrey Dalto is an Instructional Designer and the Senior Learning & Development Specialist at Convergence Training. He's worked in training/learning & development for 20 years, in safety and safety training for more than 10, is an OSHA Authorized Outreach Trainer for General Industry OSHA 10 and 30, has completed a General Industry Safety and Health Specialist Certificate from the University of Washington/Pacific Northwest OSHA Education Center, and is a member of the committee creating the upcoming ANSI Z490.2 national standard on online environmental, health, and safety training.
