Sunday, July 13, 2025

My Review of Thinking, Fast and Slow + Connections with Other Books and Ideas

This is a summary of my personal notes on Thinking, Fast and Slow. I took notes on every chapter and connected them with concepts from other books.

First, this book has a LOT of redundant content and could easily have been half the size. But here's the thing: the information is so incredibly valuable that I can't give it anything less than 5 stars.

Kahneman exposes how our minds trick us in ways we don’t even realize. It’s a dense journey, but the takeaways fundamentally change how you see the world and yourself. This isn't just a book review; this is a full dissection of the concepts that stuck with me, complete with all the connections I made.

The Two Systems: The Lazy Master and the Eager Slave
Everything starts here. We have two systems. System 1 is fast, always active, and automatic. You can't stop it from reading a sentence you see. It's your gut reaction. System 2 is the slow, deliberate, energy-demanding thinker. It's what you use to solve a complex math problem or control your anger. You can feel the effort; your pupils literally dilate, a clear sign of cognitive load.

But System 2 is lazy. It gets easily overloaded, like in the gorilla experiment where you miss the obvious because you’re focused on something else. The craziest part is the rationalization. Most of the time, our "rational" System 2 isn't making a decision; it’s just stepping in to justify whatever impulsive bullshit System 1 already decided. It thinks it’s the hero, but it’s just the press secretary.

The Illusion of Understanding: We Believe Our Own Bullshit Stories
What really struck me was the illusion of understanding we all suffer from. We are story animals. We crave coherent narratives, a perfect example of the Narrative Fallacy. We create neat stories about why things happened, but we completely ignore what didn't happen, which is often more important.

This explains the Illusion of Validity. Our confidence isn't a measure of accuracy; it just reflects how coherent and convincing our internal story is. The book's story about the Israeli army leadership test drives this home: the evaluators stayed completely confident in their assessments even after the statistics showed their predictions were barely better than random chance. Confidence is a feeling, not a fact.

This bleeds into everything. We judge decisions based on their results (Outcome Bias) instead of the process. We slap a halo on successful people and horns on failures (the Halo and Horns Effect), judging all their traits by a single outcome. A CEO gets credit for success that is largely luck, and we call him a visionary. And don't get me started on business books. They're mostly incentive-driven garbage, selling success recipes that are just stories of luck and Regression to the Mean. Kahneman's estimate: knowing which of two firms has the stronger CEO improves your odds of picking the more successful one from a coin-flip 50 percent to only about 60 percent. The rest is noise.

Our brains even rewrite the past. Hindsight Bias means we revise our view of the past once we know the outcome, making events seem obvious in retrospect. As I learned from A Thousand Brains, our view of the past is constantly changing. The only way to see our real past ideas is to write them down, which is a huge reason I journal.

Our Brains Suck at Math: The Statistical Traps We Can't See
Our brains are terrible at statistics. We fall for the Law of Small Numbers all the time, seeing patterns in tiny samples that are just random noise. The Gates Foundation did this when it invested billions in small schools because they had the best results, completely ignoring that small schools also had the worst results. They didn't use inversion to see the full picture. Smaller samples simply produce more extreme results, in both directions.
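
A quick simulation makes this concrete. Here's a sketch I put together to convince myself (all the numbers are made up; every school draws students from the exact same ability distribution, and only school size differs):

```python
import numpy as np

rng = np.random.default_rng(42)

# 500 small schools (20 students) and 500 big ones (500 students).
# Every student is drawn from the SAME distribution: no school is better.
sizes = [20] * 500 + [500] * 500
avg_scores = [rng.normal(100, 15, n).mean() for n in sizes]

ranked = sorted(zip(avg_scores, sizes), reverse=True)
print("small schools in the top 20:   ", sum(1 for _, n in ranked[:20] if n == 20))
print("small schools in the bottom 20:", sum(1 for _, n in ranked[-20:] if n == 20))
# Both counts come out at or near 20. Small samples dominate BOTH extremes,
# with zero difference in underlying quality.
```

Run it and the "best" and "worst" schools are almost all small ones, purely because a 20-student average bounces around far more than a 500-student average.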

This is the same reason for the Linda problem. A detailed, plausible story about Linda being a feminist makes us think it's more probable she's a "feminist bank teller" than just a "bank teller," even though that's logically impossible: every feminist bank teller is also a bank teller. Our brain confuses plausibility with probability. It's also why we'd value a set of 10 perfect cups more than a set of 12 that includes the same 10 perfect cups plus two broken ones. Less is more.
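
The conjunction rule is just counting, and a toy head-count (numbers invented) shows why the detailed story can never win:

```python
# Invented toy population, just to make the conjunction rule visible.
population = 1000
bank_tellers = 50             # some of them...
feminist_bank_tellers = 40    # ...are also feminists

p_teller = bank_tellers / population                    # 0.05
p_feminist_teller = feminist_bank_tellers / population  # 0.04

# Every feminist bank teller IS a bank teller, so the conjunction
# can never beat the single category: P(A and B) <= P(A), always.
assert p_feminist_teller <= p_teller
```

No matter what numbers you plug in, the subset can't outnumber the set it belongs to.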

Then there's Regression to the Mean. Extreme performances tend to be followed by more average ones. The story of the flight instructor who believed yelling at a cadet after a bad landing worked is a perfect example. The cadet wasn't improving because of the yelling; his next performance was just statistically likely to be closer to his average. Our brains want a causal story, but sometimes it's just math.
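
You can reproduce the instructor's illusion with no yelling at all. A sketch (invented scores; each landing is just skill plus noise, and the second attempt is completely independent of the first):

```python
import numpy as np

rng = np.random.default_rng(0)

skill = 70                                       # the cadet's true average
attempt_1 = skill + rng.normal(0, 10, 100_000)   # 100k simulated first landings
attempt_2 = skill + rng.normal(0, 10, 100_000)   # second landings: independent!

bad_first = attempt_1 < 50                       # the landings that got yelled at
print("avg score of the bad landings:", attempt_1[bad_first].mean())  # ~46
print("avg score on the NEXT landing:", attempt_2[bad_first].mean())  # ~70
# A big "improvement" after every yelling session, even though the yelling
# can't possibly matter here: attempt 2 never sees attempt 1.
```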

Heuristics and Biases: The Mental Shortcuts That Lead Us Astray
The Availability Bias chapter was eye-opening. We use mental shortcuts, or heuristics, all the time. We overestimate the probability of recent or vivid events, like terrorist attacks, just because they come to mind easily. This can lead to an Availability Cascade, where media reports on a minor risk create public panic, which creates more media reports, until we're making policy based on fear instead of facts. The Alar apple scare, where a suspected carcinogen sprayed on apples set off a nationwide panic, is a perfect example where the "cure" was worse than the disease.

And it has a bizarre side effect: asking someone to retrieve twelve examples of their own assertive behavior makes them feel less assertive than if you asked for only six. The difficulty of retrieving the last few examples dominates their perception. It’s so counterintuitive but makes perfect sense.

Then you have the Anchoring Effect. That first number you hear in a negotiation has a ridiculously powerful pull, just like Chris Voss explains in Never Split the Difference. Our brains use it as a reference point and don't adjust nearly enough.

The Focusing Illusion is another big one: when we think about something, it gains immense importance. Ask people about their marriage first and then about their overall happiness, and the happiness score swings toward how the marriage is going. We overestimate how much material things will make us happy because we focus on them when we're asked, but we don't actually think about our car that much in daily life.

Prospect Theory: We're Not Rational, We're Relative & Afraid of Loss
The book demolishes the idea of the rational "Econ." We aren’t absolute; we are relative animals, as Charlie Munger would say. Bernoulli’s theory of utility was a good start, but it missed the most important piece: the reference point. Our decisions are driven by changes from our baseline, not absolute states.

This is the core of Prospect Theory. We feel the pain of a loss roughly twice as strongly as the pleasure of an equivalent gain; Kahneman puts the loss-aversion ratio at about 1.5 to 2.5. This asymmetry explains so much. It's why the framing of a question changes everything. When a problem is framed as saving 200 people, we become risk averse and take the sure thing. When it's framed as 400 people dying, we become risk seeking to avoid the sure loss, even if the outcomes are identical. Neuroscience shows this: the emotional choice lights up the amygdala, while the rational one requires the frontal cortex to work harder.
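
Kahneman and Tversky even wrote the asymmetry down as a function. A minimal sketch using the standard functional form (the parameter values are the commonly cited Tversky-Kahneman 1992 estimates, not numbers quoted in this book):

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect theory value function: x is a gain/loss relative to a reference point."""
    if x >= 0:
        return x ** alpha            # gains: concave, so we're risk averse
    return -lam * (-x) ** beta       # losses: steeper and convex, so we gamble

print(round(prospect_value(100), 1))   # gaining $100 feels like  +57.5
print(round(prospect_value(-100), 1))  # losing  $100 feels like -129.5
```

The same $100 hurts about 2.25 times more as a loss than it pleases as a gain, and everything is measured from the reference point, not from zero.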

This also explains the Sunk Cost Fallacy and why we’re so bad at "keeping score." We don't just see money as utility; we see it as personal merit. We hate closing a "mental account" with a loss. This happened to me with crypto. I’d sell Bitcoin for a quick profit but hold onto losing micro coins, hoping they’d turn around. The rational move is to sell the losers and keep the winners. It's a lesson I'm glad I learned early. The fear of regret is a powerful, irrational force.

The antidote is to adopt a Broad Lens. A single coin flip to win $200 or lose $100 is unattractive. But if you were offered that bet 100 times? You’d take it every time. As the Stoics taught, and Taleb preaches, you have to see each decision as part of a larger portfolio. That's why I'm adding a question to my decision template: "If I could make this decision 100 times, how would that change my choice?"
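
The math behind the broad lens is easy to verify. A sketch of that exact bet (win $200 on heads, lose $100 on tails), played 100 times:

```python
import numpy as np

rng = np.random.default_rng(7)

# 100,000 simulated people each take the +$200/-$100 coin flip 100 times.
flips = rng.integers(0, 2, size=(100_000, 100))
totals = np.where(flips == 1, 200, -100).sum(axis=1)

print("average total winnings:", totals.mean())        # ~$5,000 (EV = $50/flip)
print("chance of ending down: ", (totals < 0).mean())  # ~0.0004, well under 0.1%
```

One flip offers a 50 percent chance of a painful loss. A hundred flips offer a near-certain gain. Same bet, different frame.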

Intuition: Experts, Formulas, and When to Actually Trust a Gut
This section was incredible. First, the chapter on Intuitions vs. Formulas might be one of the best I’ve ever read. Study after study shows that simple algorithms, using just five or six equally weighted variables, consistently outperform the intuition of human "experts." They even beat complex regression models. Why? Simplicity is robust. Fewer dimensions mean less noise, a perfect example of Occam's Razor.
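
Robyn Dawes, whom Kahneman cites here, called this "the robust beauty of improper linear models." Here's a toy sketch of the idea (entirely synthetic data with invented effect sizes, not one of the book's actual studies): fit an "optimal" regression on the kind of small sample experts actually see, then compare it out of sample against a model that just adds the predictors up with equal weights.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_data(n, noise=3.0):
    X = rng.normal(size=(n, 6))                    # six standardized predictors
    true_w = np.array([1.0, 0.9, 0.8, 0.7, 0.6, 0.5])
    return X, X @ true_w + rng.normal(0, noise, n)

X_train, y_train = make_data(30)      # small, noisy sample: the usual situation
X_test, y_test = make_data(10_000)

# "Optimal" weights, fit by least squares on the small sample.
w_ols, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

def corr(pred, y):
    return np.corrcoef(pred, y)[0, 1]

print("fitted regression, out of sample:", corr(X_test @ w_ols, y_test))
print("equal weights, out of sample:    ", corr(X_test.sum(axis=1), y_test))
# The dumb equal-weight score does as well or better: the regression burned
# its 30 data points learning noise instead of weights.
```

Getting the sign of each variable right is almost all of the value; the precise weights barely matter, and a small sample will estimate them badly anyway.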

So when can you actually trust expert intuition? Only under specific conditions. Intuition is just rapid pattern recognition, and it only works in an environment that is regular and predictable, with endless opportunities for practice, quick and clear feedback, and real skin in the game. A chess master or an anesthesiologist? Trust their gut. A stock picker or a political pundit predicting long term outcomes? Not so much. Their environment is too random and the feedback loops are terrible.

The Two Selves & Well-Being: Your Memory is a Lying Tyrant
This concept ties it all together. We have two selves: the Experiencing Self that lives in the moment, and the Remembering Self that tells the story afterward. And the Remembering Self is a tyrant. It doesn't care about the duration of an experience; it only cares about the peak (the most intense moment) and the end.

This is the Peak End Rule. It’s why a longer colonoscopy with a less painful ending is remembered as "better" than a shorter one. It’s why my mom was right about public speaking: you HAVE to optimize the start and the end. I used to optimize the beginning and middle of my pitches, but fuck the endings. No more.
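
The rule even reduces to a formula: the remembered score of an episode is roughly the average of its most intense moment and its final moment, and duration barely registers. A minimal sketch (the pain traces are invented to mimic the colonoscopy result):

```python
def remembered(pain):
    """Peak-end rule: memory ~ average of the worst moment and the last moment."""
    return (max(pain) + pain[-1]) / 2

short_procedure = [2, 4, 8, 7]                   # stops right at the painful part
long_procedure  = [2, 4, 8, 7, 5, 4, 3, 2, 1]    # same peak, plus a gentle taper

print(sum(short_procedure), sum(long_procedure))                # total pain: 21 vs 36
print(remembered(short_procedure), remembered(long_procedure))  # memory: 7.5 vs 4.5
```

The longer procedure delivers more total pain yet gets remembered as far better, purely because it ends gently. That's duration neglect, and it's exactly why endings deserve the optimization.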

It's also why I started journaling. I realized my memories were mostly false narratives built around emotional peaks and how things ended. This might even explain why a 9 to 5 job feels so soul crushing: the U-Index (the fraction of time spent in an unpleasant emotional state) stays high, and there are no peaks, just a flat line. Having a boss sucks. A startup, on the other hand, gives you terrible lows but also incredible highs, and your Remembering Self loves those peaks. It's a better way to live for "memory happiness."

As Seneca said, “A life is like a play: it’s not the length, but the excellence of the acting that matters.” Our Remembering Self is the ultimate critic of that play.

Yes, Kahneman could have been more concise. But the insights about our cognitive biases are so fundamental to better thinking that the redundancy is forgivable. This book will change how you see your own decision making process, and that’s worth wading through every single page.

