In engaging prose and with practical examples and anecdotes, an eye-opening look at human reasoning and essential reading for anyone with important decisions to make.
Have you ever: • Invested time in something that, with hindsight, just wasn't worth it? • Overpaid in an eBay auction? • Continued doing something you knew was bad for you? • Sold stocks too late, or too early? • Taken credit for success, but blamed failure on external circumstances? • Backed the wrong horse?
These are examples of cognitive biases, simple errors we all make in our day-to-day thinking. But by knowing what they are and how to spot them, we can avoid them and make better choices, whether we're dealing with a personal problem or a business negotiation, trying to save money or make money, or working out what we do or don't want in life and how best to get it.
Simple, clear and always surprising, this indispensable book will change the way you think and transform your decision-making, at work, at home, every day. It reveals, in 99 short chapters, the most common errors of judgment, and how to avoid them.
Rolf Dobelli is a Swiss author and businessman. He began his writing career as a novelist in 2002, but he is best known internationally for his bestselling non-fiction book The Art of Thinking Clearly (2011, English 2013), for which The Times called him "the self-help guru the Germans love".
If you have lots of time (and an interest in becoming aware of your cognitive biases), you should read Thinking, Fast and Slow by Daniel Kahneman, Predictably Irrational by Dan Ariely, everything by Steven Pinker, Fooled by Randomness and The Black Swan by Nassim Taleb, and others. But since not everyone has the time and interest, read The Art of Thinking Clearly instead. This book has 99 short chapters (each almost exactly 2.5 pages long) that cover the major hiccups in our thinking process. A book like this is so much better than any self-help book or positive thinking/inspirational fluff. Metacognition - thinking about how we think - is underrated. Learning more about biases and heuristics is so important that I am going to push this on my kids when they are a little older. The only bad thing about this book: he didn't need 99 biases/"syndromes." Some were similar enough that they could have been combined. If people just learned the top 20, they would be in better shape. By including so many, some of the important ones get lost among the lesser-known and less commonly experienced ones.
Nice packaging and design may give this book an aura of credibility. They certainly worked; I skimmed a few pages and bought it, thinking I would learn important lessons that I wouldn't get from other books about critical thinking. Alas, that wasn't the case, since the book reads like a bull in a china shop; Dobelli massacres the art of critical thinking and puts in its place a Frankenstein doppelganger built from cynicism and the uncritical use of anecdotes.
Take, for example, Lesson #19, "The Dubious Efficacy of Doctors, Consultants and Psychotherapists: Regression to Mean," in which Dobelli includes an anecdote about an investment adviser at a major bank who does a 'rain dance' whenever his stocks perform badly. The anecdote claims that "things always improve afterwards." Dobelli then juxtaposes that anecdote with one about a young handicap-12 golfer who books time with an instructor to improve his game. This time, Dobelli doesn't mention any mumbo-jumbo the golfer does, so we can safely assume that the golfer did try to learn new techniques or improve upon his weaknesses to play better. Then he adds other details to the chapter and wraps it all under 'regression to mean'. Well, really? A 'rain dance' claimed to "always" work comes under 'regression to mean' and is considered similar to people practicing and improving their skills through hard work? Sure, the adviser's behavior is an example of a cognitive bias, but does it really come under that category (rather than one of the other 98 in the book)?
Reading this book produces a sort of gut-feeling nausea that usually tells you something is wrong. Sure enough, you'll find plenty of 'not quite right' details in the book, such as anecdotes that don't fall into the right categories, and the lack of footnotes and of any critical analysis of the anecdotes. There's also the strange "A Note on Sources" section that neither Dobelli nor his editor and agent bothered to weave into the chapters as properly written end notes. Then comes the interspersing of critically right and critically wrong information in each chapter; this book is the 'Da Vinci Code' of books on critical thinking.
The book seems credible and right, but it's neither. It's even more frustrating when I think I've spent CAD$16.99 on someone's sloppily written personal-notes-turned-into-commercially-sold book. Readers will get more out of books on the basics of critical thinking than out of this one. In fact, the book is dangerous, since it "poisons the well" on so many subjects it fails to discuss critically. The only usefulness of this book is in challenging readers to use their critical thinking skills to identify its biases and fallacies.
On that note, I really have to say: I don't understand why so many people praise this book and give it good reviews.
Die Kunst des Klaren Denkens = The Art of Thinking Clearly, Rolf Dobelli
The Art of Thinking Clearly is a 2013 book by the Swiss writer Rolf Dobelli which describes in short chapters 99 of the most common thinking errors - ranging from cognitive biases to elements like envy and social distortions.
My learning from the book:
(1) Never underestimate the hard work and the low probability of success, just because we are shown more successful people than the many more actual failures.
(2) Confirmation bias is the mother of all misconceptions. It is the tendency to interpret new information so that it becomes compatible with our existing theories. Warren Buffett has seen people lose money because of this: they ignore facts that contradict the theory in the investor's mind. Disconfirming evidence must be sought out to beat this bias, e.g., what comes next in the sequence 2-4-6-8-...?
(3) Calamity of conformity: if you ever find yourself in a tight, unanimous group, you must speak your mind, even if your team does not like it, even if it means risking expulsion from the warm nest. And if you lead a group, appoint someone as devil's advocate. He or she will not be the most popular member of the team, but definitely the most important.
(4) Induction: send an email to 10,000 people with a stock-market prediction, dividing them into two groups and telling each group the opposite prediction. The prediction for one of the groups will come true. Send a new prediction to the 5,000 for whom you predicted correctly, again divided into two groups... carry on like this, and the last 100 will consider you a genius. People get inducted into a decision based on history without thinking logically.
(5) Loss aversion: the fear of losing something motivates people more than the prospect of gaining something of equal value.
(6) When it comes to compounding, don't trust your intuition - you have no idea how powerful it is.
(7) It is not what you say, but how you say it, that is important. A "98% fat free" product seems healthier than a product with 1% fat.
(8) If you are not a part of the solution, you are definitely a part of the problem. There is no third category of passive onlookers.
(9) Follow your passion even if you have to give up part of your income for it.
(10) Whenever you are dealing with averages, be careful of the distribution behind them. Bill Gates's monthly income in a group of 50 ordinary citizens gives an extremely misleading average (see the sketch after this list).
(11) Money does not always motivate. It works as a motivator only in companies where employees work only for money.
(12) Money comes wrapped in emotions. Money won incidentally, as opposed to earned through hard work, is more likely to be spent erratically - though this is illogical because the money is the same. This can be prevented if you have a clear financial plan.
(13) Self-control drains your energy, so you need a lot of energy if you want to exercise self-control.
(14) The presence of something is more noticeable and more valued than its absence, e.g., the presence of a disease rather than its absence, or getting off a plane and not noticing that it did not crash.
(15) News is to the mind what sugar is to the body: appetizing, easy to digest - and highly destructive in the long run.
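For point (10), here is a minimal Python sketch with made-up monthly incomes (the figures are mine, not the book's), showing how a single outlier drags the average away from anything typical while the median barely moves:

    # 50 ordinary citizens plus one Bill Gates-sized outlier (made-up figures).
    incomes = [3000] * 50 + [1_000_000_000]
    average = sum(incomes) / len(incomes)
    median = sorted(incomes)[len(incomes) // 2]
    print(f"average: {average:,.0f}")   # about 19.6 million: describes nobody in the room
    print(f"median:  {median:,.0f}")    # 3,000: the typical group member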
It can be useful as a starting point for a list of cognitive biases. However, it is mere plagiarism of other texts.
Dobelli uses examples taken directly from other sources, changes the names of characters and wording slightly, and uses them as if they were his own. Why not just quote from the original text?
Also, some of his examples are so diluted and simplified that they are actually WRONG. One of the most glaring ones is his water treatment example for "Neglect of Probability". A: reduces probability of contamination from 5% to 2%. B: reduces probability of contamination from 1% to 0%. He says that method A is 3 times as good!!! HUH?!?! Method B removes 100% of the contamination..... that's a pretty good option right there.
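For concreteness, here is a small sketch of the arithmetic behind both readings of that example, using the percentages quoted above: the "three times as good" claim compares absolute reductions, while the objection rests on the relative reduction.

    # Water-treatment example as quoted above: absolute vs. relative risk reduction.
    def reductions(before, after):
        absolute = before - after              # percentage points of risk removed
        relative = (before - after) / before   # share of the original risk removed
        return absolute, relative

    print(reductions(0.05, 0.02))   # Method A: ~3 points absolute, 60% of its own risk
    print(reductions(0.01, 0.00))   # Method B: ~1 point absolute, 100% of its own risk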
My advice is to take the chapter headings of the book as a list. Then go through the sources listed... you will get 100x more understanding by reading Cialdini, Munger, Taleb, Kahneman.
This book is the dead tree equivalent of a BuzzFeed post. Its title could be "I Got 99 Cognitive Biases But a Psychology Degree Ain't One." Or maybe not.
Rolf Dobelli enumerates 99 thinking errors, or cognitive biases, in The Art of Thinking Clearly, dispensing as he does tips for leading a more rational, less error-prone life. Anyone who has done even the least amount of reading in this subject will recognize many of the cognitive biases that Dobelli describes here. Unlike most popular cognitive psychology books, however, this book makes no central argument and does not examine these biases within a larger context. It is literally just a list, with extended descriptions, of the biases. Dobelli occasionally ascribes a bias to some evolutionary origin, and he will quite often cite interesting experiments conducted by psychologists (which he is not, by the way) that revealed or provided insight into the bias in question. In his introduction Dobelli explains that the book began life as a personal list kept for his own benefit, and I can believe that.
Dobelli covers 99 biases in 300 pages, so he can't spend much time on each bias. Not every bias is as interesting or worthwhile as the next. But from the very beginning, I was frustrated by the brevity of each chapter. Just as I read something that intrigued me, Dobelli shepherded me on to the next bias like some kind of frantic tour guide worried that we won't have time to see all of the art. Please stay with the tour, no cameras.
I wanted to be mollified by dazzling prose, but I had to settle for somewhat dull attempts at wit. I wanted to be satisfied with lucid, if too concise, explanations of these biases, but I had to settle for somewhat tepid attempts to demonstrate these biases without getting drawn into the bigger discussions of the cognitive and behavioural science that underlies them. Dobelli ties his own hands here, to poor effect.
To be fair, it is clear that Dobelli is well-read in this field. He has done his research (even if the "note on sources" section frustratingly places the sources under headings by bias name but not chapter number, and there is nary an endnote to be seen). It's clear, judging from the number of times he quotes from or references Thinking, Fast and Slow, that he has been heavily influenced by the work of Daniel Kahneman. In fact, one could say that The Art of Thinking Clearly is little more than an attempt to distil the biases, and only the biases, mentioned in Thinking, Fast and Slow and similar books.
The thing about blog posts like this is that they seldom linger in one's short- or long-term memory. They are space-filling exercises, attempts to get eyeballs to the page and clicks on ads. It doesn't work well in book form; I don't, as a general rule, enjoy books of lists all that much. There are some exceptions for lists compiled and enumerated in a hilarious manner, but that isn't the case here. Yet with the cognitive biases removed from a larger context and reduced merely to a checklist of errors to avoid, Dobelli robs them of their greater meaning.
So if you're truly interested in this subject matter, why not just skip The Art of Thinking Clearly and go read Thinking, Fast and Slow? I have. It's much better than this book and much more informative, and it's written by an actual psychologist. This book, like the BuzzFeed post it resembles, is a pale imitation of something more meaningful and accomplished. Imitation flowers have their place, but life is too short to waste it on imitation books.
Now I understand why it has become so popular. Frankly, the author has done a great job here by surveying the wide field of thinking errors. Even though some of the information is already known, there are still some issues highlighted that are very important to us, ones we avoid or try not to perceive most of the time. I think there is a lot to learn from this book. I will definitely read it a second time in a few months.
If you're looking for a book to help you get ahead, or improve you as a human being, don't look here; but if like me, you want to see how a book of such reputation with no scientific ground, or even much common sense, can be so popular among some people, get this book and start reading.
(I tried not to include any spoilers, so read with peace of mind if you have it in your to-read list.)
It became clear to me very early on that this is a terrible book; however, I decided to keep reading and finish it, mainly for that reason, plus an interest in some of the short stories in there. One thing this book did give me was a short list of useful books to get to once I'm done with this one.
There's tons wrong with this book, and I don't want you to have to read 5 pages here, so let's just get to a few major reasons and move on:
1. My main issue is the writer himself: read a few of the pages and you soon come to understand that Rolf Dobelli doesn't know what he's talking about at all. He doesn't have any new knowledge to offer, and he doesn't even offer a better way to understand what is already known, however much he tries.
2. Cynicism is present all over the book. While that might be fine for a pessimist, it definitely isn't for an optimist like me, or even for non-pessimists in general. I could understand this if that cynicism at least stood on some facts, but even that is not the case.
3. Plagiarism: if you have looked at other reviews, you already know this one. Fortunately I read Thinking, Fast and Slow by Daniel Kahneman a long time ago, and I could see many of the places where Rolf Dobelli used Kahneman's examples and research throughout the book without even the slightest pointer to him or his best-selling book; that's a big NO in my book. This reason alone is enough for me to blacklist the writer for life. (Not to mention the long list of things he stole from Nassim Taleb; just google it to see what I'm talking about.)
4. Lots of the data in the book is incorrect, and thankfully we have math and science to back us up on this. My biggest question at these points was, "How can this guy be an investor and work with money if he doesn't even understand the simplest statistics, or how math must be used?"
5. And probably the clearest one to every reader: the writer is a hypocrite. He uses at least half a dozen of the methods that he explicitly "orders" you (yes, "orders", not "suggests" or "recommends") never to use. While this is clear at several points in the book, unfortunately some people find a way to overlook it.
---
I could point to 40 or 45 specific pages and show you the problems I had with this book in detail, but then no one would read this review, and it couldn't help anyone avoid the book or go in clear-eyed, so I decided to just point out a few general, major issues rather than get too specific here.
Why didn't I rate the book 1 star, then? Well, the book has a list of errors, 99 to be exact, and while I disagree with about half of them, the list can still be useful to you. Maybe you can't get the required information about each error from this mostly stolen book, but it gives you a starting point to follow up in other books, like Daniel Kahneman's, and start using those points in your life. Just make sure you apply that final step before changing anything in your life based on this book.
The book also had a couple of interesting short stories and pointers that I could use as clues and find out more about later on. I wrote all of them down on a small piece of paper and will get to them at some point.
I guess you have two options if you're interested in this book. First, you could read the full book, but keep your guard up and make sure you don't treat this pile of information as correct until you have spent quite some time filtering it. Second, you could just look at the list of topics at the beginning of the book, see which ones you agree with at that very moment, then jump to the end of the book, check the sources, and start reading those instead of the book itself to get to the interesting topics; after all, the writer didn't have anything useful to add here.
I personally don't see the time I put into reading this book as wasted; in fact, I see it as an investment in filtering out wrong information on my journey, and also in better understanding the people who base their lives on such books, as I believe I can do both a little better than before I started reading this.
Thank you for spending time reading my review, will see you on the next book. :)
I bought this book just because I saw Taleb praising it right on the cover, and so I fell for it. If you have read The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb, then I would strongly recommend that you do NOT go for this book; but if you haven't dipped into the ocean of Taleb's thoughts, then this book is for you. More or less, The Art of Thinking Clearly harps on the same line of thought as The Black Swan. Each chapter of the former reads like bullet points of the latter's approach. Both talk about reverse-engineering the thought process, counter-intuitiveness and randomness.
The book is a database of brief explanations of phenomena. It is quite a light read, but I personally don't think it can help anybody with decision-making. The book surfaces things that are already present in everybody's mind, and some people have even recognized them without reading the book; whether it has actually helped anyone with decision-making, I am still doubtful.
In one of his chapters, he talks about the inability of humans to comprehend probabilities well; I completely differ with this opinion of his. People do take probabilities into consideration. I mean, there is a major chunk of individuals who see life as grey, not only black or only white. For these many people, there is always a space called the benefit of the doubt, and they leave it open while dealing with the people around them, whether at the office or in life as a whole.
I am a person of average intelligence, yet I feel this book is far from satiating my intellectual appetite. I love books that make me think even when I am not reading, but this international best seller is not for me.
This book is a list of 99 common thinking errors and cognitive biases. Some of these you've probably heard many times before, but many will likely be new. I found it a quick, fun, interesting read, but it has 3 major flaws:
1. Because it's just a list of 99 disconnected items, with no common "story" to tie them all together, you will forget the vast majority of it shortly after finishing the book.
2. The book will tell you about the thinking errors, but not the solutions. Granted, there is value in being aware of the thinking errors in the first place, but without a concrete plan of how to avoid the errors, there isn't much actionable to take away from the book. In short, don't expect to be thinking all that much more clearly when you're done reading.
3. The book leans very heavily on a few other authors: especially Robert Cialdini ("Influence: The Psychology of Persuasion"), Daniel Kahneman ("Thinking, Fast and Slow"), and Nassim Nicholas Taleb ("The Black Swan"). Dobelli gives you the TLDR version of these other authors, which loses much of the nuance and value. My recommendation would be to skim Dobelli's book, figure out which topics you find interesting, and go back to the original source material for a deeper, more fulfilling read.
Despite these problems, I still found a few fun ideas/thoughts/concepts that I took down as notes:
* How to make people believe you can predict the stock market. First, email 50,000 people one stock prediction and a different 50,000 people the opposite prediction. A week later, one of those two predictions will be correct. Now repeat the process with the group where your "prediction" was correct: email 25,000 of them a new prediction, and the other 25,000 the opposite prediction. Keep repeating this process for several weeks, and at the end, a small group of people will believe that you made a series of seemingly impossibly correct predictions in a row. These people will think you're a genius and be readily willing to give you all their money to invest (see the first sketch after this list).
* The concept of "social loafing." When someone is working on something alone, they typically work harder than if they are working on the same thing in a group. For example, in a tug of war, the more people you add to the rope, the less hard each one pulls. The less your individual contribution is noticed, the less effort you put in. This is a critical lesson for management and team building.
* Omission bias. Most people believe that you are less "culpable" if you allow something bad to happen due to inaction rather than action. For example, shooting someone is seen as worse than letting someone die. Building no new products and going out of business because the market changed is seen as less bad than trying to build a new product and failing.
* Hyperbolic discounting. We value instant rewards much, much more than the same or even a larger reward that comes with any sort of delay. For example, if I let you pick between earning $1,000 12 months from now or $1,100 13 months from now, most people would take the latter. After all, what's one more month after waiting 12? But if I let you pick between $1,000 right now and $1,100 one month from now, almost everyone would pick the former. It's exactly the same one-month difference, but the possibility of getting something now carries a lot of weight (see the second sketch after this list).
* The power of because. People will forgive much more readily if you give a reason (i.e., include a "because" in your speech), even if it's not much of a reason at all. E.g., people were much more willing to let someone cut in line at a copy machine when they said "Could I cut in line because XXX" rather than just "Could I cut in line." The XXX itself barely mattered: "because I'm late for class" worked more or less as well as "because I need to make copies" (i.e., a meaningless reason). The mere presence of "because" was the important part.
* The Will Rogers phenomenon (AKA stage migration). It comes from his joke: "When the Okies left Oklahoma and moved to California, they raised the average intelligence level in both states." Here's an example where this sort of thing can cause problems in real life: cancer is often grouped into different stages (e.g., stage 1, stage 2, stage 3), and we measure average survival rates for each group (e.g., stage X patients survive on average Y years). It turns out that if we develop better cancer detection techniques that catch the disease even in people who otherwise seemed healthy, we'll end up with more healthy people in stage 1. This will increase the average survival rate for patients in stage 1, even though treatment hasn't actually improved in any way! (See the last sketch after this list.)
* Effort justification. If we work hard for something, or suffer for it, we value it much more. This is one of the reasons hazing and initiation rituals are so prevalent in groups: the pain you go through to join the group makes you value the group much more. This also explains the "Ikea Effect," where customers value their Ikea furniture more because of the effort they had to put in to assemble it. And this also partially explains the "not invented here" syndrome, where companies prefer their internal, home-spun solutions, simply because they took part in building them, and not because those solutions are actually better than the alternatives.
* The sleeper effect. We forget the source of a message more quickly than the message itself. For example, you see a political campaign ad with a negative message about a candidate. Initially, this has little impact on your opinion of that candidate, as you know that message was paid for by the opposition, which is obviously biased. However, a while later, the negative message about the candidate is likely to stay in your mind, whereas you may no longer remember the biased source of the message, and therefore, it'll start to affect your opinion. Advertising likely takes advantage of this effect too: when you first watch a commercial touting the benefits of a product, you largely ignore it, as you know the ad is obviously biased and trying to sell you something. But a while later, you'll remember the product benefits, but not necessarily where you heard them, and you're more likely to buy the product.
* The confusion between risk and uncertainty. Risk is when you can predict the probability of various outcomes. Uncertainty is when those probabilities are totally unknown. We can calculate risk and make informed decisions about it; we cannot do the same with uncertainty.
* The house money effect. People tend to categorize money, which makes no sense, as all money should be the same to us. For example, a blackjack player goes to Vegas with $500 and plays a very deliberate and disciplined strategy. But then, if they suddenly win $500, they might treat that as "house money" and start betting it wildly. This makes no sense, as that $500 is no different than any other $500, but we mentally put it in a different bucket. The same happens with investors who suddenly get a big payout, and instead of following their usual, disciplined investing approach, they buy a bunch of high risk stocks. This is also why many services give you "free credits" when you first start: you'll end up using these free credits in a different way than you would've with your normal money, which gets you used to spending more money on that service.
* The idea of strategic misrepresentation. That is, lying that is socially acceptable. For example, women wearing makeup, or a rich person driving a fancy sports car, or a contractor promising a short timeline just to get the deal signed.
* The effect of TODO lists and planning. If you have a long list of TODOs on your mind, it leads to a lot of anxiety. It will actually be hard to focus on anything else until those TODOs are all done... Except in one case: if you come up with a clear, solid plan for getting those TODOs done, studies show that it significantly reduces anxiety and lets you clear your mind.
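A rough Python simulation of the stock-prediction scheme in the first note above, assuming the 100,000 starting recipients and weekly halving described there (purely illustrative):

    # Start with 100,000 recipients; each week keep only the half that happened
    # to receive the "correct" prediction, then split them again.
    recipients = 100_000
    week = 0
    while recipients > 100:
        recipients //= 2
        week += 1
        print(f"week {week}: {recipients:,} people have seen only correct predictions")
    # After about 10 weeks, fewer than 100 people remain, each of whom has
    # watched every single "prediction" come true.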
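A minimal sketch of the hyperbolic-discounting note above. The functional form value = amount / (1 + k * delay) is the standard hyperbolic model; the rate k is an illustrative value I chose, not a figure from the book:

    def present_value(amount, months_delay, k=0.3):
        # Hyperbolic discounting: value drops steeply for short delays, then flattens.
        return amount / (1 + k * months_delay)

    # Now vs. one month from now: the immediate $1,000 is valued higher.
    print(present_value(1000, 0), present_value(1100, 1))    # 1000.0 vs ~846
    # 12 vs. 13 months out: the same one-month gap barely matters, so $1,100 wins.
    print(present_value(1000, 12), present_value(1100, 13))  # ~217 vs ~224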
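And a toy version of the Will Rogers phenomenon, using made-up numbers for the Okies joke quoted above: an element below its old group's average but above its new group's average raises both averages when it moves.

    mean = lambda xs: sum(xs) / len(xs)

    oklahoma = [100, 110, 120]    # the eventual migrant scores 100: lowest in Oklahoma
    california = [80, 90]         # ...yet higher than anyone in California
    print(mean(oklahoma), mean(california))        # 110.0 and 85.0

    # The 100 moves from Oklahoma to California; both averages rise.
    print(mean([110, 120]), mean([80, 90, 100]))   # 115.0 and 90.0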
FYI: I won this book from goodreads Giveaways, but that in no way influenced my review.
The Art of Thinking Clearly presents a bunch of anecdotal evidence to support commonly known fallacies in logical thinking. You know that hindsight is 20/20, we cling to our narratives, and think we'll be like the models in makeup ads if only we buy their product, plus a bunch of other semi-obvious ways in which we end up making bad decisions (or poorly rationalized flukes that still turn out okay). This book *might* be the reminder you need to think critically about what assumptions and misconceptions you are basing your decisions on. However, if you're already a critical thinker you probably won't learn too much from this book. Also, it doesn't really seem academically researched enough to be otherwise worthwhile. If it was more humorous it would at least make the obviousness more palatable.
To its benefit, you will almost definitely find at least one logical fallacy within that applies more to you personally (the, "Oh, I didn't realize it, but I definitely do that!" moment), and I suppose there's a chance that it may make a huge difference in your life. Also, it's a pretty quick read, with separate 'chapters' (a page or two) for each fallacy. So readers who prefer informational shorts over long form compositions will appreciate the format.
Many inspiring thoughts again, though some of it is too extreme for me. Moreover, many aspects repeat themselves across his books. Still, it is very valuable to me, and once again I was able to take a lot away from it.