
The Serengeti Plain Fallacy: Fallacies that aren’t fallacies

WARNING: Still in draft
This article is unfinished, made public for feedback and contemplation.
A contrarian look at logical “fallacies” that maybe aren’t so illogical after all.


I never want to hear this narrative again:

Back when humans were living in the Serengeti Plain or Saharan Desert or coming down from the trees… it made sense for us to do something the self-satisfied speaker laughs at as stupid but so graciously forgives a primitive species for doing. This worked because we were just trying not to get eaten by a lion or tiger; by the way, there are no tigers in Africa, but don’t let facts get in your way! But, in our modern society, this is a fallacy! That’s why you have to embrace buzzwords like “first principles” and “maximize expected value” and “Bayesian inference”… to transcend your stupid “lizard brain” and be an enlightened frontal lobe user, like the speaker.

You don’t know what it was like in [the only African region you can recall from David Attenborough shows], you don’t know how genetic pressures actually work, you can’t ignore the hundreds of generations since the rise of civilization, and not all those heuristics are “fallacies.”

“Fallacy” implies it’s dumb because it runs against cold, hard, scintillating, pure, perfect logic.

But I don’t agree with some of that perfect logic.

What if “Expected Value” is the fallacy, not “Loss Aversion”?

One of my pet peeves is the idea that we should always “maximize expected value,” which I believe is rarely the correct way to make a decision, and which always generates a tsunami of arguments on Twitter.

Economists claim that “maximizing expected value” is what logical people do. It doesn’t bother those same economists that they cannot predict any major metrics of the economy or markets, while constantly issuing memos excusing why their models got it wrong (which they also can’t agree on). All while still calling everyone else irrational.

But if indeed “maximize expected value” isn’t so logical and rational after all, then so-called “logical fallacies” might not be fallacies after all.

Let’s see why “expected value” might be the fallacy:

I invite you to play a game. The game is, we flip a fair coin. If it comes up heads, I will double your life savings. If it comes up tails, you lose all of your life savings. In other words, it’s like going all-in in poker with a 50/50 chance, except this is your actual life, not however many chips you brought into the casino. This is for everything.

How many people want to play that game? If you have very little in the bank, you might want to play, because it makes little difference either way. But among people who have spent years socking away a nest egg, few would take that chance.

From an “expected value” perspective, it doesn’t matter whether you play the game or not. The expected value is the same either way: zero.¹ But of course it does matter whether you play the game, and different types of people will or won’t want to play, which proves that it matters. I don’t think there’s anybody on Earth who thinks that characterizing this game as “zero” is either an accurate or a useful characterization.

¹ The definition of “expected value” is the sum of all the outcomes weighted by the probability of each outcome. In this case, if your savings is s, the expected value is -s*0.5 + s*0.5 = 0.

Now let’s really mess with the expected value people. The game is the same, except the probability is 55% that you double your life savings and 45% that you lose everything. Does that change whether you want to play the game?

Very few people would change their mind based on this very minor alteration in the game’s rules. Losing all of your life savings at 45% is essentially the same as at 50%, and if it was bad at 50%, it’s still bad.

But now the expected value of this game is positive.² So the “maximize expected value” people would say you’re being illogical if you opt out of the game.

² EV = -s*0.45 + s*0.55 = s*0.1, so for example if your life savings is $100,000, the expected value of the game is $10,000.

In particular, they would say you are succumbing to the fallacy of “Loss Aversion.” This is the notion that we hate losses more than we love gains, and that we’re allowing this fallacy to drive the wrong decision because we’re not maximizing our expected value.

But I beg to differ, and you probably do too. I don’t think loss aversion is illogical, and I don’t think making someone play this game out of religious adherence to expected value is wise. In fact, I think expected value isn’t the right way to think about the game at all. Indeed, I would say those people are succumbing to the “Expected Value Fallacy”: taking statistics that theoretically apply if you were to play the game millions of times and keep the average result, and applying them to a single instance of the game, which is just wrong.
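To make that last point concrete, here’s a minimal simulation sketch in Python (my illustration, not anyone’s official model; the $100,000 stake is just footnote 2’s example). Across a million independent plays of the 55/45 game, the average converges to the “expected value,” yet no individual play ever lands anywhere near the average:

    import random

    def play_once(savings: float, p_win: float = 0.55) -> float:
        """One instance of the game: double your savings, or lose them all."""
        return savings * 2 if random.random() < p_win else 0.0

    savings = 100_000.0
    trials = 1_000_000
    results = [play_once(savings) for _ in range(trials)]

    # Over many plays, the average converges to 1.1 * savings: your original
    # stake plus the +10% "expected value" from footnote 2.
    print(f"average outcome: ${sum(results) / trials:,.0f}")   # ~ $110,000

    # But the only outcomes any single player ever experiences are these:
    print(f"distinct outcomes: {sorted(set(results))}")  # [0.0, 200000.0]

The average is a property of the ensemble of players, not of any one player, and you only get to be one player.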


So, what follows is a set of so-called logical fallacies that I don’t agree are necessarily fallacies, starting with the Loss Aversion we just demonstrated.

And none of it is related to fleeing from lions.

Anti-Fallacies


Loss Aversion

Valuing the pain of loss more highly than the joy of gain.

It took a lifetime of pain and sacrifice and luck to accumulate what you have. It is not irrational to be much more protective of losing it than you are greedy about getting more. Especially if the loss is catastrophic, as opposed to an experiment with 1/1000th of your money, where the loss is immaterial. Of course, in that case you’re happy to play, proving again that the so-called “fallacy” is only invoked when the loss is important.
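This intuition can even be formalized with standard math, though this is my aside, borrowing the expected-log-growth idea behind the Kelly criterion rather than anything from the “fallacy” crowd. If you risk a fraction f of your wealth on the 55/45 even-money game, the expected log growth of your wealth is positive for small f, but negative infinity at f = 1, because a total loss can never be recovered from:

    import math

    def expected_log_growth(f: float, p_win: float = 0.55) -> float:
        """Expected log growth of wealth when risking fraction f at even odds."""
        if f >= 1.0:
            return float("-inf")  # any chance of total ruin is unrecoverable
        return p_win * math.log(1 + f) + (1 - p_win) * math.log(1 - f)

    for f in (0.001, 0.10, 0.50, 1.0):
        print(f"risk {f:>6.1%} of savings -> E[log growth] = {expected_log_growth(f):+.5f}")

The same 55/45 odds that are worth taking with 1/1000th of your money are catastrophic with all of it, which is exactly what loss aversion intuits.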


Endowment effect

Valuing an object more, only because you possess it.

We are emotionally attached to things that we own. It makes sense: there’s an emotional investment in having made the decision, in shifting or cementing your identity as “a person who would have this object,” and in how you believe that ownership will be perceived by others. All of this is real, tangible value.

You also presumably value the object above its face value. If you buy a ticket to a concert for $200, studies show you wouldn’t sell it for less than $300-$400. Is this because you erroneously value it higher because you “possess it,” or because the reason you bought it in the first place is that you value going to the show, so selling the ticket also means selling the experience, which is definitionally worth more than the face value of the ticket?

Over time, we can also grow sentimental attachments; economic theory is incorrect if it asserts that sentiment has no value.


Sunk Cost Fallacy

Continuing an endeavor because of previously invested resources.

In long-term projects, commitment can lead to eventual success, whereas “fail fast” actually ensures failure. The emotional and financial investments drive perseverance, which can be the key to overcoming obstacles and achieving long-term goals. Sometimes we pretend we’re so smart in avoiding the so-called “sunk cost fallacy,” when really we’re justifying bailing out when times get tough.


Availability Bias

Overestimating the importance of information that is readily available.

In startups—and even in scaled-up companies—we often lack comprehensive or statistically-significant data. Extensive research is frequently impossible or impractical. Instead of getting mired in analysis paralysis, scared to act on the data we have at hand, we have to act on what’s in front of us, under conditions of uncertainty. This is not a fallacy but a practical necessity, and it even allows us to move quickly and adapt. There are specific strategies for operating like this. It’s not a bias; it’s life.


Confirmation Bias

Favoring information that confirms existing beliefs.

Constant self-doubt leads to inaction; frequent changes in plans lead to confusion. A leadership team that is always shifting priorities will confound the whole organization. A strategy that constantly changes cannot be executed.

It’s also true that a strategy that never changes is wrong. There is a time to reevaluate plans and strategies, but that time is not “always.”


Overconfidence Bias

Having excessive confidence in one’s abilities or judgments.

93% of drivers believe they are above-average drivers. While most of them must be mathematically incorrect, this belief might also give them the confidence to act. Being constantly worried about each driving decision would make them even worse.

Founders must be over-confident. The most likely outcome is failure, so you have to be overconfident to remain optimistic. At the beginning, both successes and failures look the same, so you have to be overconfident to push through. Before Product/Market Fit, you can’t know whether or when you’re going to hit Product/Market Fit, so you have to be overconfident to keep trying things, treating everything as an experiment, not a failure.

Overconfidence is not the same as being blind. Being confident in the vision, but skeptical about every detail, is how you find the truth.


Recency Bias

Giving disproportionate weight to recent events.

Early in a startup’s life, it’s doing most things wrong. Quickly reacting to what’s in front of you is one of the ways to iterate into getting things right. “This is how we did it at my last company” doesn’t matter, when that last company was 400x larger and 50x older and in a different industry. One of the few advantages a startup has is agility, and reacting to the latest information is the definition of agile.


Survivorship Bias

Focusing on successes even if they were due to luck or shared characteristics with failures.

I’ve been pointing out the problem of Survivorship Bias in business advice for more than 15 years. That said, it’s not true that “you learn more from failure than from success.” From failure you see what didn’t work, but that doesn’t point the way to what does work.

Yes, successes are lucky, not just good. Yes, companies that failed often do similar things to companies that succeed, which suggests that those things didn’t “cause” success. Yes, often there’s just one or two most important things that caused the success, despite everything else they did, not because of it.

But which is more likely to work: Copying everything about a success, or everything about a failure? Of course the success, because not all of it was luck.


Herd Behavior

Following the actions of a larger group.

Your product and company should be different in some way that your target customers believe makes you the absolute best choice.

But in every other way, surprises are bad. I don’t want to have to calculate the bottom-line price in a unique pricing model, or decode the bizarre controls you created because you didn’t want to use menubars, or decipher non-standard icons for things like “copy” and “paste” because you wanted to be creative.

Kai’s Power Goo.
What does “UnGoo” do?

Adhering to norms provides safety, understanding, control, and ease. Those are all desirable qualities in products and companies, except in the very few places where you make a genuine improvement on the status quo.


Status Quo Bias

Preferring things to stay the same rather than change.

Stability and consistency foster reliability and trust. A startup thrusts constant change upon its denizens; if you can keep some things constant and reliable, it gives everyone something to hold on to. If you have put thought and effort into your decisions, then it should take even more thought and effort to change a decision; it’s not a bias to honor our former selves, so long as new information hasn’t come to light that would have changed those former decisions.


You know, back when humans were living in Statistics for Economists class in college, it made sense for us to pretend that the real world was an idealized environment with only two variables, no complex dynamics, and perfectly rational actors. This worked because we were just trying not to get eaten by a pedantic professor.

But, in our modern society, this is a fallacy! That’s why you have to use ideas and tools that make sense for each individual in their circumstances, thinking for yourself instead of parroting phrases off the internet you don’t really understand, running your own experiments instead of believing models and theories with more exceptions than examples, to transcend your lizard brain and be enlightened and strategic.

☞ If you're enjoying this, please subscribe and share this article! ☜