52 Aces

Learning, competition and capitalism

  • Start Here
  • Books
  • Courses
  • Newsletter
  • Writing
  • Reading List
  • About
  • Contact

Videos

Losing Games

Ace Eddleman

This is part of my 5 Minute Concepts series, which is designed to help you understand fundamental concepts about subjects like learning, memory and competition in the shortest time possible. Each episode is available in video format on my YouTube channel and audio via my podcast. If you prefer to read, the transcript is below.

Want to know when new content shows up? Sign up for my newsletter here.

Much of what I create revolves around the idea of competing intelligently. My overall hypothesis about competition is that most people do it haphazardly, and expect their own intuitions — mixed with a recognition of incentives — to carry them to victory.

Thinking this way is a serious error. It leads to making the same mistakes over and over again, and creates situations where meaningful learning takes much longer than it should.

What’s more important, in my opinion, is that we recognize just how competitive the world is. Competition exists at all levels of life, all the way down to single-celled organisms. Competition is a key component of evolutionary biology, and there isn’t any form of life on earth that can escape this dynamic.

There’s competition for money, competition for status, competition for relationships. There’s competition everywhere.

Life is competition and competition is life.

With all that being said, there’s an idea that’s just as important to keep in mind: sometimes, you’re playing a game you can’t win — no matter how intelligent you are or how hard you work. When you find yourself in this kind of game, what can be called a losing game, you need to exit that game as fast as you can.

Recognizing losing games like this is a skill in and of itself, one that many people find hard to develop. That blind spot is particularly common in American culture, where we’re constantly told that hard work is the answer to all of life’s problems.

An extreme example I like to use is professional basketball. The first requirement for playing basketball at the pro level is to be very tall, which is something you can’t train for. You’re either born with tall genetics or you aren’t.

This can be a hard pill to swallow for people who don’t hit those height requirements but love the game enough to dedicate their lives to it. Someone who is only average height can spend every waking hour refining their game, doing everything they can to get better, and still come up short.

The problem in this situation isn’t that the player isn’t committed or intelligent enough. It’s just not a game they can win — and there’s nothing they can do about that.

Instead, this same player could find another way to be involved with the game. Maybe they could find work as a talent scout, or a commentator, or a sports writer, and still play for fun in recreational leagues.

Maybe they’re an exceptional programmer or mathematician, and that could allow them to build some kind of technological product that is intertwined with basketball.

Those are all winnable games for this fictional person: they offer reasonable odds with large payoffs, and none of them have requirements that are impossible to train for.

What tends to drive people like this crazy is the search for glory. They want to do what’s most admired in society, like playing a sport professionally.

The irony is that wasting time on paths like this more often than not generates a great deal of unnecessary misery. Most people don’t really know what they want, they just think they know, so they waste their time pursuing goals that other people or society set for them.

That same person who wants to be a player more than anything might find more fulfillment in an auxiliary role than they’d ever expect.

Instead of wasting years pursuing a professional playing career that ends badly, they should find something else that might even end up being more fulfilling (or even lucrative).

Sometimes you can create the game yourself, and sometimes you have to go play someone else’s game. Either way, you should be doing this kind of analysis on a regular basis. Every now and then, stop and ask yourself: Is this a game I can win?

I’ve failed at this more times than I can count, and it’s cost me dearly on a few occasions. Hopefully you can heed my words and not make the same mistakes I have. Don’t play games you can’t win — find a place where you can play with favorable odds, and then throw yourself into that.

Autonomy

Ace Eddleman


If you ask most people what they want out of their work lives, there are two answers that appear to be tattooed on the inside of their skulls by popular culture:

  1. Money
  2. Making a difference

The priorities might be switched, but these are the two most common replies. It’s not a surprise that people pick these two, because A) we need money to buy things like food, shelter, and pleasurable experiences, and B) working a job where you feel like you aren’t making any sort of dent in the world is a soul-crushing experience for most non-sociopaths.

Neither of these answers is wrong — they’re just incomplete.

When someone gives these answers, it’s worth asking a follow-up question: What is the common, fundamental thread between the two of them?

A common answer to this question is “meaning.” That’s a little too whimsical for my tastes, as meaning strikes me as an ideologically charged construct that is borderline impossible to define. Even if meaning is given a quality definition (which I haven’t seen yet, but I’m open-minded), it still misses the target.

Instead, the real answer is, in the vast majority of cases, autonomy. We want to be able to determine how we spend our limited time on this planet. In short, we don’t just want to survive — we want to survive on our terms.

Nothing makes us more miserable than having someone breathing down our necks, telling us what to do every day.

And, unfortunately, that’s what most of us end up doing with the bulk of our lives.

We give up a lot of autonomy in the pursuit of money, which is both sad and ironic: in many cases, the thing we’re working for is the very thing enslaving us. The fantasies about freedom we feed ourselves all seem to revolve around money, not autonomy.

This isn’t entirely misguided: within a capitalist framework like the one we live in, money can (at certain amounts) provide a great deal of autonomy.

If you suddenly have 50 million dollars in the bank, you can tell your boss to go fuck himself and proceed to jump into a pool of champagne.

Or you could politely hand in a resignation letter and shake everyone’s hand on the way out of the office. Or you could just never show up, never respond to an email or call, and laugh about leaving your former colleagues in the dark.

That’s the power of autonomy. You get to decide what you do within any given moment, so life becomes a “choose your own adventure” story instead of a constant context switch between what you want to do and what someone else wants you to do.

Making a difference and the search for meaning both fall into the autonomy bucket as well, because both are activities that you judge on a subjective basis. Maybe your definition of making a difference is volunteering at a local animal shelter, or creating a documentary about a local endangered species.

Searching for meaning might involve spending your days reading big, complex books and then going for long walks on the beach. Or maybe it dawns on you that nothing has meaning and you turn into a shameless nihilist.

This is the power of living on your own terms. You get to decide what you want to do. The world is what you make of it.

From this point forward, you should evaluate the different opportunities you have in life based on how they’ll affect your ability to operate autonomously. But, as I’ve been alluding to, this can be a monumental task.

Quite a few people trade autonomy for money, and discover that the bargain makes them miserable. Investment banking is a good example: bankers are paid tons of money but often work soul-crushing hours with very little autonomy.

Many of them would likely be much happier making a fraction of their incomes engaging in activities that they actually want to be engaging in.

Working your face off while you’re young in hopes of some autonomy payoff in the distant future is another landmine many people step on. These people look back when they’re older and realize their best years were spent in service of someone else’s desires — and there’s no going back.

I know it’s hard with how competitive the world is now, but make an effort to think less about the monetary rewards you might get from doing something. Instead, reorient your thinking towards what you can do that will give you the best chance at living an autonomous life.

You might be surprised how little it costs.

The Exploration-Exploitation Dilemma, Simplified

Ace Eddleman


Transcript:

I’ve written about the exploration-exploitation dilemma before, but only in a long-form essay format. Since I think this is such a critical concept, and I realize that not everyone has the time to read a big essay, I’ve created this simplified explanation.

Just a warning: like any other 5 Minute Concepts piece, there’s always more to the story. I’m just trying to give you the most important parts in 5 minutes or less.

Anyway…

Let’s start with a stripped-down definition: the exploration-exploitation dilemma is the choice we all have to make between learning more or taking action with the knowledge we already possess.

Learning more is exploration, acting with current knowledge is exploitation.

With either action you’re trying to find some way to maximize what’s often referred to as “reward,” or some end-state that you find desirable.

The reason this is a dilemma is simple: you can’t explore or exploit exclusively and win in the long run.

If all you do is explore, you’ll never take action in the world — which means you get a predictable payoff of exactly zero. There isn’t much to be gained from passively gathering information until you die.

On the other hand, taking action without learning anything is also a long-term losing strategy. You do get some kind of reward by exploiting a known path, but that means you’re giving up any chance at a higher payoff that might be staring you in the face without you knowing about it.

The real kicker here is that exploring is what drives the value of exploitation, and vice versa. You need to explore in order to find good paths for exploitation, and you need to exploit in order to get a reward for your exploration. Both actions are dependent on each other.

What you’re balancing in either case is opportunity cost. You have a limited amount of resources, such as time and money, to work with over the course of your life. If you explore, you’re by default not exploiting, and vice versa.

Consider this example: Let’s say you’re scrolling through Netflix, looking for something to watch for the next couple of hours.

You notice that a movie you’ve seen a dozen times is one of the choices and consider watching it. Right next to that is a movie you’ve never seen before.

Choosing the movie you’ve seen provides a specific emotional payoff for you. You know all the best parts and you’re well aware of how the entire experience will make you feel.

Choosing the movie you’ve never seen means taking a certain amount of risk. There’s an unknown payoff for watching this new movie, and it might end up being a waste of two hours. Those two hours will be gone, never to return.

But you might also discover a new favorite movie, or genre, or director, that you never knew about.

It’s easy to get sucked into either extreme. I’ve known people who spent their whole lives reading, accumulating a veritable library worth of knowledge in their head, but never tried to do anything with it.

And, of course, I’m sure we both know people who have never read a book or stopped to think for even a moment about whether their beliefs and actions should be altered in some way.

While this is an unsolved problem (and trust me, many people have tried to figure it out), there are some good rules of thumb to run with. First of all, don’t favor a binary approach. Only exploring or only exploiting doesn’t work in the long run.

Secondly, it pays to spend a lot of time exploring early on and then shift more and more toward exploitation over time. But — this is critical — you never stop exploring completely. For a person in the real world, exploring should always be part of your strategy.

There’s always some accommodation made for learning new things. In reinforcement learning, this schedule is known as the epsilon-decreasing strategy: you mostly exploit your best-known option, but keep some probability of trying something new, and shrink that probability over time. If you just want a simple heuristic for managing this dilemma, it’s a pretty good place to start.
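That heuristic can be sketched in a few lines of Python. This is a minimal multi-armed bandit simulation, and everything specific in it is illustrative rather than canonical: the decay schedule (epsilon = 10/t), the arm payoffs, and the noise model are all assumptions made for the example.

```python
import random

def epsilon_decreasing_bandit(true_payoffs, steps=1000, seed=0):
    """Simulate the epsilon-decreasing strategy on a simple bandit:
    explore (pull a random arm) with a probability that shrinks over
    time, otherwise exploit the arm with the best average so far."""
    rng = random.Random(seed)
    n_arms = len(true_payoffs)
    estimates = [0.0] * n_arms  # running average reward per arm
    counts = [0] * n_arms       # how often each arm was pulled
    for t in range(1, steps + 1):
        epsilon = min(1.0, 10.0 / t)  # heavy exploration early, decaying later
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)            # explore: random arm
        else:
            arm = estimates.index(max(estimates))  # exploit: best-known arm
        reward = true_payoffs[arm] + rng.gauss(0.0, 0.1)  # noisy payoff
        counts[arm] += 1
        # incremental update of the running average for this arm
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts
```

With true payoffs of, say, 0.2, 0.5 and 0.9, the early pulls get spread across all three arms, and later pulls concentrate almost entirely on the best one — exploration pays for itself by finding the arm worth exploiting, which is exactly the mutual dependence described above.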

Third, there are always inflection points where it makes sense to shift from one to the other. Sometimes it’s a moment where you realize you’ve finally reached a level of knowledge that grants you a new level of competence and the time to utilize it has come. Passing a professional exam is a simple example of this.

Other times you might suffer a bitter defeat and receive an unfiltered signal that it’s time to explore. If a big project you’ve been working on fails, for example, you might need to go back to the drawing board and evaluate how to improve for your next attempt.

I could talk about this for hours, but in general I want you to understand this: figuring out how to spread your time between exploration and exploitation is perhaps the most important problem you’ll ever face.

Don’t push this into the background — be conscious and deliberate about it. Doing that might just change your life in ways you never saw coming.

Photographic Memory

Ace Eddleman


Transcript:

Let’s talk about photographic memory, a topic that I’ve been asked about more times than I can count.

First, we need to get one thing out of the way: photographic memory is a myth. That’s right, nobody has ever been able to prove that they have a photographic memory.

The reasons for this are related to what I talked about in Why Your Brain is Lazy, namely that your brain is forced into making trade-offs because it uses so much energy all the time.

When your brain encounters a stimulus in the world, it makes a decision to either keep it (what’s called “encoding”) or get rid of it. This is based on whether your brain sees the stimulus in question as salient.

If it is salient, then the encoding process will probably kick off, and if it’s just an everyday, run-of-the-mill stimulus, then it won’t.

Your brain will always make this trade-off, and no amount of training can circumvent such a fundamental biological principle. It’s an evolved survival strategy designed to reduce the amount of energy that gets wasted during the memory formation process, and there’s no way to escape it.

The question, then, is: why do so many people believe that photographic memory is real?

A simplified answer is that popular media loves to use it as shorthand for high levels of intelligence, and most people don’t question that stereotype.

Elon Musk is often used as an example of the hyper-genius who possesses a photographic memory, but as far as I know that claim about his memory has never been tested.

The more complete answer is that there are people who do exhibit extraordinary memory abilities, and those abilities get mis-classified as photographic memory.

There are some people who could be called savants who have exhibited world-class memory abilities. Kim Peek, the inspiration for Dustin Hoffman’s character in Rain Man, could absorb incredible amounts of information in one sitting. Stephen Wiltshire is a savant who can recreate skylines in incredible detail after seeing them once.

These savants do have great memories, but there are two important qualifiers to consider: 1) their memories are never good enough to qualify as “photographic” (Stephen Wiltshire’s drawings contain many mistakes, for example), and 2) the memory abilities they possess appear to come at a huge cost, as they’re often unable to take care of basic everyday tasks.

So their increased capacity for memory is a trade-off that doesn’t appear to be beneficial to their survival, which says a lot about how finely evolution has tuned the standard memory algorithm.

Some people also exhibit what’s called hyperthymesia, or superior autobiographical memory. These people have an uncanny ability to remember the minute details of their day-to-day lives.

These individuals appear to have some kind of focused memory algorithm that doesn’t extend to their overall memory abilities. In other words, their brain prioritizes a specific type of information for encoding, but that benefit doesn’t carry over to any other facet of their memory.

One last example is the group of people who compete at memory competitions. Memory competitors do things like memorize entire decks of cards within a few minutes.

This is all accomplished with the use of what are called mnemonics, which are memory tricks that can be used to memorize (for short periods of time and with lots of practice) specific bits of information.

Nobody with a claimed photographic memory has ever won a world memory championship, which is hilarious since you’d think that’s where they’d show up. If you had a photographic memory, why not cash in on it?

Anyway, the general idea to take out of all this is that photographic memory doesn’t exist. You can improve your memory in specific ways with specific techniques, but overall you can’t get away from the fact that your memory automatically tosses most of what it encounters.

Why Your Brain is Lazy

Ace Eddleman


Transcript:

Let’s talk about why your brain is lazy.

The short explanation is that it’s lazy because it has to conserve as much energy as possible.

Energy in turn needs to be conserved because, even though your brain only represents about 2% of your total body mass, it burns through about 20% of your daily calories.

In other words, your brain’s an energy hog.

It’s a hog because it runs everything in your body, and keeping the human body running is, to put it lightly, a complex task.

Why is it so complex? Well, most of what you do is unconscious. Conscious thought only represents a small portion of the work your brain is doing.

For example, you don’t run your nervous system or maintain your internal organs with conscious thought. That’s all happening in the background, and all of it takes energy.

Your brain is never really turned “off” as a result, even when you think it is — it’s always busy managing something in your body.

This is why the whole “you only use 10% of your brain” myth is such a joke. Your brain is a hive of neuronal activity at all times because it’s busy running the whole system. This is true even when you’re asleep — even when you think you’re idle and nothing is happening, your brain is managing everything.

Anyway, because it’s so busy allocating resources all over your body, your brain has developed a long list of cognitive shortcuts as a means of saving energy.

Forgetting is the best example of a cognitive shortcut: your brain forgets most of what it encounters because it would be too energy-intensive to remember tons of useless data.

The brain is thus optimized to be, to borrow someone else’s terminology, a “change detector.”

Your brain’s lazy memory algorithm focuses on encoding new memories that are salient, and dropping everything that isn’t.

For example, you’ll remember if, during your daily drive to work, a zebra steps in front of your car even though you live in a major metropolitan area. That’s such an unusual event that it’s guaranteed you’ll remember it — it’s so salient that your brain will encode it as a memory.

This is why repetition is important in learning: you need to tell your brain (through repeated exposures) that a given stimulus is worth the energy expenditures involved in remembering it.

Your brain does this in order to take note of environmental cues that could potentially influence your survival. 

We suck up anything that’s unusual because big changes in our environment can be dangerous. A zebra running around in the street could be indicative of a problem at the local zoo, which in turn could mean there are dangerous animals running around that could eat you.

With all of these lazy shortcuts, your brain is making a trade-off of some kind. The brain is asking itself: “Is it worth pouring resources into this?” And if the answer is “no,” then the brain doesn’t prioritize that stimulus.

People get frustrated with these shortcuts, but they’re an indicator of a healthy brain. There are exceptions, like Alzheimer’s or CTE, but in general shortcuts like forgetting mean your brain is acting in accordance with its lazy nature.

The key idea to pull from all of this is that your brain does what it does for a good reason, and you won’t be able to learn or use your memory well if you don’t understand these internal dynamics. 

Much of learning revolves around finding ways to use your brain’s built-in mechanisms for your benefit, but there will always be limitations. Don’t get frustrated, just accept that your brain will never be perfect.

To put this all in perspective: supercomputers that take up entire floors of industrial buildings can’t touch the capabilities of the human brain. 

How is it that such vast amounts of electrical and computing power come up short when trying to handle tasks, like language, that we find trivial? This is one of the enduring mysteries of the brain. But it should clue you in to the fact that your brain, despite its laziness, is not as flawed as you might think.


Copyright 52 Aces & Ace Eddleman © 2021 · Log in
