52 Aces


Learning, competition and capitalism


Technology

The Power of Sorting Algorithms

Ace Eddleman

This is part of The Algorithmic Society, an ongoing series about how algorithms and algorithmic thinking have taken over the world. Want to know when new content shows up? Sign up for my newsletter here.

We’re drawn to clean solutions, no matter what form they take. Complexity is frightening to us by default, and we crave results that we can forecast in all domains of life. Consistent patterns make us feel good, chaotic patterns make us run for the hills.

Pattern recognition is just that important to us. Patterns give us a sense of safety. Even if that pattern is bad (“Lions ate Fred when he walked near that tree, I should avoid that area”), it gives us a feeling of certainty that we find satisfying.

Algorithms free us from this fear of the unknown. They take us by the hand and whisper in our ear: “Don’t be afraid, I have the answer for you right here.”

This is more true today than it’s ever been. The algorithms we care about and interact with the most are designed to provide us with a safe haven in a disorderly world. They take the noise of a complex world and hand us something clean in return.

Consider sorting, one of the most common types of problems solved by algorithms.

If you crack open a textbook on algorithms, including the famous Introduction to Algorithms (aka “CLRS,” after the initials of its authors), sorting is often the first major category to be explained. The reason is made clear in the book itself:

Because many programs use it as an intermediate step, sorting is a fundamental operation in computer science. [emphasis mine]

Defining the problem of sorting algorithms is simple: there’s some set of “disordered” items, and it is the job of the algorithm to make them “ordered.” Where things get interesting is in the definition of the solution.
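To make the problem concrete, here’s a minimal sketch in Python of one classic solution, insertion sort (typically the first algorithm such textbooks walk through): disordered items go in, ordered items come out.

```python
def insertion_sort(items):
    """Turn a 'disordered' sequence into an 'ordered' list."""
    result = list(items)  # copy, so the input is left untouched
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        # Shift larger elements right until key's slot is found
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```

Nothing political or philosophical yet: the “solution” here is simply ascending numeric order.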

[Figure: an animated sorting algorithm in action. Isn’t this satisfying to watch?]

This sounds trivial at first, and in many cases it is: there aren’t many implications to explore around how to sort a collection of files on your local hard drive, for example. There’s nothing political or philosophical to explore when you’re talking about text files arranged in alphabetical order.

But when you start to look at real-world sorting algorithms, a picture emerges of just how much impact they can have on the world we live in.

Facebook, Google, Amazon, and all the other major algorithm-based tech firms out there can be described in some sense as the great sorters of our time. Facebook sorts your social ties, Google sorts the web’s massive content database, Amazon sorts products, and so on.

When you log in to Netflix, for example, you aren’t looking at the entire collection of films and shows they have available. Instead, the platform automatically sorts content based on what its algorithms think you’re most interested in.

This algorithmic work is performed in a way that saves humans time and energy, and in return someone gets charged for it. That “someone” is often not the front-end users of the algorithms, but instead those (such as advertisers) who wish to insert themselves into the sorted output set in some way.

Sorting Out Incentives

What are these algorithms sorting for? Again, in trivial examples like sorting files on your local hard drive, it’s not that complicated. You’re probably sorting in alphabetical order, or last modified date, or some other mundane sorting mechanism.

You could say this kind of sorting is utility-based. Making sure your files are in alphabetical order is done not because it accomplishes external objectives for a third-party, but because it makes your file system easier to navigate. It’s the computer doing what it does best: saving your precious brain cycles for something more interesting.
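A sketch of that utility-based sorting, using hypothetical filenames: the sort key (alphabetical order, or file type) is chosen purely for the user’s convenience, not for any third party.

```python
# Hypothetical files on a local drive
files = ["notes.txt", "Budget.xlsx", "archive.zip", "draft.TXT"]

# Alphabetical, ignoring case -- the "mundane" default
by_name = sorted(files, key=str.lower)

# Or group by extension first, then by name
by_type = sorted(files, key=lambda f: (f.rsplit(".", 1)[-1].lower(), f.lower()))

print(by_name)  # → ['archive.zip', 'Budget.xlsx', 'draft.TXT', 'notes.txt']
```

Whichever key you pick, the beneficiary of the ordering is you, the person doing the navigating.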

That’s not the case with a platform like Netflix. While it is possible for them to sort all of their content alphabetically and then present you with that whole library, it’s unlikely anyone wants that (aside from the most fringe film buffs). Their goal is to provide you with a small slice that appeals just to your tastes.

There is some straight-ahead utility in the type of sorting they provide, because they’re presenting content in a way that allows you to avoid the process of wading through that massive library on your own. But, in a theme that pops up over and over again with valuable algorithms, there’s a trade-off being made that most users aren’t aware of.

Every minute you watch Netflix, Netflix is watching you (and the same could be said for any major content-sorting platform, such as YouTube). Each data point it collects allows it to refine its sorting algorithm just a little bit more.

This is done under the aegis of giving you the “most relevant content” (a theme you’ll hear about over and over again from algorithmic companies), which, as I stated before, is sort of true. You don’t want to have to sort through it all yourself, but you also have to wonder where the recommendations come from.

It turns out that what they’re really sorting for is time-on-platform (also known as “engagement”). In other words, the algorithm is geared towards whatever will keep your eyeballs on the screen the longest.

That could be almost anything — Ancient Aliens, anti-vaccination conspiracy theories, Steven Seagal movies — as long as it results in additional time-on-platform. To use fundamental language, the “problem” these sorting algorithms are “solving” is lack of engagement.
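As a sketch (with entirely made-up titles and scores), an engagement-driven sort might look like this: the catalog is ordered not by quality or relevance, but by a model’s predicted time-on-platform for one particular user.

```python
# Hypothetical catalog; scores stand in for a model's predicted
# minutes of watch time for a specific user.
catalog = [
    {"title": "Documentary A", "predicted_minutes": 12.0},
    {"title": "Ancient Aliens", "predicted_minutes": 95.5},
    {"title": "Steven Seagal movie", "predicted_minutes": 88.0},
]

# The "problem" being solved is lack of engagement, so the sort key
# is predicted watch time -- descending, strongest bait first.
feed = sorted(catalog, key=lambda c: c["predicted_minutes"], reverse=True)
print([c["title"] for c in feed])
```

Swap the key function and you swap the platform’s definition of a “good” recommendation; the user never sees that choice being made.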

It’s at this point that we start to see how our love of pattern recognition can come back to bite us. We engage with the content that’s recommended to us, mostly because it fits patterns we find appealing. The algorithms (which are patterns themselves, albeit of a different nature) then exploit our engagement with those patterns to shape new, more potent patterns for us to engage with.

There’s also some degree of sorting that’s required in order for search algorithms (another powerful class of algorithm) to be productive. If all you got was pages of fake Viagra ads because someone figured out how to game the search algorithm (which was a problem in the early days of search), then the engine would be useless.

But there are more third-party incentives built into search as well. When we search through Google, we aren’t searching through the whole of the Web. Instead, we’re searching through their index of the web (a sorted list) that they own and control.

What do they do with that index? They present search results that play most favorably to A) Google’s advertising system, and B) Google’s ranking system (the target of Search Engine Optimization, or SEO). Their users feed data into the system, and advertisers find ways to insert themselves into the sorted index.

Sorting Out Ownership

One could argue that some price should be paid for this search/filter/sort mechanism, but that’s not the point. What I want you to focus on is the fact that this algorithm is not some unbiased machine giving us the absolute best results — it gives us a “good enough” result that panders to the incentives of the algorithm’s owners.

This means that whoever owns the algorithm owns the sorting mechanism that their users plug into. By doing this, algorithms act as powerful leverage points for controlling perceptions.

The most famous example of this is how Facebook and Twitter have impacted political discussions. Because engagement is the defined solution for both platforms’ sorting algorithms, the sorted outputs create an engineered reality that optimizes towards that goal. Whether that goal is good for the user (or society) is not built into the specifications of the algorithm.

But the reach of sorting algorithms goes beyond the common, borderline-cliché discussions of how social media has changed the political landscape. They’re now used not just on these platforms, but for more mundane (and in some ways more powerful) tasks.

Think about how sorting impacts how companies hire people, for example. The competition for jobs is fierce in almost every field, and human resources departments are overwhelmed by the sheer volume of CVs they receive for any given opening.

This is a serious problem: companies hire because they require labor to solve specific sets of problems, and the world is filled with people who need jobs. They need some way to shortcut the process of excluding and filtering people so that they can hire the “right” person — in other words, they need a sorting algorithm.

Once again, how the solution is defined matters. If a company wants to cut out a large number of candidates, regardless of actual skill, they can set a high bar with educational requirements.

Accomplishing this is now a matter of setting the parameters for their resume sorting algorithm. If a candidate does not mention any sort of university participation on the CV, they are sorted into the “round file.” Any sort of tangential value the candidate might bring to the table isn’t explored, because the algorithm’s specifications don’t include values like that.

There’s quite a bit of granularity that can be built into this. Maybe a degree isn’t enough; maybe the employer wants specific GPAs or post-graduate credentials. Again, a few minor tweaks to the algorithm and a large portion of the resumes being sent in gets thrown out.
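A sketch of such a credential filter, with hypothetical candidates and cutoffs, shows how a one-line “minor tweak” reshapes the whole pool:

```python
# Hypothetical candidate records and an intentionally crude
# credential filter of the kind described above.
candidates = [
    {"name": "Avery", "degree": True,  "gpa": 3.9},
    {"name": "Blake", "degree": False, "gpa": None},  # self-taught
    {"name": "Casey", "degree": True,  "gpa": 3.1},
]

MIN_GPA = 3.5  # a "minor tweak" that discards another slice of the pool

shortlist = [c for c in candidates
             if c["degree"] and c["gpa"] is not None and c["gpa"] >= MIN_GPA]
round_file = [c for c in candidates if c not in shortlist]

print([c["name"] for c in shortlist])  # → ['Avery']
```

Blake’s tangential value never enters the computation, because the algorithm’s specification has no field for it.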

What the employer is left with is a small, exclusive pool of candidates who fit exactly what they’re looking for. Whether their hastily thrown-together, biased assessment of what the “right fit” looks like is actually what they need is simply assumed. The algorithm can’t disagree; it just sorts.

This is how we’ve created a corporate landscape built out of yes-men (and yes-women, to be fair) who can’t think for themselves. Our internal algorithms about who the “right” people are tend to be defined by easy-to-spot, surface-level features like credentials and GPAs. Then we tell our resume sorting algorithms to optimize for those flawed perceptions and they manifest into real-world consequences.

It could likewise be argued that credit scores represent a sort of societal sorting algorithm. The problem to be solved is determining what level of creditworthiness an individual fits into, and the solution is a specific number.

Those with strong credit scores tend to get sorted into the most preferable buckets by banks, employers and a variety of other institutions. People with low credit scores get sorted into pathways that leave them with limited, frustrating options.
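That bucketing can be sketched in a few lines. The thresholds below are hypothetical (every lender picks its own cutoffs), but the structure is the point: one number in, one life-shaping category out.

```python
def credit_bucket(score):
    """Sort an individual into a bucket. Thresholds here are
    hypothetical -- real lenders use their own cutoffs."""
    if score >= 740:
        return "preferred"
    if score >= 670:
        return "standard"
    return "limited options"

# Hypothetical applicants
applicants = {"A": 812, "B": 701, "C": 585}
buckets = {name: credit_bucket(score) for name, score in applicants.items()}
print(buckets)
```

Notice that nothing in the function asks why a score is low; the input number is the whole story as far as the algorithm is concerned.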

Once again, sorting is at work and the consequences of largely computerized algorithmic processes are very real. And, of course, how we define the solutions to these sorting problems is left unquestioned for the most part. It’s taken as a given that people without college degrees or with bad credit are less worthy of society’s benefits, therefore we let these algorithms “do their thing.”

Instinctual Sorting

There are some qualifications that need to be made on this subject. For one, sorting is a fundamental feature of our cognition. We need to sort the people and objects in our individual environments into categories like “safe,” “dangerous,” and so on.

For example, our brain has a built-in sorting mechanism that determines what goes into our memory and what gets forgotten. The signal used by this algorithm is salience, and anything that doesn’t hit a certain threshold of novelty is sorted into our cognitive garbage cans.

This inclination is a common thread running through how we build our societies. We sort people by class, income, political leanings, race, and a nearly infinite number of other feature sets.

On that basis it’s clear that the sorting algorithms themselves aren’t always the problem. They are often tools generated by our own inclination to order our environments, and they are in many cases useful. But, like a hammer that can either drive nails or break skulls, algorithms can also be used as weapons.

It’s possible to create an algorithm with good intentions and end up creating serious problems. This is a strange sort of irony that’s created by algorithms and all the other manifestations of our need for order: in most cases, they create more disorder.

And, in line with our general misuse of algorithmic language, we don’t often talk about sorting as sorting. We use language that sweeps the downsides under the rug, such as “optimization” and “curation.” Words like “exclusion” don’t come up much, even if that’s what we get.

Sorting Out This Post

What can we learn from all of this? To my mind, the most important takeaway is that sorting is perhaps the most important type of algorithm out there. There are many different, ever-evolving ways to sort, but the end goals tend to fall into a narrow range.

Sorting is becoming more powerful and more useful as time goes on. This is because the world is becoming more complex by the day, and sorting through the noise in novel ways — especially in ways that can make money — is valuable. Efficient sorting is, and will continue to be, worth billions of dollars.

From this point forward, whoever owns and controls the sorting algorithms will win. The world is being divided into the sorters and the sorted, and it’s becoming obvious that the sorted are losing ground in every walk of life.

Recognizing how sorting grants power to certain people and organizations is therefore a key skill in today’s algorithmic society. You should be capable of recognizing when sorting is occurring and work to understand the “who” and “why” behind it in a variety of contexts.

How to Talk About Algorithms

Ace Eddleman


This invasion of one’s mind by ready-made phrases can only be prevented if one is constantly on guard against them, and every such phrase anaesthetizes a portion of one’s brain. 

-George Orwell, Politics and the English Language

The term “algorithm” has become the latest linguistic tool for sounding sophisticated when talking about technology, a sort of TED Talk-esque shortcut to identifying with the Silicon Valley set. There’s something about the word itself that mystifies the average mind and imbues the user with an air of sophistication.

When someone at a cocktail party starts using the word “algorithm,” it becomes evident that this person is well-read and keeping up with the times. In other words, it’s become another way to signal status by leveraging the appearance of technical knowledge.

The word itself has become a stand-in for the god-like power that the largest internet platforms hold over our daily lives, a device for describing the black boxes behind behemoths like Google and Facebook. How we use the term “algorithm” hints at a sort of cowed awe at the sheer magnitude of their impact on the modern world.

We don’t know how they operate or who really pulls the levers (I imagine someone thinking are there levers on algorithms? as they read this) behind the curtain. Owners of algorithms are fine with this, because it makes their lives easier — they get to keep trade secrets to themselves, and they’re given a veneer of respectability in the process.

Ian Bogost described this dynamic best:

The next time you hear someone talking about algorithms, replace the term with ‘God’ and ask yourself if the meaning changes.

-Ian Bogost

Part of this has to do with the connection between algorithms and the gargantuan, public fortunes they’ve created in the era of high technology. There’s a new sort of American dream associated with algorithms, most famously captured in the movie The Social Network.

There’s a scene in that film where Eduardo Saverin and Mark Zuckerberg (played by Andrew Garfield and Jesse Eisenberg, respectively) are working on an algorithm by writing on a window.

[Figure: Eduardo Saverin (Andrew Garfield) works on an algorithm with Mark Zuckerberg (Jesse Eisenberg)]

While the algorithm in this scene is for a hacked-together pet project, the implication of the context is clear: algorithms with humble beginnings can conquer the world. Multiple billion-dollar fortunes were spawned by this primordial bit of mathematics that got translated into code.

Now algorithms are all over our cultural landscape, which is itself dominated by business narratives and the investors who drive them.

For example, it is now almost unimaginable to create a startup that doesn’t offer some kind of algorithm that’s at least a minor improvement over an existing one. Not only is it not cool, but venture capitalists tend to shy away from such businesses because they lack scale.

In short, algorithms are ever-present in our social media, on our phones, and in every domain imaginable that involves what could be considered “technology.” Algorithms are everywhere, algorithms are in everything, algorithms are everything.

Marc Andreessen famously said that “software is eating the world,” but I would argue that the more accurate phrasing is “algorithms have already eaten the world.”

Defining Algorithms

[Figure: an algorithm in action]

There’s an odd contradiction at the heart of our views on algorithms. On the one hand, we all seem to understand (to varying degrees) that algorithms play an outsized role in our lives. On the other, nobody seems to know what the word “algorithm” means.

What’s even weirder about this is that this definitional problem isn’t confined to the tech-illiterate: even computer scientists don’t have a generally-accepted definition of what an algorithm is.

It’s a long-standing debate, and some even say it’s not possible to sort algorithms from non-algorithms because of a computer science concept known as the halting problem.

This ambiguity can be viewed as a positive or a negative, depending on the algorithm and how you relate to it. But that’s all part of a larger exploration that I’ll get into later. For now, we need to come up with some kind of starting point for discussing algorithms that makes sense.

Since there isn’t a single, unified definition of an algorithm, we’ll have to use a simplified (and therefore flawed) one for now:

A set of well-defined steps for solving a specific class of problems.

What’s nice about this definition is that it gives us the ability to generalize the idea of an algorithm beyond its mathematical and computational origins. We can use it to describe any process that’s designed to operate in a repeatable, predictable manner.

We could say, for example, that low-tech activities like cooking involve the use of algorithms. After all, a recipe is a set of unambiguous steps (add ½ cup sugar, bake for 25 minutes, etc.) for “solving” specific food-related problems (how to convert ingredients into food, which in turn solves the problem of being hungry, etc.).

Bureaucratic procedures at a large company or government organization are also algorithmic if we run with this definition.

An employee can be seen as a sort of algorithm as well: their whole job is to perform a specific set of tasks in order to accomplish goals for the organization. They do this by executing an algorithm for each problem they’re presented with.

A Transformative Force

It’s also worth adding another dimension to this definition:

Algorithms transform some set of input values into a desired output or set of outputs.

This is key to understanding all things algorithmic. Algorithms transform what they take in and generate something novel in the process.
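The input/output view can be shown in miniature. In this sketch (with made-up documents and scores), the input is an unordered bag of ratings; the output, a ranking, is a new artifact that didn’t exist anywhere in the input itself.

```python
# Hypothetical relevance scores for three documents
ratings = {"doc_a": 2.1, "doc_b": 4.8, "doc_c": 3.3}

# Transform the unordered input into something novel: a ranking
ranking = sorted(ratings, key=ratings.get, reverse=True)
print(ranking)  # → ['doc_b', 'doc_c', 'doc_a']
```

The function didn’t just shuffle data around; it manufactured an ordering, and that ordering is the product.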

In some sense, this is the most important way to look at algorithms. It makes you realize that there’s some objective involved, that what the algorithms are creating isn’t just math — they’re machines of creation.

Algorithms aren’t just transforming inputs from computer systems, either. As they integrate more and more with physical objects (including people), they are using the real world itself as a set of inputs and creating new landscapes in their wake.

Now we can start to unravel the linguistic consequences of using the word “algorithm” the way we do. By talking about algorithms as our digital overlords — never to be questioned or examined in a meaningful way — we hand them power they don’t deserve.

Even though they’re built by high-caliber computer scientists, these algorithms are still designed and operated by humans. This means they’re flawed in countless ways, and even our most powerful computers can’t rid themselves of their designers’ human errors.

When you embrace that fact, it becomes clear that talking about algorithms with a glint of admiration in our eyes is often a mistake. While some are worthy of praise, quite a few are far more fragile, inaccurate and exploitable than their owners would like you to know.

More than anything, we need to get rid of this idea that algorithms are simply hand-waving mechanisms for explaining new technology. Algorithms are real, they serve specific purposes and it’s possible to at least begin to understand how they operate if you equip yourself to probe them.

Their impact, despite their flawed nature, is large. We’ve built, and continue to build, an algorithmic society, and it’s simply irresponsible to treat algorithms in such a haphazard way. It is the duty of every intelligent, capable adult in this new world to get a handle on what algorithms are and how they are shaping our world.

And this starts by learning to talk about algorithms not as magical code-driven dragons. It starts by seeing how they’re infiltrating not just our computers, but our very identities, our everyday existence.

They are fractal, spinning themselves through increasing levels of abstraction as they generate billions of dollars and shift our personal lives in ways that even their creators often don’t understand.

Algorithms are, in short, the most important topic of the modern era. It is my goal with this series to give you a glimpse into just how large of an impact they’re having, and then provide you with the tools you need to navigate this world more intelligently.

There will be more to learn, but consider this the starting point.

How to Survive Automation

Ace Eddleman

There has been quite a bit of concern about the rise of automation lately, and it’s been a key component of the rise of nationalist, protectionist politics. Jobs have been getting scarce in some parts of the world, and it’s important that people have the skills to recognize automation trends and adapt accordingly.


11 Ideas for Monetizing Quora

Ace Eddleman


I’ve been stuck in Iceland for the last 18 days, and as a result I’ve had a lot more free time on my hands than I normally do.

Today my sister’s husband, Chris, and I were bullshitting around and started talking about Quora.


The Value of Information

Ace Eddleman

Unless you’ve been living under a rock for the last 20 years, you’ve no doubt heard all about the benefits of living in “The Information Age.” It’s framed as a wondrous, liberating time to be alive, when there are no restrictions on what you can discover or create.

While I don’t think this is exactly wrong, I do think that there are some serious misconceptions about how much information itself is worth.


Copyright 52 Aces & Ace Eddleman © 2021
