This is part of The Algorithmic Society, an ongoing series about how algorithms and algorithmic thinking have taken over the world. Want to know when new content shows up? Sign up for my newsletter here.
We’re drawn to clean solutions, no matter what form they take. Complexity frightens us by default, and in every domain of life we crave results we can forecast. Consistent patterns make us feel good; chaotic patterns make us run for the hills.
Pattern recognition is just that important to us. Patterns give us a sense of safety. Even if that pattern is bad (“Lions ate Fred when he walked near that tree, I should avoid that area”), it gives us a feeling of certainty that we find satisfying.
Algorithms free us from this fear of the unknown. They take us by the hand and whisper in our ear: “Don’t be afraid, I have the answer for you right here.”
This is more true today than it’s ever been. The algorithms we care about and interact with the most are designed to provide us with a safe haven in a disorderly world. They take the noise of a complex world and hand us something clean in return.
Consider sorting, one of the most common types of problems solved by algorithms.
If you crack open a textbook on algorithms, including the famous Introduction to Algorithms (aka “CLRS” after its authors), sorting is often the first major category to be explained. The reason is made clear in that textbook:
Because many programs use it as an intermediate step, sorting is a fundamental operation in computer science. [emphasis mine]
Defining the problem of sorting algorithms is simple: there’s some set of “disordered” items, and it is the job of the algorithm to make them “ordered.” Where things get interesting is in the definition of the solution.

This sounds trivial at first, and in many cases it is: there aren’t many implications to explore around how to sort a collection of files on your local hard drive, for example. There’s nothing political or philosophical to explore when you’re talking about text files arranged in alphabetical order.
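To make the triviality concrete, here’s a minimal Python sketch of that kind of file sorting. Nothing here is controversial, but notice where the decision lives: the key function is the definition of the solution.

```python
import os

# A minimal sketch of "utility" sorting: ordering whatever files happen
# to be in the current directory. The interesting part is the key
# function, because choosing the key is choosing what "ordered" means.

files = os.listdir(".")

alphabetical = sorted(files, key=str.lower)
newest_first = sorted(files, key=os.path.getmtime, reverse=True)

print(alphabetical)
print(newest_first)
```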
But when you start to look at real-world sorting algorithms, a picture emerges of just how much impact they can have on the world we live in.
Facebook, Google, Amazon, and all the other major algorithm-based tech firms out there can be described in some sense as the great sorters of our time. Facebook sorts your social ties, Google sorts the web’s massive content database, Amazon sorts products, and so on.
When you log in to Netflix, for example, you aren’t looking at the entire collection of films and shows they have available. Instead, the platform automatically sorts content based on what their algorithms think you’re most interested in.
This algorithmic work is performed in a way that saves humans time and energy, and in return someone gets charged for it. That “someone” is often not the front-end users of the algorithms, but instead those (such as advertisers) who wish to insert themselves into the sorted output set in some way.
Sorting Out Incentives
What are these algorithms sorting for? Again, in trivial examples like sorting files on your local hard drive, it’s not that complicated. You’re probably sorting in alphabetical order, or last modified date, or some other mundane sorting mechanism.
You could say this kind of sorting is utility-based. Making sure your files are in alphabetical order is done not because it accomplishes external objectives for a third-party, but because it makes your file system easier to navigate. It’s the computer doing what it does best: saving your precious brain cycles for something more interesting.
That’s not the case with a platform like Netflix. While it is possible for them to sort all of their content alphabetically and then present you with that whole library, it’s unlikely anyone wants that (aside from the most fringe film buffs). Their goal is to provide you with a small slice that appeals just to your tastes.
There is some straight-ahead utility in the type of sorting they provide, because they’re presenting content in a way that allows you to avoid the process of wading through that massive library on your own. But, in a theme that pops up over and over again with valuable algorithms, there’s a trade-off being made that most users aren’t aware of.
Every minute you watch Netflix, Netflix is watching you (and the same could be said for any major content-sorting algorithm, such as YouTube’s). Each data point it collects allows it to refine its sorting algorithm just a little bit more.
This is done under the aegis of giving you the “most relevant content” (a theme you’ll hear about over and over again from algorithmic companies), which, as I stated before, is sort of true. You don’t want to have to sort through it all yourself, but you also have to wonder where the recommendations come from.
It turns out that what they’re really sorting for is time-on-platform (also known as “engagement”). In other words, the algorithm sorts based on what will keep your eyeballs on the screen the longest.
That could be almost anything — Ancient Aliens, anti-vaccination conspiracy theories, Steven Seagal movies — as long as it results in additional time-on-platform. To use fundamental language, the “problem” these sorting algorithms are “solving” is lack of engagement.
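Here’s a hypothetical sketch of what that looks like in code. The titles and scores are invented, and real recommendation systems are vastly more complicated, but the essential move is the same: the sort key is predicted engagement.

```python
# A hypothetical sketch of engagement-based sorting. The titles, scores,
# and the "predicted_watch_minutes" field are invented for illustration.
# Real platforms use far more elaborate prediction models, but the shape
# is the same: the sort key is predicted time-on-platform, not "best for
# the viewer."

catalog = [
    {"title": "Documentary A", "predicted_watch_minutes": 22},
    {"title": "Ancient Aliens", "predicted_watch_minutes": 47},
    {"title": "Steven Seagal movie", "predicted_watch_minutes": 31},
]

# "Relevance" here is just engagement, sorted highest first.
recommendations = sorted(
    catalog,
    key=lambda item: item["predicted_watch_minutes"],
    reverse=True,
)

for item in recommendations:
    print(item["title"])
```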
It’s at this point that we start to see how our love of pattern recognition can come back to bite us. We engage with the content that’s recommended to us, mostly because it fits patterns we find appealing. The algorithms (which are patterns themselves, albeit of a different nature) then exploit our engagement with those patterns to shape new, more potent patterns for us to engage with.
There’s also some degree of sorting that’s required in order for search algorithms (another powerful class of algorithm) to be productive. If all you got was pages of fake Viagra ads because someone figured out how to game the search algorithm (which was a problem in the early days of search), then the engine would be useless.
But there are more third-party incentives built into search as well. When we search through Google, we aren’t searching through the whole of the Web. Instead, we’re searching through their index of the web (a sorted list) that they own and control.
What do they do with that index? They present search results that play most favorably to A) Google’s advertising system, and B) those who have optimized for Google’s indexing system (a practice known as Search Engine Optimization, or SEO). Their users feed data into the system, and advertisers find ways to insert themselves into the sorted index.
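A toy model makes the mechanics clear. Everything below is invented for illustration, but it captures the pattern: the owner controls the index’s ordering, and paid entries get inserted into the sorted output.

```python
# A toy model of the pattern described above: you query an index the
# company owns, and paid entries are inserted into the sorted output.
# The index, the ads, and the query are all invented for illustration.

index = {
    # organic results, pre-sorted by the owner's own ranking rules
    "running shoes": ["site-a.example", "site-b.example", "site-c.example"],
}

ads = {
    # advertisers who paid to appear in the sorted output
    "running shoes": ["sponsored-shoes.example"],
}

def search(query: str) -> list[str]:
    organic = index.get(query, [])
    paid = ads.get(query, [])
    return paid + organic  # paid entries land at the top of the sort

print(search("running shoes"))
```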
Sorting Out Ownership
One could argue that some price should be paid for this search/filter/sort mechanism, but that’s not the point. What I want you to focus on is the fact that this algorithm is not some unbiased machine giving us the absolute best results — it gives us a “good enough” result that panders to the incentives of the algorithm’s owners.
This means that whoever owns the algorithm owns the sorting mechanism that their users plug into. In this way, algorithms act as powerful leverage points for controlling perceptions.
The most famous example of this is how Facebook and Twitter have impacted political discussions. Because engagement is the defined solution for both platforms’ sorting algorithms, the sorted outputs create an engineered reality that optimizes towards that goal. Whether that goal is good for the user (or society) is not built into the specifications of the algorithm.
But the reach of sorting algorithms goes beyond the common, borderline-cliché discussions of how social media has changed the political landscape. They’re now used not just in those platforms, but for more mundane (and in some ways, more powerful) tasks.
Think about how sorting impacts how companies hire people, for example. The competition for jobs is fierce in almost every field, and human resources departments are overwhelmed by the sheer volume of CVs they receive for any given opening.
This is a serious problem: companies hire because they require labor to solve specific sets of problems, and the world is filled with people who need jobs. They need some way to shortcut the process of excluding and filtering people so that they can hire the “right” person — in other words, they need a sorting algorithm.
Once again, how the solution is defined matters. If a company wants to cut out a large number of candidates, regardless of actual skill, they can set a high bar with educational requirements.
Accomplishing this is now a matter of setting the parameters for their resume sorting algorithm. If a candidate does not mention any sort of university participation on the CV, they are sorted into the “round file.” Any sort of tangential value the candidate might bring to the table isn’t explored, because the algorithm’s specifications don’t include values like that.
There’s quite a bit of granularity that can be built into this. Maybe a degree isn’t enough; maybe the employer wants specific GPAs or post-graduate credentials. Again, with minor tweaks to the algorithm, a large portion of the resumes being sent in gets thrown out.
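A hypothetical sketch of such a screen might look like this. The field names and thresholds are made up, but they show how few parameters it takes to define the “right” candidate:

```python
# A hypothetical resume screen. The field names and thresholds are
# invented; the point is that "the right candidate" is defined entirely
# by a few parameters, and everything else about the person is ignored.

MIN_GPA = 3.5
DEGREE_REQUIRED = True

def passes_screen(resume: dict) -> bool:
    """Sort a candidate into the shortlist or the round file."""
    if DEGREE_REQUIRED and not resume.get("has_degree", False):
        return False
    if resume.get("gpa", 0.0) < MIN_GPA:
        return False
    return True

candidates = [
    {"name": "A", "has_degree": True, "gpa": 3.8},
    {"name": "B", "has_degree": False, "gpa": 4.0},  # self-taught: round file
    {"name": "C", "has_degree": True, "gpa": 3.1},   # below threshold: round file
]

shortlist = [c for c in candidates if passes_screen(c)]
print([c["name"] for c in shortlist])  # only "A" survives
```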
What the employer is left with is a small, exclusive pool of candidates who fit exactly what they’re looking for. Whether their hastily thrown-together, biased assessments of what the “right fit” looks like are actually what they need is simply assumed. The algorithm can’t disagree; it just sorts.
This is how we’ve created a corporate landscape built out of yes-men (and yes-women, to be fair) who can’t think for themselves. Our internal algorithms about who the “right” people are tend to be defined by easy-to-spot, surface-level features like credentials and GPAs. Then we tell our resume sorting algorithms to optimize for those flawed perceptions and they manifest into real-world consequences.
It could likewise be argued that credit scores represent a sort of societal sorting algorithm. The problem to be solved is determining what level of creditworthiness an individual fits into, and the solution is a specific number.
Those with strong credit scores tend to get sorted into the most preferable buckets by banks, employers and a variety of other institutions. People with low credit scores get sorted into pathways that leave them with limited, frustrating options.
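As a rough sketch (with invented cutoffs, since every lender draws its own lines), the mechanism looks something like this:

```python
# A sketch of score-based bucketing. The cutoffs are invented (real
# lenders use their own tiers), but the mechanism is the point: a whole
# person collapses into one number compared against thresholds.

def credit_bucket(score: int) -> str:
    if score >= 740:
        return "preferred"   # best rates, most options
    if score >= 670:
        return "standard"
    if score >= 580:
        return "subprime"    # limited, expensive options
    return "declined"

for score in (810, 700, 600, 480):
    print(score, credit_bucket(score))
```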
Once again, sorting is at work and the consequences of largely computerized algorithmic processes are very real. And, of course, how we define the solutions to these sorting problems is left unquestioned for the most part. It’s taken as a given that people without college degrees or with bad credit are less worthy of society’s benefits, therefore we let these algorithms “do their thing.”
Instinctual Sorting
There are some qualifications that need to be made on this subject. For one, sorting is a fundamental feature of our cognition. We need to sort the people and objects in our own environments into categories like safe, dangerous, and so on.
For example, our brain has a built-in sorting mechanism that determines what goes into our memory and what gets forgotten. The signal used by this algorithm is salience, and anything that doesn’t hit a certain threshold of novelty is sorted into our cognitive garbage cans.
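If you wanted to caricature that mechanism in code (and it is a caricature, not neuroscience), it’s just a threshold filter:

```python
# A loose, metaphorical sketch of the salience filter described above.
# This is not neuroscience, just the thresholding pattern: the scores
# and the cutoff are invented.

SALIENCE_THRESHOLD = 0.7

events = [
    ("lion near the tree", 0.95),
    ("what I had for lunch last Tuesday", 0.2),
    ("first day at a new job", 0.8),
]

remembered = [event for event, salience in events
              if salience >= SALIENCE_THRESHOLD]
# everything below the threshold lands in the cognitive garbage can
print(remembered)
```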
This inclination is a common thread in how we build our societies. We sort people by class, income, political leanings, race, and a nearly infinite number of other feature sets.
On that basis it’s clear that the sorting algorithms themselves aren’t always the problem. They are often tools generated by our own inclination to order our environments, and they are in many cases useful. But, like a hammer that can either drive nails or break skulls, algorithms can also be used as weapons.
It’s possible to create an algorithm with good intentions and end up creating serious problems. There’s a strange irony in algorithms and all the other manifestations of our need for order: in many cases, they create more disorder.
And, in line with our general misuse of algorithmic language, we don’t often talk about sorting as sorting. We use language that sweeps the downsides under the rug, such as “optimization” and “curation.” Words like “exclusion” don’t come up much, even if that’s what we get.
Sorting Out This Post
What can we learn from all of this? To my mind, the most important takeaway is that sorting is perhaps the most important type of algorithm out there. There are many different, ever-evolving ways to sort, but the end goals tend to fall into a narrow range.
Sorting is becoming more powerful and more useful as time goes on. This is because the world is becoming more complex by the day, and sorting through the noise in novel ways — especially in ways that can make money — is valuable. Efficient sorting is, and will continue to be, worth billions of dollars.
From this point forward, whoever owns and controls the sorting algorithms will win. The world is being divided into the sorters and the sorted, and it’s becoming obvious that the sorted are losing ground in every walk of life.
Recognizing how sorting grants power to certain people and organizations is therefore a key skill in today’s algorithmic society. You should be capable of recognizing when sorting is occurring and work to understand the “who” and “why” behind it in a variety of contexts.