The Four Horsemen of Dark Design Patterns

A common refrain is used weekly in our home. It keeps my kids from roughhousing in my office, gives my wife and me some degree of control, and, importantly, stops our kids from playing with my power tools.

Tool, not toy.

It's absolutely natural that whenever our two kids (2 and 6) see my drills, saws, or other metal implements, they are irresistibly drawn to them. Very much like moths to a flame, they want to "work like Dad". I love this, and am stressed out by it. One wrong swing of a hammer and we have a beautiful new memento of our children's curiosity, one we then have to patch with drywall, spackling, and paint. From these equal parts adoration and consternation was born the mantra "tool, not toy".

There's no real structure around the idea. It's been said with enough seriousness (and followed by enough sticker-chart penalties) that when I hand a wrench to my kid and say "tool", it is treated with respect and used with care. It's carried deliberately, it's returned to its place once it's been used, and it isn't used to distribute concussions to other children.

Respect and care.

What are tools? Some are obvious: knives, drills, saws, the oven, the fridge, etc. Some need more convincing: computer peripherals (keyboard, mouse), the TV remote, the TV itself, bookshelves, the printer, and yes, smartphones. With most tools the reason for the designation is apparent; some are easy to break, but most receive the label because improper use can cause serious damage to me or others. Information technology as tool, however, requires a special motivation for the designation.

Without proper vigilance, I can be harmed by others through my technology.

Seems a little odd at first. After all, don't I have agency? Can't I just decide what my technological "tool" is used for? Ultimately, yes; however, other parties (most often trillion-dollar tech companies) decide how our technology is used. By governing how we interact with our technology, app designers can subtly push us toward the behaviors that drive their business forward (e.g. making money), very often at our direct expense.

Why does that matter so much? Because companies like Instagram, YouTube, and others make more money the more time you spend with their software. They make this money because they are attention merchants, harvesting and selling our precious time to advertisers. The result? Massive increases in depression and anxiety. Harmful political polarization. The erosion of a shared reality.

All so they can show us another ad for some random shampoo we don't want.

That may seem more than a little melodramatic, I admit it, but it's true. Google and Meta are two of the most powerful and influential companies in the history of mankind, and they got there almost exclusively on billions of dollars in advertising revenue. But if they don't make the ads, what are they selling? Access to us. They sell what used to be our personal time, our moments of introspection, when we once contemplated meaning and import (or just had the freedom to zone out).

How do they get so much attention, reliably enough to build veritable empires? Farmers now use massive machines, tailored chemicals, and the occasional unethical practice to achieve dramatically larger yields than a century ago. The cultivators and harvesters of our attention likewise have exceptionally effective tools to induce compulsive (read: addictive) use of their technology.

These tools are dark design patterns.

The Four Horsemen

Among all the diverse design choices developers have when building their applications, four in particular stand out as having a sinister synergy leading to compulsive use. These four are: the feed, a curation algorithm, advertising as revenue, and false (or empty) value.

The Feed

The source and origin of "doom scrolling", the feed is an exquisitely tuned tool to keep users engaged while subtly siphoning literal years from our lives. Aza Raskin, the inventor of infinite scroll (where all "stopping cues" are removed and content constantly loads beyond where you are in the feed), estimates that this single invention steals more than 200,000 human lifetimes of attention per day. Paired with the other three dark design patterns, the feed is a money-making machine that almost all social media apps race to implement. If you see a previously useful or fun product add a feed, you know that "enshittification" has begun (I'm staring disappointedly at you, Perplexity, ChatGPT, WhatsApp, and many more).
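The contrast is easy to sketch in code. Below is a toy illustration (hypothetical names, no real platform's implementation): a paginated feed is finite and eventually ends, giving the user a stopping cue, while an infinite feed simply never stops producing content.

```python
import itertools

def paginated_feed(posts, page_size=10):
    """A feed with stopping cues: each page ends, and so does the whole list."""
    for start in range(0, len(posts), page_size):
        # the user must deliberately ask for the next page
        yield posts[start:start + page_size]

def infinite_feed(posts):
    """No stopping cues: content keeps loading forever,
    recycling whatever is available."""
    yield from itertools.cycle(posts)  # never raises StopIteration

posts = [f"post-{n}" for n in range(25)]

pages = list(paginated_feed(posts))  # 3 pages, then it ends
# the infinite feed happily serves 100, 1000, or a million items
endless_sample = list(itertools.islice(infinite_feed(posts), 100))
```

With only 25 posts available, the paginated version runs out after three pages; the infinite version will serve as many items as you are willing to scroll through.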

The Algorithm

Once you've removed stopping cues as an approach to keep individuals scrolling indefinitely, how do you decide what to fill that infinite feed with to keep your users engaged as long as possible? Enter the algorithm. Initially, Facebook's feed was chronological; you saw what your friends posted, in the order it was posted. Eventually, when you started to see things you had encountered the last time you were on the site, you knew you were "caught up". Now a billion-dollar algorithm determines precisely what to show you next to keep you engaged for as long as possible. Where feeds were initially chronological, now (as Mark Zuckerberg recently admitted in court testimony) more than 90% of our feeds are algorithmically curated.

These algorithms elegantly solve two problems facing the attention merchants. First, with a chronological feed, once a user has seen all the new content, they get a significant cue that they should leave the platform. The "curation" algorithm solves this by drawing on an effectively infinite pool of user- and creator-generated content, enabling a never-ending loop of dopamine doses delivered at precisely the right moment to keep the user engaged.

Second, trained on data from billions of users, these algorithms can be exquisitely tuned to optimize for any number of metrics measured on the website's backend. Should they optimize for, say, user enjoyment? Fulfillment? Happiness? No; none of those make that much money. They optimize for engagement, meaning, effectively, time spent interacting with their content. Why? Because that makes the most money. Definitely not because it's what's best for humanity, or youth, or the developing adolescent brain. It is exclusively because engagement drives enrichment.
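A minimal sketch of what "optimizing for engagement" means in practice. Everything here is hypothetical (the feature names, the weights, the scoring function stand in for real models trained on billions of interactions), but the shape is the point: the ranking objective is predicted time-on-site, not what the user would say they value.

```python
def rank_for_engagement(candidates):
    """Order candidate posts by predicted watch time.

    'predicted_seconds' is a hypothetical stand-in for a trained
    engagement model; note that nothing in the objective measures
    enjoyment, fulfillment, or the user's actual wellbeing.
    """
    def predicted_seconds(post):
        # outrage and novelty hold attention; a friend's post only helps a little
        return (post["outrage"] * 3.0
                + post["novelty"] * 1.5
                + post["from_friend"] * 0.5)

    return sorted(candidates, key=predicted_seconds, reverse=True)

candidates = [
    {"id": "friend_update", "outrage": 0.1, "novelty": 0.3, "from_friend": 1.0},
    {"id": "rage_bait",     "outrage": 0.9, "novelty": 0.2, "from_friend": 0.0},
    {"id": "meme",          "outrage": 0.3, "novelty": 0.8, "from_friend": 0.0},
]
feed_order = rank_for_engagement(candidates)
```

Under this toy objective, a stranger's rage-bait outranks a friend's update, which is exactly the complaint: the metric being maximized is engagement, and our relationships are just one weak feature among many.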

Ad-Driven Revenue

And where does all the enrichment come from? Advertising. The feed and the algorithm are akin to the massive farm equipment and pesticides enabling the enormous productivity of today's farms. The ads are analogous to the international network for shipping and selling the year's crop, enabling farmers in Kansas to sell their corn and soybeans across the entire world. Most importantly, however, the ads shape the incentive landscape beneath the attention merchants, effectively becoming their reason to exist in the first place.

If we were the customers of social media companies, their direct source of revenue, their products would cater to what we wanted to pay for. Instead, advertisers are the attention merchants' customers, we are their product, and the feed is their extraction tool. Advertising creates the attention merchant by producing an environment where attention trades for money. You can know, then, that if a product contains external ads in any way, its makers are incentivized not to improve the product for your betterment, but to engage you and draw you in compulsively.

The False Value

In the movie Coraline, a young girl, dissatisfied with her unhappy family dynamic, discovers a passage to a strange "mirror world". This mirror world seems to have everything right: her parents are happy, and all her worries seem to have gone away. Except something isn't quite right. Her parents have buttons for eyes (a little strange), and the whole place seems off.

Eventually we find out that (spoiler) those parents in the mirror world are puppets, fake versions of her parents, manipulated by a creature that wants to trap Coraline there forever.

Welcome to social media.

Mark Zuckerberg has for years described Facebook as a product encouraging social connection. And while that may have been true in the first few years of its existence, it is now far closer to the mirror world in Coraline than to a social connection engine. On the surface the pieces are still there: friend "invites", direct messages, "following", etc. But for how much time we spend on these platforms, how much actual connecting is done?

Even if the algorithm happens to let us see a post from someone we know (90%+ of our feeds are algorithmically curated now), what happens when we "like" or react to it? That like gets pooled by Facebook's algorithms, which then decide the best time to present it as a notification to our friend. Yes, that's right: our friend isn't just notified right away. Our "interactions" with our friends and contacts pass through a strange intermediary (the billion-dollar algorithm soup) so they can be leveraged at exactly the right moment to keep our connections as engaged with the platform as possible.
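The hold-and-release mechanic can be sketched in a few lines. This is a hypothetical illustration, not any platform's actual code: the `activity` table stands in for a per-user model of when a notification is most likely to pull someone back into the app.

```python
from datetime import datetime, timedelta

def schedule_notification(like_time, activity_model):
    """Instead of delivering a friend's 'like' immediately, hold it until
    the hour the recipient is most likely to be lured back to the app.

    'activity_model' maps hour-of-day to a predicted return probability
    (a hypothetical stand-in for the real engagement models).
    """
    best_hour = max(activity_model, key=activity_model.get)
    delivery = like_time.replace(hour=best_hour, minute=0,
                                 second=0, microsecond=0)
    if delivery <= like_time:
        delivery += timedelta(days=1)  # wait for tomorrow's prime hour
    return delivery

activity = {8: 0.2, 12: 0.4, 21: 0.9}  # evenings are prime re-engagement time
liked_at = datetime(2024, 5, 1, 9, 30)
delivered_at = schedule_notification(liked_at, activity)
```

In this toy run, a like left at 9:30 AM sits in the queue until 9 PM, when the recipient is deemed most susceptible to being drawn back in. The "connection" arrives on the platform's schedule, not the friend's.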

Similar to the button-eyed puppets in Coraline, when our relationships run through Facebook and other social media "cancers" (as Governor Cox of Utah put it), they are weaponized against us to keep us "engaged" on the platform. And the more time we spend looking at a lame, glowing screen, the less time we spend with our friends and family in real life.

This is where the last of our dark designs comes into play, the false value. Social media draws us in with promises of "connection", or fueling our hobbies; adding value to our lives in some way. But in the end, that little value it may add (say, finding a cool recipe idea) is dwarfed by the amount of life we trade for it. Hours a day, just scrolling along.

Every time we use social media, we should think of an anglerfish [insert picture from Finding Nemo]. They draw us in with something interesting, flashy even. But whatever it is, by design it can't actually be something satiating or useful to us, because once we had what we came for, we would leave. Instead they draw us in with the illusion of value: just enough to keep us eternally interested, but never satisfied.

Our Life is Social Media's Competition

There we have it. The "Four Horsemen" of dark design patterns:

The Feed.

The Algorithm.

The Ads.

The Value.

Separate, these patterns are rather innocuous. Together, their sinister synergy results in compulsive use and harvested attention; net harm. If a company implements all four of these dark design patterns, they do not have our best interest at heart. They are a shameless attention merchant. They should be avoided.

Social media claims to provide us with connection, with friendship, with value. But in the end our family, our friends, our hobbies are all the direct competition of the attention merchants. It's simple attention-merchant math: the more time we spend talking directly to our family, spending time with our friends, or pursuing our hobbies away from their sites, the less money they make. They don't want us to connect with anyone! Not unless we connect through their platform, to be exploited by their strange algorithms for their profit.

Ultimately, technology is a tool. The majority of technology is net good. Some technology implements dark design patterns to influence us in harmful ways. If we are aware and vigilant, watching out for what tools implement dark design patterns, we can use our technology with respect and care.

Tool, not toy.
