What if I told you that, like almost everyone else you know, you are likely a slave? You are trapped in a prison you cannot smell, taste or touch. A prison for your mind.
You’re reading this because you know something. What you know, you can’t explain – but you can feel it. You’ve felt it for years – that there’s something wrong with the world. You don’t know what it is, but it’s there. Like a splinter in your mind – driving you mad.
But unlike the Wachowski sisters’ Matrix, you weren’t born into this. Instead you silently opted into the largest sociology experiment ever created. You didn’t know that’s what you were doing – but it’s what you did. The prison you reside in is present on your smartphone, on your smart TV, your smart watch, smart thermostat, smart gym equipment, smart mattress. If it’s smart, likelihood is it’s all part of a gigantic cage made to your exact specifications.
Your bank, your health provider, your government. All these things conspire to trap you. And it sounds insane, like a conspiracy, but it’s terrifyingly real. You almost certainly already know information about you is used by services attached to all these things. Quite a lot of them wouldn’t be able to work without knowing something about you. What you maybe don’t understand is how they work together to trap you.
Like any manipulator, the process of entrapment is steady and calculated. I’ll be outlining three major forms this takes: influencing your movement and actions, disrupting your relationships and reshaping your identity, and finally, training you to actively resist breaking out of the system.
So you have a choice. You can stop now, go back to your day, lead a normal life.
Or keep reading, and we’ll see how far the rabbit hole goes.
Nearly 99% of test subjects accepted the program as long as they were given a choice
A recurring theme here will be the overlap of using and being used; a double movement of influence. A lot of the ways online networks affect your actions seem quite benign. You share your location with google for helpful directions. Instagram shows you a popular viral popup store and you decide to go. A location-based mobile game tells you of a high-value item nearby. You haven’t been forced to do anything, you were just interested in something and were given some gentle suggestions.
However, if you use Google’s “live view”, your camera feed is used to update their street view data. Meta creates a profile of you to know the exact time and place to show you an advert, or a post that is also an advert, to influence your actions off-platform. Niantic, the company that makes Pokémon Go, allows businesses to purchase the placement of significant game items near them to guarantee foot traffic.
But I’m not telling you anything new here am I? This is just business, nothing comes for free. If you don’t pay for the product, you are the product. This is the era of the platform economy.
The Platform
The word “Platform” is an interesting one – both a constant presence in our daily lives and strangely nebulous. A noun for a non-space, a void awaiting your presence. It’s worth digging into, and its modern usage was born in the early days of the internet.
When the web really started taking off in the 90s, it was very normal for users to build their own websites. It was DIY. But not everyone had the required resources to host them. When it came to making financial transactions almost nobody did. This is why eBay became so successful. It provided a service for other people to use, and took a small cut of the profit. Thus the modern meaning for platform was invented – an online space that exists as a for-profit entity that provides the illusion of a public service. A digital mall, where you’re free to do what you want as long as you follow the rules and can afford the cost of entry.
Now almost the entirety of our online lives is mediated by these platforms. The social ones are the most overtly present in people’s lives today: finely honed attention-monopolising machines. They use gambling mechanics to wire your brain into a Pavlovian feedback loop of addiction, trying to keep you logged in and paying attention for as long as possible. This is in the platform’s interest because it makes them money. And in doing so, a two-way relationship begins between yourself and what you consume. You are what you eat.
Machine learning, fundamentally, is very good at a specific kind of problem solving – categorisation. It doesn’t matter what political affiliation you have, whether your beliefs are strong or weak, or how “objective” you think you are. The only thing that matters is how an algorithm can fit you into a neat little niche and get you to stay there. It’s not that exposure fundamentally changes you at any level… It just sands the corners off your personality. Gently encourages you to fit into your perfect niche.

The insidiousness of this is, no matter what position you take, you can be neatly pigeonholed somewhere for people just like you. Strongly politically aware or completely ambivalent. Right or left, authoritarian or anarchist, cats or dogs. And if there isn’t a category for you, you’ll be slowly nudged into the one that is your closest match.
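As a toy illustration of how crude this pigeonholing can be, consider a recommender that simply files each user under the nearest pre-existing niche. Everything here – the niche names, the feature categories, the numbers – is hypothetical; no platform publishes its real model. But the mechanic of “find the closest box, then put them in it” is the heart of the matter:

```python
# Illustrative sketch, NOT any platform's real code. Users are vectors of
# interaction intensity over hypothetical interest categories, in the
# order [politics, fitness, gaming].

def nearest_niche(user_vector, niches):
    """Return the name of the niche whose centroid is closest to the user."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(niches, key=lambda name: sq_dist(user_vector, niches[name]))

# Hypothetical pre-existing categories (centroids).
niches = {
    "political_junkie": [0.9, 0.1, 0.1],
    "gym_enthusiast":   [0.1, 0.9, 0.2],
    "gamer":            [0.1, 0.2, 0.9],
}

# A user with genuinely mixed interests still gets filed into exactly one box.
user = [0.5, 0.4, 0.6]
print(nearest_niche(user, niches))  # -> "gamer"
```

Once the label is assigned, serving the user more of that niche’s content pulls their behaviour – and hence their vector – toward the centroid. That feedback loop is the “sanding off the corners” described above.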
This is done at a variety of levels. The most obvious is targeted ads – targeting you by location and interest, calculated to reach you at your weakest and most susceptible moments. There’s also your algorithmic feed, showing you whatever it can to hook you in. Whether that be outrage, an amusing dance, a tempting argument, a piece of clickbait. It produces an intoxicating cocktail of dopamine and cortisol designed to make you feel, as much as possible, that your online life is your life. It doesn’t end there either, it seeps out into legacy media and ‘real’ life. News outlets need to rely increasingly on clickbait tactics themselves to succeed. And the ‘culture war’ seems to worsen in proportion with it. Politicians are now way more scared of bad PR than they are of doing the right thing.
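The feed mechanic itself can be sketched in a few lines. This is a deliberately crude stand-in – real ranking models are vastly more complex, and the scores and weights below are invented – but the objective function is the point: when the only thing being optimised is predicted engagement, nothing in the system scores truthfulness or wellbeing, so outrage wins whenever it holds attention better:

```python
# Toy feed ranker (hypothetical posts, scores and weights): order posts
# purely by predicted engagement. Note what is absent: no term for
# accuracy, no term for the reader's wellbeing.

posts = [
    {"title": "Local charity update", "watch_s": 8,  "shares": 1},
    {"title": "Outrageous hot take",  "watch_s": 45, "shares": 30},
    {"title": "Amusing dance clip",   "watch_s": 20, "shares": 12},
]

def engagement_score(post):
    """Crude engagement proxy: predicted time-on-post plus a bonus per share."""
    return post["watch_s"] + 5 * post["shares"]

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in feed])
# -> ['Outrageous hot take', 'Amusing dance clip', 'Local charity update']
```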
And all of this funnels you, ever so gently, along algorithmic fault lines. Surrounded by caricatures of your own opinions, positions, likes, dislikes. And the more you’re exposed to it, the more you are encouraged to express yourself similarly.
Sometimes the fault lines start crossing the family dinner table. And the more isolated you feel, the more you turn to a place of familiarity. A place where you know you’ll have a sympathetic ear.
This isn’t an echo chamber – it’s much more insidious than that. You’re in a cell of an elegantly constructed panopticon. One that feels like the life you chose; it’s just that all the edges have been filed off to make you easier to predict. To better fit you into the standard model for your category. Not even its architects clearly see the floor plan. And as long as you act the way they want you to (locked into the platform, pleasing their adspace and security customers), they don’t care to. In fact, the way these algorithms work, it may never be possible to see the invisible pigeonholes people are filed into.
Beyond the silent process of categorisation and isolation lies an even more insidious force. The intricate mechanics of the digital economy also train you to flatly deny that the cage even exists – even when your face is pressed up against the bars.
I See Six
In George Orwell’s 1984 there is a scene in which Winston, the protagonist of the story, is mercilessly tortured. His inquisitor, O’Brien, is ensuring he has total ideological control over Winston. He holds up four fingers and asks Winston how many he sees. The correct answer is however many “The Party” (in essence the state) tells him to see. Eventually, the pain causes his senses to present him with the illusion of innumerable fingers – and he can only say “in all honesty I don’t know”. When he gives this, the desired answer, he is administered a painkiller that floods his body. He feels instant gratification and love toward his torturer.
It would be facetious to claim that we are exposed to that level of direct manipulation, but this scene illustrates a pattern of domination. It’s reminiscent of the techniques employed in coercive relationships and cult indoctrination. A very common trait of this indoctrination is exploiting and/or fostering a disorganised attachment style. I won’t go into excessive detail, but the basic idea is that you isolate your target and make yourself both the source of traumatic events and the source of comfort. Doing this takes away any sense of stability. Without a clear way to live in a safe and stable environment, the victim has nowhere else to go. And this pairs well with a community that reinforces or normalises the victim’s relationship with you. So the victim submits – resisting is too painful, and life is so much easier if they don’t.
It only takes a small extra logical leap to see how this relationship pans out in the online sphere. The relationship between social media usage and poor mental health outcomes is very widely documented indeed. Whilst it would be unscientific to claim that user relationships work in exactly the same way, a similar pattern can be observed: the chaotic mix of irritants and ‘good’ content that sucks you in bears a strong resemblance.
And though you still talk to your friends and family, the platform becomes the mediator of all these interactions. These websites are perfectly positioned to inject whatever they like into your feed. Because you have to be there anyway, you are given no choice but to look at it. Even if you’re not isolated directly from your social circle by its online counterpart, you are forced into your assigned category by being present on the platform.
So if these services are so bad for us, why don’t we stop using them?
Opting out is hard. Really hard. The presence of your social circle on these services forces you to be there too, and it extends way beyond what would usually be seen as social networks.
You have facebook, or twitter, or snapchat, or tiktok because your friends do, and it’s so much more convenient to keep up to date with them by looking at what they post rather than talk to them. And if you don’t, you’ll feel left out when everyone else is in on the most recent development and you aren’t. You have amazon because it’s so much more convenient than buying from individual stores online. You have netflix or disney+ or appletv because it’s so much more difficult to watch what you want to without them. You have the same online banking app as your friends because it makes transferring money so much easier. You have spotify because why pay for music when you have access to almost anything you’d ever want for free, or as close to free as to be a distinction without a difference. You have whatsapp because, well, how else will you talk to your friends?
But, there are ways around this. You can buy physical copies of media, you can shop local, you can close your social media profile, you can only use SMS or Signal (matrix/briar if you’re really cool), you can use cash everywhere. However if you try to do this you will immediately encounter a lot of friction. If you have ever tried to ask your friends to switch over to Signal you will be acutely aware of this. If you don’t have spotify, how will your friends request music at your next house party? If you don’t have every single streaming service on the market, how will you keep up to date with the most recently trending show? People you know will increasingly react with annoyance if you aren’t using the same services as other ‘normal’ people. And for this reason you are quietly, gently, but forcibly pushed back into the circle of influence of online platforms.
And finally, once you start using one of these services you are entrapped. Your data, your photos, your interactions, your playlists. Everything you value that is created on these platforms becomes stuck there. Yes, you can request your data as an export – but when you get it, what will you do with it? How will you view it? And inevitably it won’t be complete, as half of ‘your data’ might be photos of you on other people’s profiles. And if I want to see your family photos on these services, most of the time I will need a profile myself to view them with any kind of ease.
All these barriers artificially increase the cost of opting out of the system of manipulation.
A computer-generated dream world built to keep us under control
Hopefully I’ve started to allow you to see the bars of this digital prison, even if it goes to great lengths to convince you it doesn’t exist. You may still be unconvinced as to the intentionality of this – surely that’s just market capitalism? Services have to compete. And yes, this is how capitalism works – though ‘true’ market capitalism is apparently anti-monopoly… somehow. But in terms of intentionality, not every service connected to the internet necessarily even knows its part in the system. Platforms like facebook or google certainly do. But the independent web shop you bought from, which uses an analytics service to understand how well it’s operating, likely doesn’t.
The issue is that the privacy policy you signed by using the site is more like an uncontract: a dubiously legal waiver that, in almost every case, assures you of your privacy whilst mentioning ‘trusted third parties’ they share your data with. (Dubiously legal in that it is legal, but the law has difficulty keeping up with tech.) This loophole is an infinite regress of data processors that in essence allows your anonymised usage data to be sold to the highest bidder. That data can be correlated with other bits of scooped-up information and deanonymised. Given access to your every online activity, your geolocation, your movements, card transactions, viewing and listening habits, all the same data from your friends… it wouldn’t actually be that hard to identify you. You just need to be able to process vast oceans of data… something that, say, a search engine provider would be uniquely talented at. facebook only needs around 300 ‘likes’ from you to know you better than your spouse does.
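To make the correlation step concrete, here is a toy linkage attack on invented data. The pseudonyms, places and times are all hypothetical, and real deanonymisation works on far messier data at far greater scale – but the principle is exactly this: strip the names off a dataset and the quasi-identifiers (where you were, when) still join it straight back to a dataset that does have names:

```python
# Toy linkage attack on entirely fabricated data: re-identify an
# "anonymised" visit log by joining it against public check-ins on the
# quasi-identifiers (place, hour).

anonymised_visits = [
    {"pseudonym": "u_4821", "place": "gym",  "hour": 7},
    {"pseudonym": "u_4821", "place": "cafe", "hour": 9},
    {"pseudonym": "u_9354", "place": "park", "hour": 7},
]

# Auxiliary data: check-ins posted publicly, with real names attached.
public_checkins = [
    {"name": "Alice", "place": "gym",  "hour": 7},
    {"name": "Alice", "place": "cafe", "hour": 9},
    {"name": "Bob",   "place": "park", "hour": 7},
]

def reidentify(visits, checkins):
    """Map each pseudonym to the set of names whose traces overlap it."""
    matches = {}
    for v in visits:
        for c in checkins:
            if (v["place"], v["hour"]) == (c["place"], c["hour"]):
                matches.setdefault(v["pseudonym"], set()).add(c["name"])
    # A pseudonym consistent with exactly one name is deanonymised.
    return {p: names.pop() for p, names in matches.items() if len(names) == 1}

print(reidentify(anonymised_visits, public_checkins))
# -> {'u_4821': 'Alice', 'u_9354': 'Bob'}
```

With only three records per person this looks trivial; the unsettling part is that it stays nearly as easy at scale, because a handful of place/time points is usually enough to make a movement trace unique.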
This is how the entire system of digital systems is able to trap you and mould your movements and the way you think. Little micro-nudges, friction, inconvenience. But simultaneously being everything to you, all the time. Inserting an internet connection (a surveillance tool) between you and almost everything you care about. Then weaponising what would once have been military-grade identity profiling technology to push you to make as much money for them as possible.
It’s this implicit buy-in – the click-wrapped agreement to terms of use and privacy policies – that has become so ubiquitous as to be invisible. It’s a flimsy excuse for a binding legal agreement: documentation that averages out to about 250 hours of reading time per user, granting them permission to use your information as they see fit. Users are presented with an environment where they are made to feel like they have control – that it is their space to shape and build – but this isn’t the case. These aren’t digital town squares, but digital shopping malls. A for-profit company owns the platform you’re using, and you’re ultimately at its mercy. Even critical infrastructure feeds into this: the UK’s government website uses google analytics. The data created is shared with third parties and is prime for deanonymisation. Even your bank is likely selling your data.
This primes you for influence by an unknowable cabal of disparate forces, all driven to make the most of private information and extract as much value from you as possible.
The upshot is not a specific outcome – as mentioned, these companies only really care about their bottom line. Meta did not set out to aid the Myanmar genocide, but the engagement that fuelled it generated the most profit. It is not the goal of social media websites to promote the rise of fascism and white supremacy, but it makes money. Whilst it isn’t the only place the pattern is visible, it is easiest to see in the rise of the alt-right and stochastic terror attacks. And in being so visible, it also, as implied earlier, most clearly mirrors the recruiting strategies of cults.
It builds a kind of self-reinforcing authoritarianism.
Stochastic Totalism
Stochastic (as in a result of random probability) terror is a kind of randomised violence where a group of people buy into and help spread a violent narrative. The message is spread multilaterally – at any level of media and at any level of intent. However the result is that the ambient levels of anger toward a specific group of people eventually bleed out into real life, often in deadly fashion. These groups will choose authority figures who may not even recognise themselves as such, but who are nonetheless held up as figureheads. It doesn’t matter if they’re right. It doesn’t matter if they’re even on the same side. It only matters that they pass the vibe check and embody a narrative. Ian Danskin has identified this behaviour as stochastic totalism – and I think it’s an incredibly important observation. But I don’t think that framing quite captures the entire picture.
We are all being pulled into our own version of this. Maybe not quite so violent. Maybe not quite so obviously vitriolic. But the centralising, categorising, oversimplifying nature of algorithms designed to maximise profit means we are all shunted into prefigured thought groups. We are all encouraged to find our own thought leaders and authority figures. And we’re given our own virtual neighbourhoods of people who agree likewise – ones we self-select because we are guided to do so.
And because these stochastically-generated affinity groups autonomously grow so big, they can be weaponised to brigade mainstream media. They become forces powerful enough to force politicians into capitulating and conciliatory roles. Government leaders become stewards for their preferred flavour of the status quo rather than having ideological visions and firm plans. It mirrors the capitalistic intent to siphon as much profit from as little material as possible. Never creating anything, just grinding existing assets down until there’s nothing left.
Depressing, isn’t it? But that’s the point.
Doomerism
Since the fall of the Soviet Union and the neoliberal declaration that history is over, we’ve been living in an infinite now. Fukuyama claimed that representative democracy is in its final form. We did it, we solved government. And in the decades since 1991 we’ve seen exactly how stable and effective this ‘true form of government’ is.
This attitude – that we have reached a point where the tools with which we wield state power are perfect – lends itself to a focus on plugging holes instead of long-term solutions. Market capitalism, likewise, is seen as a natural good; the only reason it could ever go wrong is crony capitalism. The narrative insists that this is as good as it gets – a viewpoint that corporatism thrives on. Whether it’s true or not, the spectre of ‘if we only did capitalism properly everything would be ok’ is an excuse that insists humanity has finally worked it out, that all remaining problems are individual ones. Indoctrinating this mindset allows capitalist interests to hide themselves as targets. It is in their interest that we don’t question the dominant neoliberal narrative.
Visions of a dystopian, corporatist future are partially popular because they don’t challenge us with change. Being resigned to our own inevitable demise means that we don’t have to worry about fixing it. We know there is no future, so we don’t have to worry about it.
We know that solving climate change is too big, that a few recycled or reusable plastic bags is a drop in the bucket. Yet we are told that it is all of our individual responsibilities to reduce our ‘carbon footprint’ (a term invented by the oil company BP). It’s too much responsibility to be held to – we know that, individually, we cannot do this. And we are kept overworked in order to reduce the amount of energy we have to even think about it. We’re all just trying to survive, day by day. And this scales upward – most corporations are just trying to cover their bottom lines and trying to ensure they have enough money to return to investors. Even the government will hold off on investing in solutions that will come to fruition in more than a few years for fear that they won’t be in office by the time the results of trying to repair or improve systems pay off.
This is relevant because with the advent of social media there is no way to escape the news cycle entirely – it gets pushed into users’ faces, often without warning and at unexpected times. News apps are built into phones to ensure you are shown the right thing at exactly the right time to bait you back into the doom cycle. Just enough. Online arguments exhaust people and push them away from politics, and this view of society encourages us to stay in our respective lanes.
Pursuit of profit at the expense of everything else has led to some of the most precarious working arrangements ever conceived, with the rise of the gig worker. And gig workers are now fully integrated into our society. Deliveroo, uber, better help, even a lot of AI work is actually done by precariously employed individuals with no contract and no job stability. The work of creating entertainment is also slowly being turned into gig work, with youtube and spotify putting pressures on musicians, documentary makers and video essayists to crank out material on a regular basis or face harsh penalties in their distribution. Even then the relationship to stochastic totalism means that if they fall afoul of the algorithm they may suddenly lose security anyway. The pressure to follow the curve of totalistic demands means that you have to appeal to the sect you’ve been assigned to – without even necessarily knowing who they are.
These pressures lead to a population that is burned out, exhausted, and disproportionately concerned with present conditions over long term plans. It’s survival. It’s a state of hopelessness that constrains our ability to collaborate – it is designed to disempower us.
At each level of scale it seems that the production demands of stochastic totalism (hand in hand with market capitalism) drive this bend to the middle that crushes creative thought and independent thinking. Sequels, remakes, spinoffs: the creative industry is being crushed under the weight of intellectual property franchises. Market dominance by monolithic rights holders and guaranteed sales returns starve out the drive to pursue original ideas. Major music labels now only really invest in guaranteed returns – not in developing potential.
It’s a pattern that bears concerning similarity to the behaviour of ‘stochastic leaders’, people like Jordan Peterson, Ben Shapiro, Russell Brand, Alex Jones. Donald Trump. Self-selecting feedback loops that are based on nothing but profit maximisation and consolidation of power.
Capitalism is a death cult, and I personally don’t believe any level of regulation can ‘fix’ this phenomenon. To the point that it extols its own narrative; it is easier to imagine the end of the world than the death of capitalism. The entire mechanism is designed to make you feel disempowered and helpless. The ‘doomer’ mindset is its primary tool of self defence.
You can’t correct for hate
The problem with taking a reformist approach to issues like these is that you inevitably end up in a position equivalent to the paradox of tolerance. Free market capitalism and resisting monopolies run counter to one another, and monopolies are a natural result of trying to achieve market dominance. When selfishness is rewarded so handsomely by the design of a system, I can’t help but see this as poisoning the well in the same way that granting the truly intolerant equal ground with the tolerant does. The double movement theory places individualism and socialism on two sides of a scale. But when one of those forces (intense free-market capitalism) is able to crush the majority of the population into poverty, whilst the other is looking for equity, it is clear that it is incompatible with continued existence.
A similar approach is being taken to automated moderation systems, content selection algorithms, AI chatbots, and so on – essentially the systems that make up the backbone of the digital (‘platform’) economy. A common view is that these programs are in their infancy and can be corrected and improved incrementally – a kind of AI reformism. Bad, negative, or unhelpful patterns can be weeded out. In some cases this can be true: if the problem is very simple, there is a good chance you can use machine learning to solve it fairly reliably. Tasks like efficiency calculations for walking systems, shortest paths, and other evolutionary-algorithm problems have a fixed set of inputs and an output that can be easily assessed for success. Facial and gestural recognition has also been deployed fairly widely, and even though this is still a fairly simple task compared to others attempted today, these systems are routinely less capable of correctly identifying dark-skinned faces than light-skinned ones. In automated health care assessment systems powered by AI, similarly, those from more diverse backgrounds are routinely marked as at higher health risk than those from a white background. The same can be said of the automated benefit fraud prevention tool used by the UK government, which cut the benefits of some of the UK’s most vulnerable people because they fit a pattern the AI model saw.
The problem is that even those building these systems can’t actually explain how they work – they are built by pumping loads of data into a training system. The AI system then makes inferences after being shown huge numbers of examples and asked to perform tasks based on them. But because the system doesn’t ‘see’ or ‘think’ like a person does, there is no telling exactly what it picks up from the data given. And the data used to train these tools comes from a world that is saturated with things like structural racism, sexism, any kind of bias you could name. They aren’t just in the data – they are the data. As a result, data from an imperfect society will train an AI to do things imperfectly. This becomes a problem when these systems are presented as science. Presenting these systems as ‘using science’ gives them a veneer of a ‘view from nowhere’. If it’s science, then it must be logical, unbiased, cold and calculating. The screen of science is used to protect a system that does nothing more than amplify existing biases in society.
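The ‘bias in, bias out’ mechanism can be shown with a deliberately tiny example. All the numbers here are invented, and a real model would be far more complicated than a table of base rates – but the failure mode is the same one documented in real audit and fraud systems: if one group is investigated far more often, far more of its wrongdoing ends up on record, and a model trained naively on those records ‘learns’ that the group is riskier:

```python
# Toy sketch of "bias in, bias out" (all numbers hypothetical). Both
# groups have the SAME underlying fraud rate among those audited (10%),
# but group B was historically audited 20x more often, so far more of
# its fraud is on record.

population = 1000                  # people per group
audits = {"A": 20, "B": 400}       # historical audit counts per group
fraud_found = {"A": 2, "B": 40}    # exactly 10% of audits, in both groups

# A naive model trained on raw records learns "recorded fraud per person":
naive_risk = {g: fraud_found[g] / population for g in audits}
print(naive_risk)  # -> {'A': 0.002, 'B': 0.04}: group B looks 20x riskier

# Correcting for audit intensity reveals the groups are identical:
true_rate = {g: fraud_found[g] / audits[g] for g in audits}
print(true_rate)   # -> {'A': 0.1, 'B': 0.1}
```

Nothing about group B’s behaviour differs; only the way the training data was collected does. Yet the naive model, presented as ‘objective’, will now direct even more audits at group B – deepening the very skew it was trained on.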
Just like with the double movement of capitalism, it seems as if the thinking is faulty. You can’t train neutrality out of bias. The systems being made to automate your work, your healthcare, your policing, your newsfeed, they are all predicated on nothing more than making money. And not only that but they are doing it by stratifying society into sectors suggested to it by the pre-existing stratification of our world. A real solution needs you to unplug. A real solution to these problems requires people to step out of ‘the matrix’, to have diverse conversations between peers and affinity groups. To break out of this system requires a voluntary step back from this kind of technology and towards one another.
Outside the Asylum
One image that keeps coming back to me when I discuss this stuff is that of the ‘Outside of the Asylum’ in Douglas Adams’ book So Long, and Thanks for All the Fish. In it, a character called Wonko the Sane builds an inverted asylum. It houses the entire universe ‘inside’ the asylum and reserves a small square patch of garden ‘outside’. This is to get away from the madness of a world that feels like it needs detailed instructions on the use of toothpicks.
Explaining all that I have just explained to you here feels exactly like that. It’s absolutely crazy. It’s massive. It almost feels entirely inevitable.
But I can assure you that it’s not.
Unfortunately unplugging from this ‘matrix’ isn’t going to be as easy as taking a red pill. Ultimately it will be different for every person. And the way these systems work is, as mentioned, specifically designed to be hard to resist. Just like preventing climate change, there is no one action we can take to stop it individually. (Although there are probably some individuals who could but absolutely won’t.) Instead what it requires is understanding, consciousness, and consistent effort. If you can easily stop using a social network, do so. Or find one a little less rubbish. Try to minimise the number of app-mediated services that run your life. Start using alternative youtube frontends and alternative clients. Pay with cash or your bank card, rather than your phone. Use a privacy-respecting phone, divest from FAANG service providers. Physically meet up with other people in real spaces. Get involved in your communities. Dan McQuillan suggests People’s Councils – specifically focussed on AI but also generally worth embracing. Specialised niche interest groups of people affected by a specific issue. Learn how to run your own tech solutions, or find people interested in helping you do so. The only reason these platforms have so much power is ease of use. If we could all accept a little more friction we might stand a chance of collectively unplugging.
And once we do that, we may find ourselves in a world where we once again feel more truly free than before.
A lot of research went into this article, but I’d like to specifically reference the work of Dan McQuillan, Ben Tarnoff, Shoshana Zuboff, and Ian Danskin as primary sources.