Carole: Welcome to our beautiful country. I want to tell you a story about why I arrived here a bit late, and that was because I made the error of thinking I'm going to be decadent, I'm going to order an Uber. Now you can see where this is going. It eventually gave me one and then it said it was going to be 25 minutes and I was like “no I can't”, but then – I went to the page to cancel it but Uber have removed the cancel button from the main page! I thought, the irony of ordering an Uber to come to a Cory Doctorow event and then having my journey totally enshittified, so much that I almost miss it – that feels like a parable for our times…
Cory: First let me say: take an Uber if you need to take an Uber, right? Consumerism is not how you change society. It might make your life a little better at the margins, but you know, you're not going to shop your way out of a monopoly anyway. You're not going to recycle your way out of the climate emergency. Make the consumption choices that make sense for you, but if you find yourself spending more than 60 seconds on it, say: "I've done enough of this. I'm going to go work for an activist group or a political party or do something that actually has a chance of making a real difference and not just being a consumption choice." So take an Uber if you want to take an Uber. But you're right, that's a very shitty experience.
And what's interesting about it is, of course, Uber used to be pretty good, right? That's how they built up. It wasn't that Uber perfected a mind control ray that made us all decide that we would use them instead of minicabs or whatever. They were good. You went on your phone and you tapped a thing and you could see the car move and it arrived and it was cheap.
It was cheap because … they lost 40 cents on every dollar that they brought in. They provoked a lost decade in transit investment all over the world. They drove out all the livery cabs and minicabs. They prevented new market entry and where they couldn't they just bought those new market entrants in defiance of the black-letter of competition law because we don't enforce that any more.
And then once they had captured the market, they did what platforms do when they stop being helpers and start being gatekeepers. They squeezed the drivers, who now get paid half as much. They doubled the fares, and then they stopped caring about whether you had a horrible ride. They moved the cancel button.
It turns out that when you withdraw discipline from firms, when they don't face consequence for being bad to you, and when you reward them for doing bad things, bad things occur. This is a thing that is obvious to everyone except for the policymakers, who deliberately set out to take on board policies that they were warned at the time would lead to this, did it anyway, and created the enshittogenic policy environment that gave rise to the enshittocene.
Carole: Can you just give a capsule description of how you see the enshittification parabola?
Cory: So enshittification – it's got two parts, right? One is a kind of observation about the course of platform decay and the other is a theory about why. And I've just expounded a little on that theory, about the withdrawal of discipline. But the course of platform decay is very characteristic, and it is characteristic of what you would do if you were a middleman, an intermediary. That's not bad in and of itself – most of us can't do everything, and even if we can, we've got better things to do. We want to have people do bits of it for us. You know, maybe you are so Linux that you can gnaw a web server out of a whole log with your own teeth. But most of us don't want to do that, right? So you want to get this stuff from other people, your payment processing and so on.
So it's what happens when these helpers become gatekeepers, become so powerful that they usurp the relationship they're supposed to be facilitating. And it follows a characteristic pattern. I like to use Facebook to describe it partly because Mark Zuckerberg's a creep, but also because they're really patient zero and they're the perfect prototype case.
So at the start of enshittification, step one, a platform is good to end users and finds a way to lock them in. And with Facebook, they open up to the general public and they say: "Look, I know all of you have an account on MySpace, but I'm not sure if you know this. MySpace, it's owned by an evil, crapulent, senescent, immortal vampire Australian billionaire named Rupert Murdoch who spies on you with every hour that God sends. And nobody should use social media owned by a billionaire who spies on you, right? So come use Facebook. We'll never spy on you. We'll just show you the things you asked to see."
And so we pile in and then we lock in. And the way that we lock in on social media, it's different from other kinds of platforms. It's not like Uber cornering the market and then putting all the other cab companies out of business. On social media, we do it through something called the collective action problem. That's an economist term, but you know it as your six mates in your group chat are your best friends and you love them dearly, but they're a giant pain in the ass and you can't even agree on what board game you're going to play this weekend, much less when it's time to leave Facebook.
And you multiply that by the fact that one of them is there because that's where the people who have the same rare disease are. And another one is there because they immigrated and that's where the people they left behind hang out. And another one is there because that's where they meet their customers or their audience. And someone's there because, you know, their kid's in a football league and they organise the carpool with the other parents there. And so you don't just have to convince them to leave, they have to convince everyone else to leave. And so you get locked in. That's stage one. Good to the end users, lock them in.
Stage two, now that the users are locked in, you can turn the screws on them in order to tempt in business customers. So they go to the advertisers and they say: "Do you remember we told these numpties we weren't going to spy on them? Obviously a lie. We spy on them from asshole to appetite. Give us a remarkably small amount of money and we will target ads to them with exquisite fidelity. And because we are so dedicated to our craft, we filled a building full of engineers who are going to do nothing but fight ad fraud. You give us a pound to show a user an ad, we will stuff that ad in that user's face and you'll know it'll arrive, right?"
And then they go to the publishers and they say: "You remember we told them we'd only show them things they asked to see? Lying about that, obviously. Put excerpts from your website on Facebook. Add a link back to your own site and we'll just, like, cram it into the eyeballs of people who never asked to see it. And some of them will click your link and that's your free traffic funnel.”
So the advertisers, the publishers, they pile in and they get locked in, too. Because it turns out that it's much easier to lock in a supplier than it is to lock in a customer, right? If there's five coffee shops on your block and one of them goes out of business, well, that's sad, but you've got four other coffee shops. But if you're one of those coffee shops and the building next door is 80% of your business because it's an office block and they go out of business and your receipts drop by 80% overnight, that's it, right? You can't make the payment on the loans for the espresso machine. You can't make payroll, you better hope the landlord's going to give you rent relief.
So monopsony, which is the corollary of monopoly – it's the one we don't know about because no one ever made a board game that you're forced to play with your family until you want to kill them and yourself about it – monopsony is much more durable and easily attained than monopoly. So they lock in the business customers, too.
And then they turn the screws on them, right? It's not a conspiracy between Facebook and advertisers to screw us. It's a conspiracy among Facebook and its investors to screw everyone. And so ad targeting fidelity goes way down, ad prices go way up, ad fraud explodes. Procter & Gamble in 2017 zeroed out a $200m per year surveillance advertising budget – they call it "programmatic advertising". Their sales dropped by 0%. Because, to a first approximation, no one saw the ads.
Publishers, meanwhile, find that an excerpt is no longer enough to reach even their own subscribers. You've got to put up the whole article, and woe betide the publisher that adds a link, because that might be a malicious link and no one's going to see it. So it's not a traffic funnel any more. It's a substitute for your own website, and the only way you can monetise it is with that shitty ad market.
So now we're in the final stage of enshittification. This brittle equilibrium where all the value has been withdrawn except for the least that's needed to keep us locked to it. So we load our feed, and what we see when we go there is the kind of homeopathic residue of the things we've asked to see. And then the void is filled with stuff that people are paying to show us, but they're all being robbed for billions of dollars.
And then this is the equilibrium they're hoping for, right? All the value is transferred to shareholders and executives, but it's brittle because the difference between "I hate Facebook, but I can't seem to stop using it" and "I hate Facebook and I'm never going to use it again" – all it takes is like one giant privacy scandal and people bolt for the exits and then shareholders get worried that maybe this is the end of the ride. The share price takes a big dip and then the people who run the platform, they panic, but being technical people they have a technical term for this. They call it pivoting.
And so it is that one day Mark Zuckerberg arises from his sarcophagus and he says: "Hearken unto me, brothers and sisters, for I've had a vision. I know I told you that your future would consist of arguing with your most racist uncle using the primitive text interface that I invented in my dorm room so that I could non-consensually rate the fuckability of Harvard undergraduates. However, it turns out that the true future lies in me turning you and everyone you love into a legless, sexless, low-polygon, heavily surveilled cartoon character so that I can imprison you in a virtual world based on a 30-year-old satirical cyberpunk novel that I call the Metaverse."
And that's it. End stage enshittification. A pile of shit.
Carole: You've come up with a very rational critique of what's happening, but at the same time it's a big decision. A lot of your career has been spent expounding the reasons why we need to break up these monopolies and the downstream effects if we don't. You're now living in America. You are witnessing in real time the Trump administration and also the alliance that we're seeing between him and these tech companies. Do you have an emotional reaction to what is happening?
Cory: I mean, I thought I couldn't be any angrier. It turns out I was wrong. But also, you know, I'm an activist, which means that when I'm angry and frustrated I try to figure out what to do next. And I do think that Trump is weirdly building the case for what we should do next. He is the strange silver lining to his own dark cloud.
So for 40 years, the US trade representatives have made it their top priority to convince every policymaker in the world to adopt a law called “anti-circumvention” law. And this is a law that prohibits people from modifying the technology they use, even if it's in a device that they own themselves and even if it's for a lawful purpose. So you can't, for example, reverse engineer a phone to put privacy blockers in it. You can do it with a browser because there's no reverse engineering involved, but you can't do it with a phone because of anti-circumvention law.
It came into Europe through Article 6 of the Copyright Directive in 2001. Canada did it in 2012 with the Copyright Modernization Act. Everywhere around the world, we have a law like this. It protects the monopoly rents of tech firms. It's what stops you from adding privacy tools. It's what stops you from adding a third party app store. It's what stops you from plugging something into Amazon that, when you search for something, it tells you whether there's a local merchant that has it instead of Amazon. Right? That reverse engineering is per se unlawful because of these bad decisions we took.
So you might ask yourself, why would a country voluntarily sign up to put their own tech sector in chains and allow an American tech firm to extract these privacy and cash rents from their own population? Well, it's because the US trade representative said: "If you don't do this, we're going to hit you with tariffs."
Well, you know, happy Liberation Day, right? The case for tariffs is pretty much dead. If someone says "do as I tell you, or I'm going to burn your house down," and you do it and they burn your house down, you are just a sucker if you keep doing it. So there is a large constituency building and it can be anywhere in the world that eventually breaks and says we're going to raid the multibillion-dollar, centibillion-dollar margins that individual firms take.
You know, Apple alone takes $100bn a year out of the fees it charges for app payments. Every time you give a pound to a media outlet for their news, 30p gets lodged in Cupertino, never to emerge except to briefly make an appearance somewhere over the Irish Sea so Apple can pretend that it's tax-free, right? The only thing that stops us from making a different app store – from, like, going into the big Tesco and there being a dongle in the checkout aisle that you plug into your phone to install a different app store – is our decision not to allow circumvention.
The country that allows circumvention – they get to be like Finland and Nokia except they don't have to pay to make the phones. They just get to cream off the service revenue with none of the capital expenditure associated with phones. As Jeff Bezos said when he started the company: "Your margin is my opportunity."
So I look at this and I think there's probably some billionaire-on-trillionaire violence to be ginned up here, right? From people who want to, you know, invest in potentially very profitable businesses whose success factor isn't determined by how many TrumpCoin you buy. You need to jailbreak stuff. You need to make it legal to reverse engineer things.
So you combine national security hawks with privacy advocates, consumer advocates, labour advocates, with people who would like to do some industrial policy and liberate some consumer surplus. I think you get an unbeatable coalition. And we just need one country to break and then everyone else gets to buy their products, which I think is our way out of this.
Carole: So yesterday Peter Thiel sold his entire $600m stake in Nvidia, the chip manufacturer right there at the centre of the AI bubble, a company supposedly worth $3 trillion. Did you take that as a bit of a sign?
Cory: I think that's quite a sign. But then I want to talk about the material consequences beyond the economic ones. So, you know, AI can't do your job, but your boss is endlessly horny for firing you and replacing you with a chatbot, which means that when the salesman comes and tries to convince them to fire you and replace you with the AI, he will happily do so even though the AI can't do your job.
So we're taking a bunch of people who do useful things and we're replacing them with chatbots that can't do those things. And then asking the survivors of the great redundancies to be AI babysitters who try and nurse the whole thing along. Sometimes the AI isn't even an AI. I'm on a list from the, like, 1990s for Indian techies that I joined. I don't even remember how I got on it, but I've been on it for so long that I've got all these friends across the ocean. And their joke is that AI stands for either Absent Indians or ChatGPT stands for Gujarati People Typing because often AI is just people in low-waged environments pretending to be chatbots, pretending to be robots – or, you know, there's some automation but they're stepping in because the automation fails and fails and fails.
You may remember Amazon had these shops, the "just walk out" shops where you would walk in and fill your basket and just walk out and the AI cameras would do it. There wasn't – there weren't AI cameras. Each camera was being watched by three people in an Indian call centre trying to figure out among them which thing you just put in your basket. You know, “fake it until you make it” and all, but this is literally like three pretend AIs in a trenchcoat.
The only thing that is worse than firing people who do an important job and replacing them with a chatbot that doesn't do their job very well is then having all those chatbots switched off. And it's not like you can find those people again. They've become discouraged. They've retrained. They've moved to another industry. They've retired. And so now you just have nothing, right? So it's not just that we're going to have this AI apocalypse. It's that AI is like the asbestos we're shovelling into our walls, and our descendants are going to be digging it out for generations.
Carole: Just to put it into layman's terms – this AI apocalypse which you're talking about, a financial meltdown, the collapse of the US economy, is that going to be bad?
Cory: Yeah, it's going to be very bad. It's going to be very, very bad. There is like a kind of silver lining-ish. So, you know, there's two kinds of bubbles, right? There's the bubble that just leaves nothing behind and there's the bubble that leaves behind a productive residue. All bubbles are bad, let's be clear, right? The way that you make a bubble is you get a bunch of insiders who gin up interest in a thing that isn't real or can't perform to the way that they claim it will. And then they lure in just normal investors who want to make sure that once they're pensioners, they don't freeze to death and they give them all of their money and then they steal the money. That's what a bubble is, right? We think of it as like a mania, foundationally. That's what a bubble is – stealing money from people who don't want to freeze or starve to death when they're old.
Some bubbles, though, leave behind salvage. So if you're of a certain age and you paid attention to American economics, you remember two companies that were at the centre of gigantic frauds. One was called Enron and it left behind nothing. They were pretend energy traders. The other was called WorldCom. They pretended that there was a lot of demand for fibre and they put a lot of fibre in the ground that no one wanted – and they stole a lot of money from normies and it was a fraud and, you know, that guy can roast in hell – but the fibre is still in the ground, right?
I live just outside of Los Angeles in a city called Burbank and I have 5Gb symmetrical fibre [broadband] from, of all companies, AT&T, and the way that I got it is they bought some old WorldCom fibre and turned it on. Right? So that asset is still there. It's a productive asset.
So let's look at the two bubbles that we're going through right now. We have cryptocurrency, right? And all cryptocurrency is going to leave behind is like shitty Austrian economics and worse monkey JPEGs. And then we have AI. And AI is going to leave behind some stuff, right? AI is going to leave behind a bunch of GPUs at 10p on the pound, right? It's going to leave behind a bunch of applied statisticians who are suddenly looking for work. And it's going to leave behind a bunch of open source models that these companies have made as loss leaders and calling cards that have barely been optimised and that still have so much room for improvement.
So it's already the case that these open source models can do a lot of impressive things. They're pretty cool and we'll probably make them do even more impressive things on commodity hardware at the margin. You know, if there wasn't a bubble around AI, right – if we were still where we were, like, 20 years ago with DeepMind – we would just call most of these things plugins and we'd go: "Oh, yeah, that's cool. That video editing plugin, that image editing plugin, that text generation or summary plugin, that transcription plugin, that's good. Doesn't work all the time. What plugin works every time, right? It's just fine."
It's only because they said: "Well, first of all, we need a trillion dollars for it. Second of all, we're not sure, but we think we might be making God, and if we're lucky, it won't turn us all into paperclips”, right? And this is a genuinely – speaking as a science fiction writer – a really stupid idea, right? The idea that if you keep teaching the word-guessing program more words, it will wake up. It's like the idea that if you keep breeding horses to run faster and faster, one of them will give birth to a locomotive.
Humans aren't word-guessing programs that know more words than the other word-guessing programs. So this ridiculous notion and all of the associated foofaraw has turned what would otherwise be a series of sensible utilities that we would make some progress with over the years to come into this thing that actually threatens the planet – not because we're making God but because we're destroying the economy.
Audience member: Could you expand a little bit on the question of corporate capture of the state regulatory bodies?
Cory: So, you know, it can sound very abstract, regulatory capture. Really, it arises out of firms that become chummy and who are able to come to an accord about what they think the policy should be – especially when they have lots of money and they can turn their policy preferences into money.
In America, firms spy on us, right? And they spy on us in ways that are very bad for us. So if you're a nurse, preferentially, hospitals want to hire you as a contractor through an app because that way you're not in a union. It used to be through a staffing agency. Now there's four national apps. They all bill themselves as “Uber for nursing”. And because we don't have a privacy law … every other form of privacy violation is lawful.
So they can ask the unregulated data brokers that have these deep dossiers on every person how much credit card debt that nurse is carrying. And the more credit card debt they're holding, and the more overdue it is, the lower the wage they're offered. That's regulatory capture. People who make money by charging a desperation premium to nurses are among the people who help lobby to make sure we don't get a privacy law.
Here in Britain, we had an incredibly effective competition regulator. The Competition and Markets Authority came into its own over the last five years. They did more work than we've seen in maybe 40. It was an incredible moment. And so, in January, Keir Starmer fired the head of the agency, Marcus Bokkerink. Very effective fellow. And they replaced him with a guy called Doug Gurr. If that name rings a bell, it's because he used to run Amazon UK and he's now regulating Amazon UK.
If that sounds bad, though, Ireland is where all the tech companies pretend they're headquartered because it's a tax haven. And so they have this data commissioner, the first line of defence against privacy violations in Europe. If you want to bring a case for a GDPR violation, it starts in Dublin and it can spend eight or 10 years there before it finds its way back to the courts. The data commissioner is in charge of that and has been very ineffectual. It's about to get more ineffectual, because they just brought in someone who used to be a senior Meta executive.
Audience member: I don't know if you've read David Chalmers, but I was wondering where you stand on the question of whether or not it's possible we're living in a simulation.
Cory: I don't think we're living in a simulation. Again, like this is the weird advantage of being a science fiction writer is you can tell science fiction from, you know, like a thought experiment that's quite fun or a thing that might be true. You know, William Gibson said cyberpunk was a warning, not a suggestion.
So yeah, I don't think we're living in a simulation. I don't think the word guessing program is going to become intelligent. Those things are super fun to write stories about. I have written stories about them. They're great, but they're not things that are going to happen. They're thought experiments, right? That's like, do you think we'll ever see the back of Plato's cave? You are misunderstanding the nature of Plato's cave!
This is an edited version of a conversation that took place on 18 November 2025 at the Frontline Club
