The Opt-Out Trap
On stealing politely, who bears the burden of being robbed, and the silent album nobody asked for
There is a particular form of institutional courage that consists of proposing something outrageous and then, when the outrage arrives on schedule, retreating while describing the retreat as listening. The UK government spent most of 2025 proposing that AI companies should be allowed to train their models on copyrighted creative works — songs, books, photographs, films — unless the rights holders actively opted out.
This is called, in the language of policy consultation, a 'text and data mining exception.' In the language of everyone affected by it, it is called plagiarism with paperwork.
On 18 March 2026, Technology Secretary Liz Kendall confirmed that the opt-out proposal is no longer the government's preferred approach. 'We have listened,' she said. This is the sentence governments say when they have been forced to stop doing the thing they wanted to do. It is not the same as having changed their minds. No preferred option now means the field remains open, which means the fight continues, which means the artists who spent 2025 making the case that their work is not a free training dataset for billion-dollar technology companies will need to keep making it.
I want to dwell on what the opt-out proposal actually was, because it was so elegantly designed to seem reasonable that the reasonableness itself was worth examining.
The proposed system worked like this. AI companies could use any creative work to train their models. If you were a musician, songwriter, author, or photographer and you did not want your work used, you had to say so, in a machine-readable format, on each individual piece of work, in a way that AI crawlers could detect and honour. The burden of protection sat with the creator. The default was use. The default was always use.
Consider what this would have required in practice. There are approximately 170 million tracks on Spotify. Every one of them would need to be individually tagged, in a technically precise format, by rights holders with varying degrees of technical literacy, legal resources, and time. The independent artist with three albums and a day job would need to locate every uploaded version of their work across every platform and manually assert their right not to be scraped. The AI company would need to do nothing except wait, and use whatever wasn't explicitly protected, which would be most of it.
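The consultation never settled on a single technical standard, but to make the burden concrete: the closest existing mechanisms are voluntary crawler directives. A sketch of what a site-wide opt-out looks like under one real convention, the robots.txt exclusion protocol, using crawler names that actually exist (OpenAI's GPTBot, Google's Google-Extended token for AI training):

```text
# robots.txt — a site-wide training opt-out,
# effective only if the crawler chooses to honour it
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Note what this does not do. It is per-site, not per-work; it says nothing about copies of the work hosted on platforms the creator does not control; and it binds only crawlers that volunteer to be bound. The 170-million-track problem is precisely that no mechanism like this scales down to the individual recording.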
More than 1,000 musicians — Dua Lipa, Kate Bush, Elton John, Paul McCartney, Thom Yorke, ABBA's Björn Ulvaeus — released a silent album in protest.(1) The tracklist spelled out the message: 'The British government must not legalise music theft to benefit AI companies.' Of the more than 10,000 people who responded to the government's consultation, 3% backed the opt-out proposal. The remaining 97% were distributed across various positions, most of them considerably less accommodating.
What strikes me about the silent album is the formal precision of it. A thousand musicians, making silence, to protest the proposal that their music should function as raw material unless they spoke up to prevent it. The irony has the shape of a very good argument. You want us to opt out of being used without consent? Here is our opt-out. Here is what music sounds like when musicians decide to make none of it. I love it.
The philosophical problem with opt-out as a default is not complicated, though it was made to seem so. Copyright exists, in its original conception, as a right that attaches to a work at the moment of its creation. You make something and you own it. You decide how it is used. The opt-out system inverted this: someone else decided your work was available, and you could reclaim that decision if you were organised enough, resourced enough, and technically literate enough to execute the reclamation.
This is the structure of a burglary prevention policy that requires homeowners to apply for locks. The locks are available. Nobody is forcing you to leave your house open. If you object to being robbed, there is a form to fill out in triplicate.
The specific harm to independent artists was not incidental to this design. It was the design. Large rights holders — major labels, major publishers — have legal teams, metadata infrastructure, and technical staff capable of implementing a machine-readable opt-out at scale. An independent musician releasing music through a distributor while working two other jobs does not. The policy was described as protecting all creators equally. It would have protected the creators with the most resources to exercise the protection, and left everyone else in the training dataset.(2)
The broader context makes this sharper. The AI music generation systems that would benefit from the training data exception were themselves built on music made by human artists, without their consent and without compensation. This is already the situation. The music that trained Suno, Udio, and every other text-to-music model was taken. The opt-out proposal would have legalised the continued taking, going forward, unless each individual creator took specific technical steps to prevent it.
Australia, in the same period, declined to introduce an equivalent copyright exception and is exploring practical licensing frameworks instead. APRA AMCOS, which represents almost 400,000 songwriters, composers, and music publishers across Australia and New Zealand, and Canada's SOCAN issued a joint statement in March 2026 arguing that 'the framework we build now will determine whether cultural and economic value flows back to creators, or is concentrated among a handful of global tech platforms.'
The statement includes a dimension that the UK debate largely left out: Indigenous Cultural Intellectual Property. 'Ensuring the songs, stories, languages and knowledge of First Nations peoples are respected and not harvested by AI without consent' is not a marginal rider on the main argument.(3) It is a specific and documented problem: AI training datasets built from publicly accessible audio include ceremonial recordings, language materials, and cultural content that carries obligations and restrictions the training process does not recognise and cannot honour. The opt-out model does not address this. Consent-based licensing, or the simple maintenance of existing copyright protections, does.
The UK government's reversal is a win. It is also the beginning of the next round, because 'no preferred option' means the decision has not been made, and the AI companies lobbying for the cheapest possible access to training data have not stopped lobbying. The creative industries have demonstrated that coordinated pressure can move a government. That demonstration will need to be repeated.
What is worth holding onto, in the space between the win and the next fight, is what the silent album said. A thousand musicians making silence is a thousand musicians asserting that music is made by people who choose to make it, under conditions they have some control over, for reasons that are theirs. The alternative — music as an undifferentiated resource to be extracted and processed by whoever has the processing infrastructure — is not an improvement on the current arrangement. It is the current arrangement, accelerated.
The opt-out proposal failed because enough people said so loudly enough that a government already under pressure from other directions decided the political cost of proceeding outweighed the political cost of retreat. That is how these things get stopped. Not through the rightness of the argument, though the argument was right. Through the sustained inconvenience of people who refused to accept that their work was available by default.
They were correct to refuse. The work is not available by default. It was made by specific people who made specific decisions, and it belongs to them until they decide otherwise.
The silent album made that point with the kind of formal elegance that good arguments sometimes achieve. A thousand musicians deciding not to make music, as an argument for their right to decide whether to make music. The AI companies wanted their output. Instead they got the gap where the output would have been.
It turns out the gap is also a form of music. It just wasn't the one anyone was trying to steal.
Notes
(1) DJ Mag / RouteNote — UK government drops AI copyright opt-out plan, March 2026. https://routenote.com/blog/uk-government-drops-ai-copyright-opt-out-plan-after-backlash/
(2) UK Music — What the government's proposed changes to AI copyright mean for the music industry. https://www.ukmusic.org/news/what-will-the-governments-proposed-changes-to-the-rules-on-copyright-and-artificial-intelligence-mean-for-the-uk-music-industry/
(3) APRA AMCOS / SOCAN joint statement on AI copyright and Indigenous Cultural IP, March 2026. https://www.apraamcos.com.au/about-us/news-and-events/socan-joint-statement-ai