
Authors protest AI at London Book Fair with ‘empty’ book

Over 10,000 writers have contributed to ‘Don’t Steal This Book,’ an empty publication that only contains a list of their names. Organised by Ed Newton-Rex, the book will be given out for free at the London Book Fair in protest of AI tools stealing the intellectual property of creatives. 

Thousands of writers have added their names to ‘Don’t Steal This Book,’ an ‘empty’ publication that has been printed to protest the unethical use of AI in creative spaces.

The book will be handed out to visitors at the London Book Fair for free this week. It comes only days before the government assesses the economic cost of new alterations to copyright law that may grant tech companies a ‘commercial research exception,’ among other potential changes. This would mean that tech firms would not need to ask permission to use creative works in their AI models.

On its back cover, the book reads: ‘The UK government must not legalise book theft to benefit AI companies.’ It is a direct response to these proposed law changes, as creatives grow increasingly worried about artistic ownership.

‘Don’t Steal This Book’ was organised by Ed Newton-Rex, a composer who is campaigning to protect the copyright of artists and authors. He spoke to The Guardian about the framework underlying most AI generative tools, explaining that they’re ‘built on stolen work’ and lack any ‘permission or payment’ from the original, human creator.

‘It is not in any way unreasonable to expect AI companies to pay for the use of authors’ books,’ added Malorie Blackman, author of Noughts and Crosses.

Tools like ChatGPT, Midjourney, and Sora generate text, images, and video from user-inputted prompts. This content can be churned out almost instantaneously, and such tools have been widely adopted over the past few years, including by some of the biggest brands, such as Coca-Cola and McDonald’s.

While they may be convenient for users and companies, the artwork or text that is created is inherently derivative and built on the originality of human artists. AI, by definition, can only ‘create’ based on a network of references and source material. 

The technology has expanded so rapidly over the past few years that copyright law has struggled to keep pace, leading to intellectual property theft and lawsuits in both the UK and the US. The term ‘AI slop’ is often thrown around online, as users flood social media with automated images of famous intellectual properties.

There is concern about the future rights of creatives, as AI essentially gobbles up their ideas and replicates them without quality control or consent from those it is stealing from. The UK government hasn’t settled on any one system yet, leaving room for potential problems and copyright issues further down the line. 

What are these potential changes to copyright laws?

As previously mentioned, the UK government originally proposed a broad exception rule that would allow tech companies to use creative work for their AI models without any permissions or payments to artists. 

The backlash was, unsurprisingly, intense. Elton John famously called the government ‘absolute losers’ for suggesting a relaxation of copyright law that favours tech firms.

Now, a new license-based model has been put forward, theoretically giving more protection to creative industries. Artists and authors are pushing for a system that requires AI companies to have explicit permission for training data, rather than an ‘opt-out’ model that catches most creatives out.

Three basic approaches are being discussed: loose rules that would give AI companies free rein, an opt-out system that requires creatives to engage with copyright systems, and a licensing-first model. 

Talks later this month should – in theory – provide some clarity and confirmation as to which way the government will go. Parliament will provide an update by March 18th, a government spokesperson has said. 
