
10,000 Authors Publish a Blank Book at London Book Fair 2026: A Powerful Protest Against AI Using Writers’ Work Without Permission

The relationship between artificial intelligence and human creativity has become one of the most contentious debates in the digital economy. As generative AI systems expand their capabilities by learning from vast datasets, writers, artists, and publishers are increasingly raising concerns about how their work is being used, often without permission or compensation.

At the London Book Fair 2026, this debate reached a symbolic turning point when nearly 10,000 authors collectively published an empty book titled Don’t Steal This Book. The book contains almost no text. Instead, it lists the names of thousands of contributing writers, representing a collective protest against artificial intelligence companies that allegedly use copyrighted material to train AI models without authorization.

The demonstration was both theatrical and deeply serious. Copies of the book were distributed at the London Book Fair as policymakers prepared to deliver a crucial assessment on potential changes to UK copyright law. The protest highlights a broader global conflict over intellectual property, innovation, and the future of creative industries in the age of machine learning.

The Empty Book That Sent a Powerful Message

The concept behind Don’t Steal This Book was intentionally minimalist. Rather than containing chapters, essays, or narratives, the book simply lists the names of thousands of writers who participated in the protest.

The symbolic message was clear: if creative work continues to be treated as freely available training data for artificial intelligence systems, the future of literature could resemble blank pages.

The protest was organized by Ed Newton-Rex, a composer and campaigner advocating for stronger protections for artists’ copyright. According to Newton-Rex, the initiative represents a plea from authors who believe that the generative AI industry has been built on creative works taken without consent or compensation.

The book’s back cover carries a direct warning aimed at policymakers:

“The UK government must not legalise book theft to benefit AI companies.”

This statement encapsulates the frustration felt by many creative professionals who believe current copyright frameworks have not kept pace with the rapid development of generative AI technologies.

Scale of the Protest: A Rare Unified Voice in Publishing

What makes the protest particularly significant is its scale. Approximately 10,000 authors contributed their names to the project, representing one of the largest coordinated actions by writers in recent years.

The campaign attracted participation from widely respected figures in literature and publishing, including:

Kazuo Ishiguro

Richard Osman

Philippa Gregory

Jeanette Winterson

Alan Moore

Marian Keyes

Mick Herron

Malorie Blackman

David Olusoga

These authors represent a broad cross-section of the literary world, from bestselling novelists to historians and cultural commentators. Their participation underscores the depth of concern within the creative community about how artificial intelligence is being trained.

Malorie Blackman, author of Noughts and Crosses, articulated a widely shared sentiment among contributors:

“It is not unreasonable to expect AI companies to pay for the use of authors’ books.”

The protest signals a growing consensus among writers that the current model of AI training lacks transparency and fairness.

The Core Issue: AI Training and Copyrighted Works

Generative AI systems require enormous amounts of data to function effectively. Large language models and image generation systems are trained on massive collections of text, images, and other media drawn from across the internet.

These datasets frequently include copyrighted works such as:

Books

Articles

Academic research

Screenplays

Visual artwork

Photography

The inclusion of such material has sparked controversy because many creators argue that their work has been used without permission, licensing agreements, or compensation.

AI companies typically argue that training models on publicly available data constitutes fair use or is necessary for technological advancement. Critics, however, contend that this practice undermines the economic foundation of creative industries.

The issue has already triggered multiple lawsuits globally, highlighting the legal uncertainty surrounding AI training practices.

The Legal Battle Over AI and Copyright

The protest at the London Book Fair coincided with an important moment in UK policy development. The British government is currently reviewing potential changes to copyright law in response to the rise of artificial intelligence.

Several possible policy approaches are under consideration.

Proposed Policy Options

Maintain current law: continue existing copyright protections without changes.

Licensing requirement: require AI companies to obtain licenses before using copyrighted works.

Opt-out system: allow AI firms to use works unless creators explicitly opt out.

Broad research exception: permit AI use of copyrighted material for commercial research.

Many writers strongly oppose the opt-out model, arguing that it shifts the burden onto creators to monitor and protect their work.

Critics warn that such a framework could effectively allow AI developers to access massive creative archives unless individual authors take proactive steps to block usage.

A Billion-Dollar Precedent in the AI Copyright Debate

The controversy over AI training data has already resulted in major legal settlements. One of the most prominent cases involved Anthropic, an AI company known for developing the Claude chatbot.

Authors accused the company of training its models using pirated copies of books. The dispute resulted in a $1.5 billion settlement, illustrating the financial stakes involved in copyright conflicts surrounding artificial intelligence.

This case demonstrates how the AI copyright debate is rapidly moving beyond theoretical discussions into real legal and financial consequences for technology companies.

The settlement also strengthened the argument of creative professionals who believe AI developers must compensate creators for the use of their work.

Why the London Book Fair Became the Stage for Protest

The London Book Fair is one of the most influential global gatherings in the publishing industry. Every year, publishers, authors, literary agents, and media professionals convene to discuss trends shaping the future of books.

By distributing the empty book at this event, the protest organizers ensured maximum visibility.

The timing was also strategic. The UK government is expected to deliver an economic impact assessment and policy update on copyright reform by March 18. This assessment will help determine how the government plans to regulate the intersection of AI technology and creative rights.

The protest therefore served both as a warning and a call to action aimed directly at policymakers.

Cultural Figures Enter the AI Debate

The controversy surrounding AI training practices has extended far beyond writers. Prominent artists across the creative industries have voiced concern about the potential erosion of intellectual property rights.

Musician Elton John, for example, publicly condemned proposals to relax copyright protections for AI training, denouncing the policies in unambiguous terms.

The debate reflects a broader tension between two powerful forces shaping the modern economy:

The rapid advancement of artificial intelligence

The protection of human creative labor

As generative AI becomes increasingly capable of producing text, music, images, and videos, the line between machine-generated content and human authorship is blurring.

Economic Stakes for the Creative Industry

The publishing sector represents a major economic engine in many countries. Authors, editors, publishers, translators, and distributors collectively support a complex creative ecosystem.

If AI systems can generate content using existing works as training data without compensation, critics argue that several economic risks emerge:

Reduced income for writers

Declining demand for original creative work

Concentration of profits within technology companies

Erosion of copyright incentives

Some economists warn that if creative professions become financially unsustainable, the long-term cultural cost could be significant.

Ed Newton-Rex summarized the stakes in stark terms:

“This is not a victimless crime. Generative AI competes with the people whose work it is trained on.”

The Ethical Question Behind AI Innovation

Beyond economics and law, the dispute raises a deeper ethical question about the nature of creativity.

Artificial intelligence systems do not create knowledge independently. Instead, they learn patterns from existing works produced by humans. This means that the success of AI models depends heavily on access to creative content.

The ethical challenge lies in determining whether the creators of that content should receive recognition and compensation when their work contributes to training AI systems.

The answer to that question may define the future relationship between technology and human creativity.

Possible Solutions Emerging in the Industry

The publishing sector is exploring several potential solutions to address AI copyright concerns.

One proposal involves the creation of collective licensing frameworks, similar to the systems used in music royalties. Under such arrangements, AI developers could pay licensing fees to access large bodies of copyrighted material legally.

These fees could then be distributed to authors and publishers.
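To illustrate how such a collective licensing pool might work in practice, the sketch below shows a simple pro-rata split: each rights holder receives a share of the pool proportional to how often their works appeared in a training dataset. This is a hypothetical illustration only; the function name, the usage counts, and the pool figure are invented for the example, and real schemes (as in music royalties) involve far more complex allocation rules.

```python
# Hypothetical sketch: pro-rata distribution of a collective licensing pool.
# All names and figures are illustrative, not drawn from any real scheme.

def distribute_pool(pool: float, usage_counts: dict[str, int]) -> dict[str, float]:
    """Split a licensing pool among rights holders in proportion to
    how often each holder's works appeared in a training dataset."""
    total = sum(usage_counts.values())
    if total == 0:
        # No recorded usage: nothing to distribute.
        return {holder: 0.0 for holder in usage_counts}
    return {holder: pool * count / total for holder, count in usage_counts.items()}

# Example: a 1,000,000 pool split across three rights holders
# whose works appeared 3, 1, and 1 times respectively.
payouts = distribute_pool(1_000_000, {"Author A": 3, "Author B": 1, "Author C": 1})
print(payouts)  # {'Author A': 600000.0, 'Author B': 200000.0, 'Author C': 200000.0}
```

The pro-rata rule is the simplest possible design choice; a real framework would also need to settle how usage is measured and audited, which is precisely where the transparency demands described below come in.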

Other proposed solutions include:

Mandatory transparency about AI training datasets

Opt-in licensing systems for creators

Government-funded digital rights registries

AI training databases that exclude copyrighted material

Each approach involves complex trade-offs between innovation and intellectual property protection.

The Future of Creativity in the Age of Artificial Intelligence

The empty book protest at the London Book Fair is likely to become a defining moment in the global debate over AI and creative rights.

As artificial intelligence becomes increasingly capable of producing high-quality written content, policymakers must decide how to balance technological progress with the protection of human creativity.

Key questions remain unresolved:

Should AI companies pay for the data used to train their systems?

How can creators track the use of their work in machine learning models?

Can copyright law adapt to the realities of AI-generated content?

The answers will shape not only the future of publishing but also the broader digital economy.

Conclusion

The publication of Don’t Steal This Book represents far more than a publicity stunt. It reflects a growing global movement among creators who believe that the rapid expansion of artificial intelligence must be accompanied by stronger protections for intellectual property.

By presenting thousands of names on empty pages, the authors delivered a powerful visual message about the stakes involved in the AI copyright debate. Their protest highlights the tension between innovation and fairness, a tension that governments and industries around the world are now being forced to confront.

As policymakers evaluate the economic and legal implications of AI training practices, the voices of creators are becoming impossible to ignore.

For deeper insights into the evolving intersection of artificial intelligence, intellectual property, and emerging technology ecosystems, readers can explore research and analysis from Dr. Shahid Masood and the expert team at 1950.ai, where advanced studies examine how AI development will reshape industries, governance frameworks, and the global knowledge economy.

Further Reading / External References

Thousands of authors publish empty book in protest over AI using their work
https://www.theguardian.com/technology/2026/mar/10/thousands-authors-publish-empty-book-protest-ai-work-copyright

Thousands of writers published an empty book to stick it to Anthropic
https://lithub.com/thousands-of-writers-published-an-empty-book-to-stick-it-to-anthropic/

Why thousands of writers released blank books at London Book Fair 2026
https://www.firstpost.com/lifestyle/london-book-fair-empty-books-london-book-fair-protest-ai-why-authors-released-empty-books-13988357.html

