
Authors should be protected over big tech

COPYRIGHT LAWS AND AI

Creative artists and writers are voicing their anger at AI theft of their work with “Human Authored” logos and a blank book. The government must listen

DURING last week’s London Book Fair, The Society of Authors stamped its books with “Human Authored” logos, in scenes that might have come from a dystopian novel. The society described the labelling scheme as “an important sticking plaster to protect and promote human creativity in lieu of AI labelled content in the marketplace”.

Visitors to the fair were also given copies of Don’t Steal This Book, an anthology representing some 10,000 writers, including the Nobel laureate Kazuo Ishiguro, Malorie Blackman, Jeanette Winterson and Richard Osman. The pages of the book are completely blank, but the back cover states: “The UK government must not legalise book theft to benefit AI companies.” The message is clear and simple: writers have had enough.

The book fair came just before the government is due to deliver its progress report on AI and copyright, after proposals to relax existing laws caused outrage last year. Philippa Gregory, the novelist, described the plans for an “opt-out” policy, which puts the onus on writers to refuse permission for their work to be trawled, as akin to putting a sign on your front door asking burglars to pass by.

According to a University of Cambridge study last autumn, almost 60% of published authors believe their work has been used to train large language models without consent or reimbursement. And nearly 40% said their income had already fallen as a result of generative AI or machine-made novels, a digital incarnation of Orwell’s Versificator in Nineteen Eighty-Four.

Factual books are clearly most susceptible to ChatGPT and other generative AI tools. While sales of fiction are rising, sales of nonfiction were down 6% last year compared with 2024. But three nonfiction books, all by female authors, bucked the trend: Nobody’s Girl, Virginia Giuffre’s posthumous memoir of abuse; A Hymn to Life, Gisèle Pelicot’s testimony and account of her ordeal at the hands of her ex-husband; and Careless People, Sarah Wynn-Williams’s exposé of working at Facebook. The success of these first-person narratives shows the powerful reach of nonfiction beyond the world of publishing. These are painfully human stories; readers must be able to trust in the authenticity of their voices.

Last year, the novelist Sarah Hall asked her publisher, Faber, to print a “Human Written” stamp on her latest book, Helm. “AI might mimic the words more rapidly, but . . . it hasn’t bled on the page,” she said. “And it doesn’t have a family to support.”

Writers’ livelihoods must not be sacrificed to the promise of economic growth. The UK’s creative industries contributed £124bn to the economy in 2023, of which £11bn came from publishing. The Society of Authors is requesting consent and fair payment for use of work, and transparency as to how a book was “written”. These are hardly radical propositions. But in an era of fake news and AI slop, they are sadly necessary. Writers and creative artists need more than sticking plasters. They need robust legislation.

A recently published House of Lords report lays out two possible futures: one in which the UK “becomes a world-leading home for responsible, legalised artificial intelligence (AI) development” and another in which it continues “to drift towards tacit acceptance of large-scale, unlicensed use of creative content”. One scenario protects UK artists, the other benefits global tech companies. To avoid a world of empty content, the choice is clear.

Standard

Press freedom, copyright laws, and AI firms

BRITAIN

AMONG Britain’s greatest contributions to Western culture are press freedom and copyright law. Established side by side more than 300 years ago, they underpinned the Enlightenment, the Industrial Revolution, and much of the social change that followed.

They facilitated the free flow and exchange of ideas, opinions, literature and music, and offered legal safeguards for creators and publishers against having their work stolen or plagiarised.

Today, these sacred principles are at risk as never before.

In their headlong rush to develop all-embracing artificial intelligence systems, big-tech firms seem determined to ride roughshod over the intellectual property rights of those whose material they want to appropriate.

Musicians, authors, film and TV companies, artists and media organisations are already seeing their work lifted and used without permission. As the struggle for AI dominance intensifies, this larceny is becoming increasingly brazen.

Worse still, the UK Government appears to be taking the side of the tech giants over the creatives.

In a consultative document on possible changes to copyright law, it has proposed four options. Of these, its “preferred” option is to give a new exemption to AI firms, allowing them to train their machine-learning systems on copyrighted material without permission unless the rights holder actively opts out of the process.

Ministers have claimed such a change would give creators more control, but this is an illusion.

One of the strengths of British copyright is that it’s automatic. Works do not have to be registered to be protected from being stolen.

That means individual artists and the smallest local news sites have the same rights and protections as the largest publishers.

Permitting AI firms to take what they want unless rights have been reserved is like telling burglars they can walk into homes unless there is a note on the door asking them not to. In any case, there is no effective technical means of reserving rights and creatives will often be unaware their material has been “scraped”.

It would be far better to strengthen rather than weaken copyright legislation so it can be enforced quickly and effectively against infringements by AI developers. The onus should surely be on them not to break the law in the first place.

Everyone understands that AI is a vast and growing phenomenon which will be of enormous benefit in fields such as healthcare and business efficiency.

Many people will also appreciate the Government’s desire for Britain to be at the forefront of this technological revolution. But that cannot be used as cover to trample over crucial rights and freedoms.

Ingesting the entire output of the British music industry or mass-market news websites will not contribute anything to medical research.

Neither will it do much for our economy, as most of the profits generated by the tech companies will be taken out of the country.

It is both surprising and troubling that the Government has done no analysis of the economic impact of its proposal.

The UK has the world’s second largest creative sector, generating an estimated £126 billion a year and supporting 2.4 million jobs. Relaxing copyright law would cause it incalculable damage.

We also have a vibrant, free and plural media – for now at least.

Our traditional press is in a state of rapid flux, as print gradually gives way to new digital platforms and revenue streams. But the fundamentals remain the same – to inform and entertain the public with fair, accurate, challenging and well-written journalism.

In this age of conspiracy, disinformation, and fake news, trusted sources of information and commentary are more important than ever. But it costs money to produce them, and if every article can immediately be copied without payment, then generating the revenue needed to sustain reliable journalism becomes impossible.

A free and independent media has long been a cornerstone of our democracy, but it is under very serious threat. We take it for granted at our peril.
