Artificial Intelligence, Arts, Intellectual Property, Publishing, Technology

Authors should be protected over big tech

COPYRIGHT LAWS AND AI

Intro: Creative artists and writers are voicing their anger at AI theft of their work with ‘Human Authored’ logos and an empty book. The government must listen

DURING last week’s London Book Fair, the Society of Authors stamped its books with “Human Authored” logos, in scenes that might have come from a dystopian novel. The society described the labelling scheme as “an important sticking plaster to protect and promote human creativity in lieu of AI labelled content in the marketplace”.

Visitors to the fair were also given copies of Don’t Steal This Book, an anthology of some 10,000 writers, including Nobel laureate Kazuo Ishiguro, Malorie Blackman, Jeanette Winterson, and Richard Osman. The pages of the book are completely blank, but the back cover states: “The UK government must not legalise book theft to benefit AI companies.” The message is clear and simple: writers have had enough.

The book fair took place just before the government is due to deliver its progress report on AI and copyright, after proposals to relax existing laws caused outrage last year. Philippa Gregory, the novelist, described the plans for an “opt-out” policy, which puts the onus on writers to refuse permission for their work to be trawled, as akin to putting a sign on your front door asking burglars to pass by.

According to a University of Cambridge study last autumn, almost 60% of published authors believe their work has been used to train large language models without consent or reimbursement. And nearly 40% said their income had already fallen as a result of generative AI or machine-made novels, a digital incarnation of Orwell’s Versificator in Nineteen Eighty-Four.

Factual books are clearly the most susceptible to ChatGPT and other generative AI tools. While fiction sales are rising, sales of nonfiction were down 6% last year compared with 2024. But three nonfiction books, all by female authors, bucked the trend: Nobody’s Girl, Virginia Giuffre’s posthumous memoir of abuse; A Hymn to Life, Gisèle Pelicot’s testimony and account of her ordeal at the hands of her ex-husband; and Careless People, Sarah Wynn-Williams’s exposé of working at Facebook. The success of these first-person narratives shows the powerful reach of nonfiction beyond the world of publishing. These are painfully human stories; readers must be able to trust in the authenticity of their voices.

Last year, novelist Sarah Hall asked her publisher, Faber, to print a “Human Written” stamp on her latest book, Helm. “AI might mimic the words more rapidly, but . . . it hasn’t bled on the page,” she said. “And it doesn’t have a family to support.”

Writers’ livelihoods must not be sacrificed to the promise of economic growth. The UK’s creative industries contributed £124bn to the UK economy in 2023, of which £11bn came from publishing. The Society of Authors is requesting consent and fair payment for use of work, and transparency as to how a book was “written”. These are hardly radical propositions. But in an era of fake news and AI slop, they are sadly necessary. Writers and creative artists need more than sticking plasters. They need robust legislation.

A recently published House of Lords report lays out two possible futures: one in which the UK “becomes a world-leading home for responsible, legalised artificial intelligence (AI) development” and another in which it continues “to drift towards tacit acceptance of large-scale, unlicensed use of creative content”. One scenario protects UK artists, the other benefits global tech companies. To avoid a world of empty content, the choice is clear.

Standard
Artificial Intelligence, Arts, Books, Defence, Military, Science, Technology

Robocops to become part of UK’s defence vision

FUTURISTIC VISION FOR DEFENCE

Intro: Weapons technology scientists recruit sci-fi authors to prepare military for droid soldiers and AI

In the 1987 sci-fi blockbuster RoboCop, actor Peter Weller growled: “Dead or alive, you’re coming with me.” The idea of cyborg law enforcers roaming the streets was a fantasy.

Now, British military scientists believe AI-powered cops like those seen in the film could become a reality – and have teamed up with science fiction writers to create a vision of what that could look like.

The Defence Science and Technology Laboratory (DSTL) has unveiled Creative Futures, a book of short stories designed to inspire the developers of future weapons tech.

The collection, edited by Dr Allen Stroud of Coventry University, brings together authors and defence experts to imagine scenarios stretching as far forward as 2122.

Professor Tim Dafforn, the chief scientific adviser at the Ministry of Defence, said: “Innovation isn’t just about inventing new technology – it’s about understanding how it will be used, and by whom.

“Fiction gives us the freedom to explore those scenarios in ways traditional analysis cannot, helping defence prepare for futures that are complex, contested, and unpredictable. If we only plan for what seems likely today, we will be blindsided tomorrow.”

The stories in Creative Futures explore how emerging tech, a changing society, and global challenges could shape the world of defence and security over the next 100 years.

They cover everything from robot policing and the rise of AI to quantum technology that can predict the future, and wars fought between autonomous machines – already seen with the use of drones in the Russia-Ukraine war.

The DSTL says one of its aims is to help Britain’s defence and security services avoid being taken by surprise by the use of tech in a conflict.

It believes that, by combining scientific expertise with storytelling, the short stories offer a “unique lens to consider alternative futures – both desirable and undesirable”.

The DSTL futures programme management team says the anthology aims to “engage, evoke, and provoke”, pushing defence scientists to “imagine new ways of working” and “rethink what the future could be”.

It says that preparing for the future means thinking beyond the next upgrade or system. Science fiction challenges us to consider the human, societal, and geopolitical dimensions of technology.

Dr Stroud said: “Science fiction isn’t just entertainment – it’s a strategic tool. These stories help us explore the risks and opportunities of emerging technologies beyond today’s horizon that we might otherwise miss.”

Creative Futures is available to buy online.

Standard
Art, Artificial Intelligence, Arts, Culture, Society, Technology

AI’s generic slop is theft from real artists

CREATIVE ART

Intro: Art generated by online tools is painfully bland and is leading us down the path to cultural stagnation

Pablo Picasso, one of the most influential artists of the 20th century, admitted that “great artists steal”. The Spanish genius assimilated African mask imagery into modern art, and many other greats throughout history have done something similar. Essentially, this is how creativity works. But behind their masterpieces are struggle, friction, and unique vision. Enter an entirely different beast: the theft carried out by proliferating AI engines. These are killing creativity, harming real artists, and fuelling an epidemic of unoriginality.

By feeding prompts to generators such as Midjourney or DALL-E, people can produce images on screen in just a few seconds. Anyone can conjure up a Vincent van Gogh-style still life or a Leonardo da Vinci-inspired selfie and at once exhibit it online. Social media platforms such as X are full of fans of this technology who declare: “AI art is art.” But saying so doesn’t make it true.

In fact, AI “art” doesn’t even exist – it is an illusion. AI models work on pattern recognition, not artistic decision-making. An “AI artist” may feed prompts to this technology, but they cannot be considered the author of its output, which has simply been remixed from ready-made imagery without thought, feeling, intent, or ingenuity. Absent from AI “art” is the creative process, which should take more than a few seconds. This is apparent in the low-quality, generic slop that is produced: lacking any distinctiveness of style or voice, it offers only a smooth homogeneity.

It bypasses craft, which is what great artists develop – with brushes and paint, pencils and paper – over months, years, and even decades. AI artists celebrate the power of technology to make creativity accessible; this accessibility is their central argument for why the technology is so great. True craft, however, takes dedication, consistent practice, and experimentation.

John Constable not only worked tirelessly inside his studio but made countless studies en plein air – as revealed in Tate Britain’s current exhibition, Turner & Constable. Celebrating two of Britain’s greatest painters, it shows what being an artist really takes. On display are watercolours, oils and sketches, as well as paint-covered palettes, paintboxes, and even a sketching chair.

Among Constable’s masterpieces is his 1836 work Hampstead Heath with a Rainbow, where prismatic hues glide through menacing clouds. His technique looks effortless but was suffused with genius-level skill. And behind it, unseen by the average enthusiast, are more than 100 cloud studies he created in an attempt to capture their transient energy.

Where AI generates pictures in an instant, Constable was committed to an ongoing process; the experience gained through observation and documentation was ultimately of immense benefit to him.

Similarly, JMW Turner made around 37,000 sketches of landscapes he’d seen with his own eyes. Determined to evoke the raw power of nature – from blazing sunsets to howling storms – he pushed realism towards abstraction with an excitement that’s visible in his energetic brushstrokes.

In contrast to Constable and Turner’s radical compositions, AI’s aesthetic is flat, twee, and often old fashioned. Defined by a saccharine palette of candy colours and hazy tones, automatically generated landscapes are hollow, sanitised, and no match for Britain’s great painters and artists. Working some 200 years ago, they painted emotive, not idealised, places of both personal and historic significance.

What is more, both Constable and Turner began their paintings by looking, by really observing the world. This fundamental act is absent from the process of AI’s so-called artists, who are more like clients giving instructions to a graphic designer than artists painting at their easels. AI engines are also doing real harm to contemporary artists and their hard work.

Among those who have already experienced its damaging effects is Australian painter Kim Leutwyler. She says her distinct style has been copied by app-generated portraits. “My issue isn’t with AI itself, but with the unethical way it has been trained without artists’ consent,” she said. “The right to opt in or out of having your data scraped for AI training should be fundamental, not optional.” This view is widely held across all of the creative industries.

AI, then, is pilfering from artists, the very people it relies on. It harms us all with its blandness. Rather than moving art forward, like Turner and Constable did in their day, it contributes to what has been termed “cultural stagnation”.

Anyone infuriated by Hollywood’s endless remakes of viewer favourites will recognise a similar effect here: AI threatens both originality and individual thinking. And because future AI will only draw from more of this generated material, it will continue to produce typical rather than unique visions.

AI art isn’t art – it’s a mirage, and it won’t be looked at for longer than a doom-scrolling second. In our world of efficiency and productivity, creative pursuits are one of the very few remaining places where human endeavour is vital. Behind the brushstrokes of Turner and Constable are years of looking, thinking, making, and struggle – and that is what creative art is.

Standard