How an AI-written book shows why the tech 'frightens' creatives
For Christmas I received an intriguing present from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (great title) bears my name and my picture on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.
It's a fascinating read, and hilarious in parts. But it also meanders quite a lot, and is somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive, and very verbose. It may have gone beyond Janet's prompts in gathering data about me.
Several sentences start "as a leading technology journalist ..." - cringe - which could have been scraped from an online bio.
There's also a mysterious, recurring hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are lots of companies online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open-source large language model.
I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can order any further copies.
There is currently no barrier to anyone creating one in anyone's name, including celebrities - although Mr Mashiach says there are guardrails around abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and pleasure".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "customised gag gift", and the books do not get sold further.
He hopes to broaden his range, producing different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated products to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in some parts, sound a lot like me.
Musicians, authors, artists and actors worldwide have voiced alarm about their work being used to train generative AI tools that then churn out similar content based upon it.
"We must be clear, when we are talking about data here, we really mean human creators' life's work," says Ed Newton Rex, founder of Fairly Trained, which campaigns for AI firms to respect creators' rights.
"This is books, this is articles, this is pictures. It's works of art. It's records ... The whole point of AI training is to learn how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. It didn't stop the track's creator trying to nominate it for a Grammy award. And although the artists were fake, it was still hugely popular.
"I don't think the use of generative AI for creative purposes should be banned, but I do think that generative AI for these purposes that is trained on people's work without consent should be banned," Mr Newton Rex adds. "AI can be really powerful but let's build it fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT maker OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and ruining the livelihoods of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative industries are wealth creators, 2.4 million jobs and a great deal of pleasure," says the Baroness, who is also an adviser to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its best performing industries on the vague promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a workable plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to govern AI is now up in the air following President Trump's return to the presidency.
In 2023 President Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the web without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all this wasn't enough to consider, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It has lots of errors and hallucinations, and it can be quite hard to read in parts because it's so long-winded.
But given how quickly the tech is evolving, I'm not sure how long I can stay confident that my significantly slower human writing and editing skills are any better.