
How AI-Written Books Are Shaking the Publishing Industry

  • Writer: Covertly AI
  • 22 hours ago
  • 4 min read

The publishing world is facing a problem that feels both immediate and unsettling: books suspected of being written with artificial intelligence are already slipping into the market, and many in the industry are not confident they can stop it. That anxiety exploded with the controversy surrounding Shy Girl, a horror novel by Mia Ballard that was first self-published in February 2025, released in the UK in November 2025, and then scheduled for US publication before Hachette cancelled it. After an internal review, Hachette pulled the novel from its US release schedule, removed it from online retailers and its own website, and discontinued the UK edition as well. The case has been widely described as the first commercial novel from a major publishing house to be withdrawn over evidence of AI use.


The situation has rattled publishers, agents, and authors because it exposes how difficult it has become to tell where human writing ends and AI assistance begins. Literary agent Kate Nash said she had started noticing that submission letters were becoming more polished but also more formulaic, until one author accidentally left an AI prompt at the top of a query letter. That moment changed how she viewed the flood of material arriving in her inbox. For many in publishing, the Shy Girl controversy confirmed fears that it was only a matter of time before AI assisted or AI generated work made its way through traditional gatekeeping systems. One editor at a major publishing house admitted the story sent a cold shiver through the industry because it showed that even careful review processes can fail.


What made the controversy even more significant was the visibility of the book itself. Shy Girl had been marketed as a bold feminist horror novel and had already built a readership, selling more than 1,800 print copies in the UK and attracting more than 4,900 Goodreads reviews. Its plot follows Gia, a lonely and financially struggling young woman who becomes entangled in a disturbing arrangement with a wealthy man she meets online. Although the book initially received praise, criticism grew as more readers described the writing as confusing, repetitive, oddly formatted, and filled with phrases they felt sounded machine-made. Some readers openly claimed it looked like something written by ChatGPT, and reports later suggested the book could be as much as 78% AI-generated.



Ballard has denied using AI to write the novel herself. She said that an editor she hired for the original self-published version used AI during the editing process, and she has argued that the fallout has severely damaged both her reputation and her mental health. She also said she was pursuing legal action. Her defence adds another layer to the debate because it raises difficult questions about authorship, responsibility, and whether AI used in editing should be treated the same way as AI used to generate large portions of a manuscript. In an industry now operating in an increasingly hybrid writing environment, that line is becoming harder to define.


Experts say the problem is not likely to get easier. Computer scientist Patrick Juola argued that AI detection tools are fundamentally unreliable, while Cornell Tech professor Mor Naaman warned that AI is improving quickly enough that publishers may soon have little chance of spotting machine-generated writing at all. Other researchers noted that determined writers can repeatedly edit AI-assisted text until it avoids detection, creating a murky middle ground between fully human work and obvious machine output. Publishers may require contracts, declarations, and screening tools, but many now recognize that none of those protections are foolproof. That is why the Society of Authors recently launched the Human Authored scheme, an effort to help readers identify books written by humans, even though it ultimately depends on trust.


At the heart of the debate is a larger cultural concern. For many writers and publishers, this is not only about originality or rule-breaking. It is about trust between writers, publishers, and readers, and about whether literature can still reflect the messy, difficult, deeply human experiences that make books meaningful. Critics fear that AI could flood the market with bland, formulaic work while also reducing opportunities for emerging authors to develop their voices. The Shy Girl controversy may be only one case, but it has become a warning sign for an industry trying to protect not just books, but the human creativity and trust that give literature its value.


Works Cited


Glynn, Paul. “Publisher Cancels Horror Novel’s Release over AI Claims.” BBC News, 20 Mar. 2026, www.bbc.com/news/articles/c5y9d44jj24o.

Hill, Amelia. “‘Soon Publishers Won’t Stand a Chance’: Literary World in Struggle to Detect AI-Written Books.” The Guardian, 29 Mar. 2026, www.theguardian.com/technology/2026/mar/29/ai-written-books-novel-shy-girl-publishers.

Phillipp, Charlotte. “Publisher Cancels Shy Girl Horror Novel After Writer Accused of Using AI.” People, 20 Mar. 2026, people.com/publisher-cancels-shy-girl-horror-novel-ai-scandal-11930912.

Croft, Alex. “Horror Novel Reportedly Pulled from Publication after Suspected Use of AI during Writing Process.” The Independent, 20 Mar. 2026, www.independent.co.uk/news/world/americas/horror-novel-pulled-ai-shy-girl-hachette-b2942579.html.

Lane, Jaqui. “Writing and Self-Publishing in the AI Era.” LinkedIn, 6 Dec. 2025, www.linkedin.com/pulse/writing-self-publishing-ai-era-jaqui-lane-d3clc.
