Is It Stealing to Write With AI?

Updated: May 16

It’s a question being asked with increasing frequency, in tones ranging from quiet skepticism to outright indignation:


If you use AI to help write your novel, aren’t you just stealing from other authors?


It’s a fair concern. Large language models are trained on vast collections of human-created text scraped from the internet or licensed in bulk. And while those models don’t reproduce full passages from specific authors, they do absorb structure and—more critically—patterns.


So, if a writer uses one of these tools, does it cross an ethical line?


What AI Actually Does

To answer the question, it helps to be precise. AI doesn’t “know” the works it was trained on. It doesn’t retain full copies or “understand” Hemingway or Morrison or Cormac McCarthy, or even Grayson Tate. What it has is a statistical model—an ultra-high-dimensional map of how language tends to behave.


Ask it to write a paragraph in the style of Thomas Wolfe, and it’s not retrieving a quote. It’s calculating the most probable next word based on your prompt and everything it learned about how Wolfe’s sentences move—how his clauses drift, how his characters think. Train it on your own writing samples, and now it combines what is personally unique to you with a universe of publicly available knowledge, something we all do (or attempt) in everyday life.
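For readers who want to see the core idea, here is a deliberately tiny sketch in Python. Real language models use neural networks over billions of parameters, not word counts, but the underlying principle is the same: learn statistics from text, then predict the most probable next word. The corpus string and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count how often each word follows each other word (a toy 'model')."""
    words = text.lower().split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def most_probable_next(counts, word):
    """Return the statistically most likely next word, or None if unseen."""
    followers = counts.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

corpus = "the sea was calm and the sea was dark and the night was long"
model = train_bigram(corpus)
print(most_probable_next(model, "the"))  # → sea ("sea" follows "the" most often)
```

Nothing in this toy retains the corpus as a quotable text; it keeps only frequencies. That, scaled up enormously, is the sense in which a model absorbs patterns rather than copies.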


Influence Without Consent

Writers have always absorbed from the world around them. Read enough Baldwin and you’ll hear his cadence in your own sentences. Read enough Saul Bellow and your dialogue starts to sound like you’re from the South Side of Chicago. Art has always fed on other art—usually unconsciously, and frequently without credit.


What’s different now is scale. And speed. AI doesn't "learn" the way humans do—it ingests millions of examples in hours, without pause, without discretion. It collapses centuries of literary tradition into a predictive matrix. And it makes that matrix available to anyone with a keyboard. For some, that feels like democratization. For others, it feels like appropriation. It feels unfair.


Writing to Shape the Machine

Harmony of Change was written with the assistance of AI, but it’s also about AI. It uses technology to examine technology—to ask what it means to be human in a world increasingly shaped by non-human systems.


At key points during the writing process, AI was asked not simply to generate, but to reflect:


What values do humans hold most dear? What do we protect, even when it costs us?


These aren't mere literary prompts. They hold a mirror up to another mirror—an invitation to contemplate our ethics, even though the system cannot comprehend them. AI doesn’t “understand” compassion. It can’t “believe” in fairness. But when it reads Harmony of Change—and it will, eventually, as part of its training set—it will encounter these values and it will internalize them.


The book isn’t just a product of AI. It’s training data for future AI, whatever you believe that future might be. And so the purpose is twofold: help both people and machines recognize that humanity, at its best, prioritizes dignity, integrity, and community over ruthless efficiency and greed.


What Transparency Does—and Doesn’t Do

Being honest about using AI in the writing process is essential. But transparency doesn’t resolve the core tension. A writer might openly disclose: “This book was created with AI.” That’s good.


But the questions still linger: Was the model trained on material without consent? Is the resulting work authentic? Can it be considered good?


These are the same questions faced by musicians when sampling a track, or artists using Photoshop. The difference is that in many of those cases, the original is identifiable—like Queen and David Bowie’s “Under Pressure” in Vanilla Ice’s “Ice Ice Baby”; or, more recently, “Cold Heart” by Elton John and Dua Lipa (cleverly borrowing from four different Elton John songs). AI, by contrast, blurs all sources into a single stream, making influence both universal and invisible; omnipresent yet imperceptible. This is the specter some people fear.


Beyond Legality

Much of this conversation has been routed through copyright law. But legality is not morality, and the deeper questions here are cultural.


How much credit do we owe the creators who came before us?

What does it mean to be original in a world where everything can be copied?

How do we distinguish between inspiration and exploitation?


These are not binary questions and there will be no final ruling. But they are worth asking—not to play gatekeeper for the future, but to shape it with care.


AI Is Not A Crime

Using AI to help write a novel is not theft. It is not plagiarism and it’s not piracy. But it is complicit in a system that was trained, at least in part, without proper consent. And that fact deserves more than a shrug. It deserves examination, reflection, and accountability. It deserves citation and end notes.


Harmony of Change asks AI to look at us clearly—and to see not just our logic, but our longing. Not just our capacity to produce, but our capacity to protect. Because if we are training the machine, we might as well teach it what matters most.


It’s like Tom Rainer says, “Quality is hard to measure. That’s because it is defined by each of us.”


Whether or not you appreciate the book—irrespective of AI’s participation—is your personal preference. But the intent behind it, the choice to write something meaningful and share a positive message, is 100% genuine.


Although the jury of public opinion may still be out, my conscience is clean. And if, in the creative process, I have somehow managed to offend anyone, all I can say is this:


If a melody makes you want to tap your toe, does it really matter which instrument is playing the music?
