Is AI Bad for the Environment?
- Grayson Tate
- Apr 22
- 2 min read
Updated: May 21
When I started writing Harmony of Change, I wasn’t thinking about the carbon footprint of fiction. I was curious what it meant to write about technology using technology.
What I didn’t ask—at least not at the time—was the cost. Not in dollars, but in resources. Lately, I’ve been thinking about that more. And it turns out, writing with AI isn’t as weightless as it might seem.
Every Prompt Has a Price
We often talk about artificial intelligence as if it lives in the cloud—light and frictionless. But behind every AI-generated suggestion is a very real infrastructure: industrial data centers running 24/7, cooled by massive volumes of water and often powered by fossil fuels.
Researchers estimate that training a single large AI model can emit as much carbon dioxide as five cars emit over their entire lifetimes. And that doesn't account for the emissions from everyday use: the millions of prompts and queries that models process daily.
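To get a rough sense of scale, here is a minimal back-of-envelope sketch in Python. The per-query energy figure, the daily query volume, and the grid carbon intensity are all assumed numbers chosen purely for illustration, not measured values; the point is how quickly small per-prompt costs compound across millions of daily queries.

```python
# Back-of-envelope estimate: daily CO2 from serving AI prompts.
# All inputs below are illustrative assumptions, not measured values.

ENERGY_PER_QUERY_WH = 0.3        # assumed energy per prompt, in watt-hours
QUERIES_PER_DAY = 10_000_000     # assumed daily query volume
GRID_CO2_G_PER_KWH = 400         # assumed grid carbon intensity, grams CO2 per kWh

daily_energy_kwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1000
daily_co2_kg = daily_energy_kwh * GRID_CO2_G_PER_KWH / 1000

print(f"Energy: {daily_energy_kwh:,.0f} kWh per day")
print(f"CO2:    {daily_co2_kg:,.0f} kg per day")
# With these assumptions: about 3,000 kWh and 1,200 kg of CO2 every day,
# before counting the one-time cost of training the model at all.
```

Swap in different assumptions and the totals shift, but the shape of the math stays the same: nothing about a single prompt looks expensive until you multiply it by everyone asking at once.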
The conversational polish that makes models like ChatGPT feel so natural doesn't come from raw data alone. It comes from reinforcement learning, a technique that lets models adjust their behavior based on human feedback. It's a perpetual loop of trial and error that requires enormous processing power. It's what helps AI respond more naturally, refine its tone, and avoid offensive content. But it's also incredibly expensive, computationally and environmentally.
The Hardware Arms Race
Until recently, companies like OpenAI and Anthropic relied almost exclusively on NVIDIA's GPUs to handle the enormous workloads of training and fine-tuning AI models. NVIDIA's DGX Spark, for instance, can perform trillions of operations per second (teraFLOPS), and the most powerful supercomputers built on NVIDIA hardware reach quadrillions (petaFLOPS). These chips are remarkably fast, but they're also high-priced and power-hungry.
Enter DeepSeek, a Chinese AI company that has begun building and training large models with far less reliance on NVIDIA's most expensive hardware. They've invested in open-source innovation, developed alternative chip strategies, and are pushing to reduce the computational overhead of both training and inference. If they succeed, and early signs suggest they might, they could rewrite the economics of AI development entirely.
If AI systems can be trained more efficiently, on lower-power hardware or with smarter software pipelines, then we may be able to code and create without quietly warming the planet in the process. But it's still early, and the incentives are misaligned. What drives development now isn't environmental responsibility; it's market dominance.
Paper for Carbon
So, did writing my novel with the help of AI contribute to environmental degradation?
The honest answer is: yes. Maybe not as much as building a data center or minting NFTs. But still—my project required computing resources that someone had to generate, cool, and maintain. And those systems don’t run on nothing.
It's troubling to think that a book about hope, integrity, and ethics was created using systems that are accelerating planetary decline. But then, creative works have always consumed resources. It used to be ink and paper; now it's silicon and carbon.
The point isn’t to renounce the tools—it’s to recognize their cost, both socially and environmentally. As creators, we’ve entered a new era of authorship, one that includes not just the words we write, but how they are written. And if we care about what our stories say, shouldn’t we care just as much about the world they live in?