If Andrew Tate wrote a book about how to make your wife or girlfriend into your slave, would he be within his rights to demand that no woman read that book without his consent?
Brandon Sanderson was inspired to become a fantasy writer when, as a child, he read Dragonsbane by Barbara Hambly. Sanderson is now worth some seven or eight figures, while Hambly, who is still alive and still writing, struggles to pay her bills*. Should Hambly be entitled to a portion of Sanderson’s earnings, for inspiring him to become a fantasy writer?
Every mother who has ever lived gives tremendously of herself to her children, even if only in the physical act of giving birth. Should mothers have a legal claim on their children, for monetary compensation for all of the sacrifices they make?
These might seem like crazy questions, but when you consider them in the context of the ethical arguments about AI art and AI writing, they really aren’t. They illustrate just a few of the unintended consequences of the regime that many disgruntled and resentful creators are arguing for, when really what they want is a world in which AI doesn’t exist.
One of the most difficult parts of being a creator is putting your work out into the world and letting it go. At that point, you have very little control over what it does and how it impacts the world. Many artists who labor in obscurity dream of making an impact, not realizing that success—even artistic success—can be far more devastating and traumatic than obscurity. Just ask Rachel Zegler.
I’m not saying that artists shouldn’t be paid for their work. Certainly they should be paid—and certainly there are valid ethical concerns with how AI is disrupting art and literature. But unhinged people who rant online about how AI is “stealing” artists’ work, or how it is “plagiarizing” writers’ books, simply because the LLM’s training data includes free online content (much of which was posted online by said artists and writers)—I don’t think those people really care about the ethical nuances of the debate. I think they just want to force us all to go back to a world where generative AI doesn’t exist.
Did David Weber steal from Star Trek when he wrote the first Honorverse novel? Did John Scalzi steal from Robert A. Heinlein and Joe Haldeman when he wrote Old Man’s War? Did Terry Brooks steal from Tolkien? How about George R.R. Martin?
Where exactly is the line between the “stealing” that should get you thrown in prison, and the “stealing” that people wink and nod at when they say that good artists copy and great artists steal? And how do we know that we’ve drawn the line in the right place? Would we have worse art or better art if Star Wars had gone into the public domain in the ’80s or ’90s? Would artists be making less money, or more?
I don’t have the answers to these questions, but I ask them because I think they are worth considering. And I think that most of the artists who think they have the answers are really just acting out of fear.
Will AI outright replace artists and writers? Will it make it impossible for artists and writers to make a living? I remain skeptical, though I acknowledge that there are some ways in which AI art appears to be doing exactly that. For example, I’ve been playing around with OpenAI’s new image generator, making some cover mock-ups, and I’ve been very impressed. But I will still seek out James at GoOnWrite.com for my covers, because he has a much better eye for this sort of thing, and my sales data reflects that his covers sell more of my books than my own covers do.
Should writers and artists expect to be paid whenever their art is used to train an LLM? Aside from the impracticality of enforcing such a law, I don’t think they should—at least, not for general training data. Fine-tuning is a different matter. If an AI is going to be fine-tuned to write in my particular style, I think I have a right to be recompensed for that—and I’d be willing to license that right for a reasonable fee. Perhaps this is a path that artists could pursue as well. But demanding that every AI company pay every artist for training their LLMs is kind of like Barbara Hambly demanding that Brandon Sanderson pay her a portion of his earnings. Likewise, whenever artists or writers demand that their intellectual property be excluded from the training data, it smacks to me of the first question above, about Andrew Tate and his hypothetical book.
I will admit that I’m biased in favor of AI, since for the last two years I’ve been working to incorporate it into my own creative process. But I’ve been doing this out of a recognition that these things we call “writing” and “making art” are going to change because of these new technologies. In a world saturated with AI, will it still be possible to make a living as an artist or a writer? Yes, I believe it will, but at the same time, I believe that our conception of what it means to be an “artist” or a “writer” will almost certainly change. That’s why I’ve chosen to embrace these tools, rather than fight them—and why I think my fellow artists and writers should as well.
*At CONduit 2010 in Salt Lake City, Barbara Hambly was the guest of honor, and in her keynote address she talked about her struggles to pay her bills through writing. I assume that things haven’t changed much in the years since then, though I would be delighted to learn that I’m wrong.