Six months have passed since the viral “Ghibli trend” took over the internet. Last spring, countless people tasked ChatGPT with an unusual duty: alchemizing their pictures into AI-generated illustrations in the iconic style of Studio Ghibli, the Japanese animation powerhouse.
Unfortunately, this innocent-looking craze is far from harmless.
Besides overwhelming OpenAI’s infrastructure and melting its GPUs, squandering substantial energy in the process, the trend raised concerns about artistic integrity and copyright infringement.
For, in order to mimic Studio Ghibli’s whimsical art style (or any artist’s brushwork, for that matter), AI models analyze artists’ works en masse, such as frames from their movies. This is where the main issue lies: animation studios never consented to their artworks being used for algorithm training. In fact, most artists stand fiercely against the practice.
Moreover, when Hayao Miyazaki, co-founder of Studio Ghibli, was shown an AI-generated animation in 2016, he declared, frowning, that it was “an insult to life itself.”
An insult to life, and to hardworking artists. Charmed by its low costs and quick results, many companies are now leaning on AI to generate their logos, posters, illustrations, and even films, instead of hiring experienced designers.
Imagine for a second: you are a passionate artist, devoting your waking hours to honing your craft. Suddenly, you find yourself ousted from your position and supplanted by a linear-algebra-powered entity that can only synthesize what it already knows instead of creating something new. Depressing, right?
This plight doesn’t only concern visual artists: the AI tide has engulfed the music industry too. Many people dread a potential decline in demand for human-made compositions. And since AI models generate music from existing melodies, it is unclear who really owns these creations.
So, in addition to robbing artists of their jobs, the use of AI in art can also ignite legal conflicts over thievery or imitation of original work. As you can see, there’s loads of potential for scandal here.
Speaking of scandals, a deluge of self-published books containing unedited AI output has been reported recently. The one that received the most attention was the second volume of a fantasy series titled “Darkhollow Academy.” Inside, readers unearthed the following line nestled between two paragraphs: “I’ve rewritten the passage to align more with J. Bree’s style.” Well, thank you, Chat.
Jokes aside, this is actually very concerning. The fact that some people use ChatGPT to copy other authors’ styles is a serious threat to intellectual property in literature. It reveals that plagiarism has become commonplace and that little is done to protect established writers from being stripped of recognition for the literary voice they have spent decades refining.
But AI isn’t solely used to rewrite passages in someone else’s style.
Several online platforms, like Amazon, are flooded with books entirely generated by AI. Consider this: people are reaping substantial profits by selling books they didn’t even write! To make matters worse, AI models are trained on existing titles, and authors have no way to contest this. Their writing feeds the monster that menaces them.
Nonetheless, let’s not swing to extremes. This article is in no way meant to be a bitter diatribe against artificial intelligence as a whole. Assistive AI (as opposed to generative AI, which creates content from scratch) is a revolutionary tool that, when used correctly, can increase our efficiency. Many publishing houses, for instance, are already using assistive AI for marketing (especially content creation and market research) and for editing, through software like Grammarly and ProWritingAid. These programs make editors’ work easier and quicker. Assistive AI can also be a lifesaver for indie writers who cannot afford the cost of working with a traditional publishing house. It enables them to correct, perfect, self-publish, and sell their work despite financial difficulties.
So… is AI a danger to artists? The answer is yes, if we are talking about generative AI. However, when used as a tool, AI facilitates artists’ tasks and permits them to achieve more in less time. Therefore, it should be regulated so that it remains a tool and nothing more. A legal framework must also be established to empower artists who refuse the exploitation of their work for algorithm training. And finally, governments should limit individual use of AI tools to curb ecological damage and prevent profligate energy consumption.