Throughout history, music has embraced constructive change and innovation. And we will do so again as we confront the opportunities and risks of artificial intelligence.
Done right, AI should offer avenues for new growth and artistic accomplishment. When creators’ rights are respected, innovation thrives.
Already, music companies have unveiled compelling projects that use AI technologies in groundbreaking ways — with full consent and participation of the artists and rights holders involved. Working together with responsible AI companies, music companies are finding new ways to enhance production and marketing, gain new understandings from data and research, and improve wellness and health. They’ve used it to help identify new audiences for artists and pioneer new ways to celebrate iconic catalogs and performers. This is just the beginning of a new era of possibilities.
But many AI developers are resisting collaborative efforts by the creative sector to develop a responsible policy framework for AI, even though the elements of such a framework are straightforward and common-sense. In short, AI companies must honor:
- Authorization: only use copyrighted music when it is authorized (for example, through a license)
- Transparency: keep and disclose adequately detailed records of the content on which they train their systems
- Authenticity: prevent deepfakes, voice clones, and similar violations of individuals' rights in their own voice, image, name, likeness and identity
These foundational, consensus principles are detailed by the Human Artistry Campaign and supported by virtually the entire creative community. They set forth a baseline for responsible development and deployment of AI.
But as if on cue, some of the worst instincts of Big Tech have returned. Some AI developers claim it's "fair use" to scrape up protected music so it can be copied and repackaged by their models. That's just wrong.
Put bluntly, that’s digital theft.
In every legitimate market in the world, the use of others’ property requires the owner’s consent and agreed-upon compensation. Together, for example, music and technology have developed a burgeoning streaming market built on the common-sense principle that use of copyrighted creative works requires licensing and consent.
Indeed, the developers’ claim that they can use decades’ worth of iconic and extremely valuable recordings for AI without bothering to ask or pay the rightsholders is so far-fetched that former Stability AI developer Ed Newton-Rex quit his job in November rather than be party to an extreme effort to rip off artists and misappropriate their work, explaining via X:
“Companies worth billions of dollars are, without permission, training generative AI models on creators’ works, which are then being used to create new content that in many cases can compete with the original works. I don’t see how this can be acceptable[.]”
This is why transparency is essential. AI developers must keep accurate records of the copyrighted works used by their models and make them available to rights holders seeking to enforce their rights. We need rules requiring that developers maintain adequately detailed records and share this information — or bear the consequences if they fail to produce it. We were pleased to see that the European Union enshrined this as a core principle in its landmark AI Act.
AI policy must also establish clear rules protecting every performer’s right to their own voice, image, name and likeness — the most fundamental cornerstones of individual identity. AI fakes that mine an artist’s body of work to create artificial replicas and voice clones, fashion phony endorsements, or depict individuals in ways they haven’t consented to represent the worst kind of personal invasion. Congress needs to put an end to wrongful appropriation of the most central components of individual human identity.
These are the challenges of 2024.
Either we build a strong and sustainable foundation for music in the era of generative AI, one that moves art and technology forward together, or generative AI devolves into just another "move fast and break things" novelty that fails to deliver anything of value while eroding our culture.
These are the choices policymakers will face this coming year. Let’s work to help them forge the right path.
Mitch Glazier is chairman/CEO of the Recording Industry Association of America.