
Everywhere we look, we’re now confronted with AI, or at least claims of AI. Your Word document offers to draft something for you. Your Gmail promises to better organize your inbox. Your Reels and TikToks are flooded with ludicrous AI video. AI promises to change everything, even if you don’t want it to.
And there is probably no one in the creative industries who isn’t focused on the threats AI poses to artists and organizations. For the past eight months I’ve been working on a research project for the US Regional Arts Organizations to track and map issues around creativity and the adoption of AI. The research will come out sometime in the spring, but I thought I’d start to share some preliminary thinking, and I’ll follow up regularly over the next several months with more insights.
The first is that the pace of development and the issues arising from the adoption of AI are moving so fast that tracking them is problematic. It’s far from a linear process. Discussions of issues are quickly superseded by conceptual shifts as new AI capabilities disrupt them. For example, notions of ownership of creative work, ideas, and artistic identity are muddied when the technology rapidly outpaces attempts to define the issues, or even what’s at stake.
That’s because AI is arriving in the arts with the force of a structural change, not a new toolset. It promises new forms of expression, but it also threatens the foundations of creative life: ownership, attribution, labor, compensation, cultural equity, and trust. While all industries are racing to understand and adopt AI, in the creative sector those most directly affected—artists—are often absent from the conversations where decisions and policy are being made. Running an arts organization, or really any creative enterprise, takes so much effort that few have time to focus on what seem like such enormous, abstract issues. Meanwhile, artists watch their work absorbed into datasets, their livelihoods threatened, and their communities flooded with synthetic content that mimics style without being informed by intent.
So this research project attempts to understand the implications of AI on creative industries and artists not simply as technology to “use” or regulate (or cool toys to chase after and be distracted by), but as a reordering of how creative value is produced, circulated, and protected. The aim is to try to cut through the noise and name the underlying stakes, so that participants in creative industries can better understand and help shape the emerging systems rather than be subsumed by them.
That said, we zoomed out from chasing individual issues and reconceived the project as a framework of values that locates issues in three interconnected spheres:
1. Systems & Value
AI is reorganizing the economics of creativity. It threatens traditional revenue models, accelerates automation of creative tasks, and concentrates power in large platforms that rely on uncredited human work. At the same time, it opens possibilities for new value chains, new forms of attribution, and artist-led economic models. We are attempting to map where artists are at risk, where leverage exists, and where new frameworks could emerge.
2. Tools, Skills & Practice
AI reshapes the studio itself. It blurs lines between skill and automation, raises questions about authorship, and changes how performers, writers, designers, and makers collaborate with machines. Some aspects of craft may be devalued; others grow in importance—taste, context, curation, storytelling, embodied knowledge. We want to track these changes from inside artistic practice, grounding policy conversations in creative reality.
3. Culture, Identity & Public Trust
AI makes it trivial to fabricate images, voices, personas, and histories. This creates opportunities for imaginative world-building—but also risks erosion of trust, identity distortion, and cultural homogenization driven by systems trained on unbalanced or unethically gathered data. We are examining how to navigate issues of provenance, authenticity, equity, and the cultural narratives that shape collective meaning.
The digital revolution of the past 25 years holds important lessons about what happens when artists are not at the core of working through transformational issues. In the absence of meaningful power, Big Tech and its platforms write their own rules for culture, inserting themselves in the middle, enacting tolls, and determining how culture is distributed and seen and how artists are valued and compensated for their work. That hasn’t worked out very well.
At the same time, lawmakers show little sophistication in understanding these technologies, let alone in proposing or enacting legislation and rules that both protect people and allow for innovation, as witnessed by their failure to enact rules for social media platforms.
The threats are real and immediate: extraction without consent, job displacement, ownership confusion, and systems optimized for scale over nuance. An added complication is that we begin from an imperfect place. The systems of ownership, equity, and value we currently have are far from perfect and are hard to defend. Copyright, for example, one of the principal creative protections, has been infuriatingly broken for decades. So getting to a better system while the current imperfect one shifts beneath us is a challenge.
But perhaps if we can do a better job of defining what’s really at stake, we’ll have a better chance of a better creative environment.
