
At the newspaper where I worked in the 1980s, there was a story about the longtime reporter who came in to work the day after computer terminals were installed in the newsroom. He went to his desk, where, in the spot where his typewriter had sat for decades, a giant new monitor with a blinking green cursor awaited.
He looked at the screen, picked up his bag, and said “I’m done,” never to return.
This could be the story of an older person failing to adapt to new technology. But it’s also a cautionary tale about how the creative process works and the tools we use. If the way you’ve always worked depends on seeing letters on a page, the liberal use of whiteout, and an end product you can hold in your hands, the ephemerality of digital dots on a screen is at best a speedbump that pushes you to evolve. At worst it neuters your ability to create.
Throughout the digital age, Big Tech has promised us products that will make us more efficient and save time, which, it is assumed, is always an obvious good. It’s a cliché that tools shape the things we make. And through most of our history, better tools have helped us create better things.
But what if this isn’t always true? Is publishing books better because getting words in front of people has never been easier? Have our communication skills improved because social media makes it easy for anyone to share the tiniest brain fart? Is music better because 50,000 new AI-generated songs are uploaded to streaming services every day?
Then there are headlines like this:
“Cheat” is an interesting word. Am I cheating when I use GPS to find my restaurant rather than reading a map? Is it cheating to use a calculator when you fill out your tax return? Teachers who say that using AI to do assignments is cheating argue that the point of an assignment or test is to teach students how to think or to develop skills for solving problems. Attaining those skills is the whole point, they say.
But I don’t need to understand how a car works to enjoy driving one. I think I know generally how computers work, but I don’t particularly need to understand machine language. A television director or actor or producer doesn’t need to know how photons make moving images, simply that the camera works. And besides, all those kids we told to learn to code to future-proof their careers? By some accounts, AI now writes 80 percent or more of all code and is far better at it; so there’s a job you probably don’t want to train for.
I’ve been playing with vibe-coding. Vibe-coding is describing to an AI the app or program you’d like to build and having the AI simply create it. Instead of asking ChatGPT to make a PowerPoint slide or answer a specific question, try asking it to build you a program that will not only make what you need now but will let you accomplish the same thing over and over again as a bot or app (teach a man to fish…).
A step beyond that is the world of agentic AI, in which you don’t even request a program. You simply describe what you’re looking for, and AI agents get to work: determining the expertise needed, spinning up their own agents and bots, and collaborating to best solve the challenge.
Suddenly my imagination is full of ideas about what I could build that was never possible before for lack of resources. A whole new ArtsJournal, perhaps, individualized to every visitor. A digital twin into which I can load every piece of data, every article, video, PDF, spreadsheet, and picture that pertains to my project; then, instead of having to search for information, I can just ask the AI to run analyses and deliver insights that are constantly updated.
With each more advanced step, we get further away from the need to master the technical knowledge that used to be required to make things. No need to spend years learning how to orchestrate your brilliant music compositions when an AI can do it better. No need to spend tedious hours learning design principles when the AI can generate your ideas on the fly. No longer any need for a movie director to spend months (and millions of dollars) in post-production painstakingly bringing their imagination to life when an AI holodeck will, at the click of a command, set actors on the side of a mountain or in a scenic vineyard even while you’re shooting the scene.
You could see this as a democratization of creativity. New platforms and tools open up participation to the masses. YouTube created more videographers than any film school ever could, in the process changing the aesthetic of video for everyone, maker and audience alike. And AI notionally offers the promise of empowering the creativity locked away in those who are blocked from realizing their imaginations because of lack of skills.
Is this cheating? Many artists think so, as if not having invested in acquiring traditional skills cheapens the art. I think it changes what’s possible, good and bad.
A hundred years ago, traveling from Seattle to New York took weeks of sometimes arduous travel. Now you make the trip in a few hours. What a time-saver! And yet, you miss all the scenery between here and there. You don’t have the experiences you would have had, nor do you necessarily appreciate the physical distance to your destination. Then there are the environmental costs of flying. And the actual costs, which previously restricted travel to those who could afford it. Surely those who made the difficult trip were changed by the experience in ways that we who now fly on a whim are not.
The fast-food restaurant explosion in the 1950s and 60s brought dependable, cheap food to the masses, and made it popularly desirable. But after a time, critics noticed that the food wasn’t very nutritious, didn’t taste all that good, and the experience was generic. Thus was born a backlash in the Slow Food movement, in which the culture of food was reasserted, the provenance of ingredients celebrated, and the experience of eating elevated. There’s a place for both.
There’s no question we lose something when tools make what was previously special commonplace. But if the extraordinary becomes generic, then extraordinary has to be, by definition, redefined. And that usually (but not always) means up.
But back to learning. If students using AI — a tool now available to everyone, about as ubiquitous as GPS — is considered cheating, then the bar for learning how to think or reason has to be higher. Why wouldn’t a student use AI if it’s the fastest way to the answer? Critics have been complaining for years that schools turn out students ill-equipped for the world. So if an average person now has the tools to make a movie, design a house, or create a company, what are the new essential skills a student needs to thrive?
We could apply the same question to artists. If AI makes something better, then why wouldn’t an artist use it? Many already are. And most of us won’t be able to tell. But we also have to try to understand what’s gained and what’s lost as one set of skills is superseded by another, because skills change the art. And, like the reporter who couldn’t make the creative turn, some will be left behind.
The emergence of new tools doesn’t make previous tools illegal to use for artistic creation, though new tools may radically change how commercial art is produced. There are modern photographers, for example, using one of photography’s earliest processes, the daguerreotype, to create new art today; check out Binh Danh and Jerry Spagnoli, for starters. Informed by what has come since, they use the old process for purposes probably not imagined by Louis Daguerre. AI will change how artists make art, and probably change what they do with the old tools, too.
There is no pushback that would make sense. “Cheating” is, of course, a relative term — it means different things to different people. Ask the president; I’m sure he believes that cheating for his benefit isn’t cheating at all. Taking it a furlong further (as he always does), he believes that anything that reverses his kakistocratic monarchy is wrong, and therefore cheating. And people believe him.
AI is here. It’s not going anywhere. Some may view it as a Pandora’s box that has already been opened. To them I can only say: if there had been a moment when social media could have been stopped (knowing what we know now about it), would it have been stopped?
Probably not. AI may not play a major role in my world, but it exists and it has support. The only question I’d have for those who govern education is this: would you want a doctor who earned their license by using AI on all their tests in medical school operating on you? Or is that too hyperbolic?