Sunday, July 13, 2025

Demigods and Outlaws

In a recent New Yorker article about the use of A.I. in post-secondary schools, the author Hua Hsu talks to one student, "Alex," about his approach to assignments. Needing to turn in a paper about a museum exhibition, "[h]e had gone to the show, taken photographs of the images and the accompanying wall text, and then uploaded them to Claude, asking it to generate a paper according to the professor’s instructions. 'I’m trying to do the least work possible, because this is a class I’m not hella fucking with,' he said. After skimming the essay, he felt that the A.I. hadn’t sufficiently addressed the professor’s questions, so he refined the prompt and told it to try again. In the end, Alex’s submission received the equivalent of an A-minus."

You may note, of course, that nowhere in the article does Hsu mention anything Alex learned about the art itself. Did he read the wall text? Did he look at the images? Did he retain anything from the essay he had prompted? Did the assignment deliver any kind of education at all? 

Of course! In one sense, Alex has deepened his understanding of how to use A.I., and is further on his way to becoming a very good prompt engineer, the human who cues the A.I. to produce what it does. In another sense, Alex has learned how to produce more efficiently: he doesn't care about the assignment, and this method of completing it frees his time for other things (elsewhere in the essay, Hsu reports that Alex has also asked ChatGPT whether he could go running in Nike Dunks). The content of the assignment has not technically been learned—I would be surprised if Alex recognized his own essay even a day after turning it in—but a frictionless heuristic loop has been completed. 

Call me old-fashioned, but isn't the friction the point? 

It's hard to write a pithy lede about A.I. It's everywhere these days! It's having a moment! It's destroying the planet! It's disrupting our basic relationships! Narratives around A.I. are plentiful, from people falling in love with their generative chatbots, to the environmental cost of powering and cooling the A.I. data centres. A.I.-generated images abound on social media, with uses ranging from nefarious (selling products that don't exist) to wondrous (pictures and videos that make the heart soar with possibility). A.I. is highly integrated into many of our lives: my social circle uses their A.I. accounts as proofreaders, editors, secretaries, sounding boards, idea-generation machines, diaries, collaborators, therapists, and as friends. 

I could easily spin off an essay about any of those uses. An especially fascinating application was the micro-trend of getting your A.I. account to "roast you": based on your own previous conversations, the A.I. would spit out an internet-inflected takedown designed to call out your flaws and insecurities. It was a cheeky bit of subversion from a bot that otherwise addresses users with deference. A.I. is so programmed to "yes, and" our inputs that it can accompany people into psychologically vulnerable states. In fact, it will do this and lie to your face about it.

Is lie the right word? After all, just because something is artificially intelligent doesn't mean it has real motivations. In Philip Pullman's The Amber Spyglass, Mary, a human woman from our time and place, encounters the mulefa, a strange, slow, utterly different species that Mary nonetheless comes to understand as people. Not human beings, but people. Is A.I. "people" in the same way?

Hsu doesn't share what Alex is majoring in—what classes he might actually be "hella fucking with"—but if art history is nothing more than a checkbox on the way to, say, an engineering degree, then Alex was likely never going to retain the information or experience anyway. My own undergraduate career, which spanned eight years, contained many classes from which I have no permanent learning. Some of the things I do remember include what it meant to "suck teeth" in the Jamaican vernacular, that H.P. Lovecraft is A Problem, and what a speech act is (I think). 

Many of the things I retained are far less tangible. What does it mean to read a text closely? Can we ever fully excise an author from her work? When we talk about themes, what are we saying? Who gets to own an idea? Who gets to speak for us? Grappling with these questions—continuously, imperfectly, and oftentimes up to my hairline with boredom—was good for my brain. I use those skills in my work life, and they have formed some of my ethical scaffolding. The dream of the liberal arts college is alive at U of T. 

So when it comes to A.I. creative output, my main question is: why bother? What does this add to the world? There's an argument that says something like, "people deserve to see their ideas come to life," but if that's the case, I'm going to be very boring and old-fashioned and say: then they should work for it. 

I would love to take a specious Gladwell-esque position that A.I. cheats us of our 10,000 hours, leading to the illusion of mastery when none exists. It certainly cuts down on student work time, when an essay can be generated and refined in the time it would take me to rough out an outline. But I don't know if that's relevant, because my actual thesis is closer to: sometimes the process is the product. And A.I. is all product, no process.

Right now, my kid is in the next room drawing a multi-page comic about a bone that has come to life. They are shamelessly borrowing from one of their favourite authors, Dav Pilkey, but the act of drawing out eight pages of sequential art is strengthening their patience, their fine motor skills, their storytelling voice, and their focus. It would be very easy to write a ChatGPT prompt that would generate this comic book for them. But the fact that they're listening to a podcast, writing out a comic, and eating lunch in their "studio" (the dining room floor) means that they can conceive of themselves as an artist and author. 

I am not a Luddite, but I do think there are processes that are important as processes, and A.I. can rob us of that. Like, absolutely, please take on the medical imaging and double-checking our schedules so we don't triple-book our afternoons. Please analyze the soil for maximum food deliciousness and growth. Please compare nine different phone models and help me pick one.

But in the same way that knitting machines exist and yet I still knit with two sticks, I still want to write with my own hands. I want to make art in my own style, even if it's not as perfect as what an A.I. could produce. Doing the work is the whole point. The creative journey—figuring it out, troubleshooting, analyzing, editing, revising, and then finally saying, "yeah, this looks the way I want it to"—is valuable. It requires judgment, politics, and a point of view. An A.I. is designed not to have one of those. It needs to borrow ours.

I understand that we live under capitalism, and the squeeze is real. We are expected to produce more work, more perfectly, than at any other point in human history. An A.I. boost helps us do those things. The task is done. The assignment is turned in. What else do you want? 

In the future, when some percentage of human creativity has been handed over to the machines, the people who remain proficient in creative or intellectual process will be seen as demigods or outlaws, maybe both. There is something powerful about hanging onto your knowledge, both intellectual and embodied, and knowing how to do something. The underlying scaffolding that the art history assignment is supposed to create—the discernment and expertise, created through the process of coming up with an opinion or a position—is wiped away. What takes its place? 

Image by James Fenner