Image by Janus Rose, generated with Stable Diffusion
To prepare for the 2023 spring semester, New York University professor Winnie Song did something she’s never had to do before: she created AI art guidelines for her students.
Song, an assistant arts professor in the Game Center at NYU’s Tisch School of the Arts, is not the only art instructor thinking about this. With the rapid rise of automated systems like Stable Diffusion, Midjourney, and DALL-E 2 within the past year, instructors at post-secondary art institutions are trying to figure out how to broach the topic with their students while still learning the intricacies of AI art themselves.
“My worry was that they would use the AI generators to come up with mood boards and references of things that don’t exist in real life. So I just set a policy where, within the bounds of this class, it’s discouraged to use the generators,” Song told Motherboard. “I really didn’t ever imagine that it would get to this point where people would be, like, trying to legitimize it as a craft.”
AI-generated art has flooded the internet since users began generating elaborate images with just a written phrase or highly stylized portraits by uploading a selfie. The tools have been met with fierce backlash from many artists, who note that the AI systems produce derivative images after ingesting millions of original artworks without permission from their creators.
But while the growing sophistication of AI generators is raising profound questions about the nature of art and the creative process, it is also creating very tangible dilemmas for art educators who want their students to develop skills that go beyond typing a phrase into a text prompt and turning it in as their own work.
“I think we endeavor to teach them to become independent of tools and also make sure that they remain sort of agnostic, not reverent and dependent on one thing to get presentable work,” Song said. “You can learn this, and you can think about it, but that can’t be your one main thing to get to where you need to be.”
The way professors introduce AI art in the classroom varies between classes and disciplines. Song said she’s teaching a drawing class in which students are supposed to derive inspiration from nature and the physical world, hence her AI art policy. On the other hand, Kurt Ralske, a digital media professor and department chair of media arts at Tufts University’s School of the Museum of Fine Arts, is taking a different approach.
“Personally, I’ve been encouraging students to explore this. I think they should know what the tools are, what they’re capable of and maybe develop a personal vocabulary of how to use them,” Ralske told Motherboard. “But we really are overdue for actually maybe having a larger discussion within the university of how we should handle these things.”
Doug Rosman, a lecturer in the Art and Technology Studies department at the School of the Art Institute of Chicago, is also having students explore the generators in his machine learning class. But, in his professional practice class, a more career-focused course, AI art and its impact on working artists is a different discussion.
“In that context, the outputs of DALL-E and Stable Diffusion feel more threatening,” Rosman told Motherboard.
Instructors aren’t the only ones thinking about the products of AI art generators. Art students are also grappling with AI art saturating the market and what that could mean for their careers.
“The way that artists are embracing crazy capitalist, hyper-technology culture is just really disheartening,” said Marla Chinbat, an art student at the University of Illinois-Chicago. “I wouldn’t be surprised if AI art actually begins to hold merit because of a side of the art world that I don’t align myself with.”
None of the instructors or students at the institutions interviewed by Motherboard said their department or school had issued AI art guidelines or a policy for using AI art generators for projects. Charlotte Belland, a professor and chair of the animation program at the Columbus College of Art & Design, said setting parameters is left to individual instructors depending on the topics and concepts being taught in class.
“As long as they establish what their parameters are, then that’s an open forum to be able to either use or not use AI technology,” Belland told Motherboard.
However, learning how these programs work and how to help students use them takes time and effort on the part of the instructor. For an instructor not already familiar with machine learning or computer science, navigating the ways AI art generators are shaking up the art world and understanding the underlying algorithms could take extra work.
“Teaching is hard. It’s so much work and it’s not well compensated,” Rosman said. “It’s not fair that a small demographic of people in Silicon Valley can just throw this thing out into the world, and we’ve got to just run around picking up the pieces.”
Even if their instructors have not brought up AI art in classes, students are still thinking about how AI art generators are affecting the art world. Susan Behreds Valenzuela, an art student at NYU Steinhardt, said the subject has come up only once in one of her classes, but she would be interested in further discussions in other classes.
“I do wish we had talked about it a little bit more,” she told Motherboard. “But at the same time, I think in order for that to happen, my professors would need to kind of know a little bit more about that type of technology, and I just think it’s not something they’re really focused on.”
Students are also thinking about how they could use these tools as part of their processes. Rhode Island School of Design painting student Julia Hames said they played around with the AI generator Wombo for inspiration.
“For a while, I didn’t have any ideas of what to paint, so I’d just put in random prompts into Wombo to see what it created,” Hames told Motherboard. “I didn’t really like anything, but maybe it could be used for that because the images are so absurd and it just lets you into this uncanny valley that honestly humans can’t even get to sometimes.”
Song, Ralske, Rosman, and Belland all said they have not had students use AI art generators for projects without their knowledge; when a student did use AI for a project, the use was apparent to the instructor. Belland said that if a student did try to use AI without an instructor’s consent, being in a community with diverse perspectives and skills would help catch it.
“The nice thing about an educational community is that you have so many eyes on a project,” she said. “Even when a student makes an unfortunate decision to copy something in just a very traditional method, plagiarism, it’s pretty easy to spot.”
As for Song, she is not too concerned about her current students passing off AI-generated images as their own, because she is already familiar with their work. She’s more worried about the students she hasn’t even had in class yet.
“In admissions, these new students are coming in from high school, from another life that we don’t know,” she said. “I think it could be possible for them to have created a portfolio out of thin air overnight using these generators, depending on how good they become.”