
To what extent will AI impact arts, humanities

SOFIA LIASHCHEVA | STAFF


APRIL 10, 2023

Editor’s Note: This interview has been edited for length and clarity.

Artificial intelligence, or AI, channels human desires and imagination into creative output. This troubles creatives — writers, musicians, artists and the like — as the technology imposes new conventions on how they practice their craft.

In recent months, creatives have grown increasingly concerned that AI is changing the industry and threatening their job security, to the point that creative roles might become obsolete. Many feel robbed of opportunities as AI generates illustrations on their behalf, and people who use AI to write literature and business communications also pose a threat to those who write for a living.

The Daily Californian sat down with Coye Cheshire, a sociological social psychologist and professor at the UC Berkeley School of Information. Cheshire holds a doctorate in sociology from Stanford University and studies technology-mediated interactions and human behavior in social networks. During our conversation, he offered his perspective on our current relationship with AI and information technology.

The Daily Californian: What can you say about people’s current relationship with technology and our dependence on it?

Coye Cheshire: In our program, we talk about information technologies. Today we think of mobile devices, self-driving cars and AI. But the term also includes things that go back a long time: bicycles, the organization of how we sit in a classroom, and the arrangement of human beings in different ways and configurations. These are all technologies that we create and use.

So when thinking about our connection to technology, whether it’s artificial intelligence or the printing press, they’re part of the same set of issues about how people adapt and adopt.

DC: To what extent is it problematic to use technology to mediate human activity or interaction? Where do we draw the line?

CC: I think of technologies as a wide swath of things that go back hundreds of years — thousands of years even — when even cave paintings would be considered information technology for sharing information. 

Any way that people communicate, write down or transmit information, even if it’s just through sign language or whether it’s through a painting or a computer program, we use tools to convey information to other people. We also use tools to convey information to ourselves at a later point in time, like we use our electronic calendars or paper calendars to remind us of things. So those are all technologies. 

That relationship, for me, isn’t problematic by definition. Problems might come up, or we might identify problems that occur with different technologies.

DC: How would you say technology intersects with the arts and humanities?

CC: Other fields like photography and music have already been dealing with developments in artificial intelligence for quite some time. The way I’m answering this is by looking at examples where this has already been taking place: AI has existed for years in areas of photography, such as photo editing. AI tools like chatbots and ChatGPT are making all kinds of headlines because people are like, “Oh my gosh, how are people going to write anymore if the AI is writing for us?”

Or, say, you’re a historian. Some might say, “Oh well, we do not need historians anymore because chatbots will just write our history for us using all the same information.” It’s just like the photography example: the tools, even the AI tools, are not replacing all of the skill sets that are required. What they’re doing is they’re simplifying. They’re yet another set of tools. 


DC: How does the development of these technologies affect how we appreciate art?

CC: I would draw on a similar example, like music tools. I play music as a side hobby. I’m not worried that I will never be able to write my own song ever again because I can do these things [with AI]. What I look at is the ability to use AI to spur creativity, and help me think of connections between things in a way I haven’t before.

Right now, a lot of the fear I’ve heard personally about ChatGPT and other newly popular tools is that students won’t learn how to write anymore, that people won’t write term papers anymore, or that people who write for a living won’t have a job anymore. I don’t think that’s the case. I understand the knee-jerk reaction, but that fear is very common across the examples we gave earlier, where many fields, businesses and arts thought they were going to be replaced by technologies when, in fact, people just learned to use the tools in creative new ways.

DC: Are there any ethical considerations people should be aware of when using AI technology?

CC: If you’re working on something, you would always acknowledge if you got an idea from somebody else.

One of my fears is that people who integrate AI tools think they don’t need to acknowledge that. So, for example, say you’re writing a paper for a publication. You’re on a deadline, and you use ChatGPT to write a couple of paragraphs for your paper, and you [think,] “No one will know. It’s not cheating; I wrote most of the paper. ChatGPT just filled up some of it.”

That’s not what I think of as ethically sound, because people believe that the name on the paper belongs to the person who actually wrote it. In that case, the author isn’t actually writing the ideas and isn’t making the connections; they’re relying on something else. If it was another person writing those paragraphs, ethically we would put them as a co-author on the paper, right?

DC: Talking about AI is very intersectional, touching on the humanities, ethics and the appreciation of art. Now, I understand that you’re also a sociologist, so from a cultural standpoint, how does the high culture versus popular culture debate apply to AI-generated arts and literature?

CC: It’s a valid concern. Some might argue, “Oh my God, we don’t need musicians anymore. There will be a glut of AI-generated photography, AI-generated videos, and AI-generated music.”

Here’s a thought experiment we could go on, though we may not enjoy it. Since we’re the ones who interpret what we like in videos, pictures, movies or whatever, there are different things we’re looking for.

We’re not just looking for what’s a pretty picture or what’s a song that sounds good. In that moment, we might want to hear an artist and acknowledge their proficiency at an instrument. One could appreciate an AI-generated song, but then say, “Oh, there’s no real band there, so no one can play this song, or no one has played it yet. Maybe some people could learn how to play it, and that would be cool.” But all of those are just new configurations of how a tool can be used.


We do things in our brains, and we don’t even know how we connect certain dots. So the fact that we can teach computers, such as AI systems, to help us connect things in ways we’ve never thought of before is exciting. I wouldn’t tell an artist that they’re going to lose their job or that people aren’t going to value what they do.

I do think there’s a legitimate concern about how it can become hard to determine what has real social, cultural, historical and human meaning and what doesn’t. But I also trust people to come up with new ways to convey that information and distinguish what’s human and what was enhanced with the help of AI tools.

DC: I think this concludes our interview. Thank you very much for your time this afternoon.

Contact Chris Ceguerra at 
