The news site of Santa Barbara City College.

The Channels

The effects of artificial intelligence access among City College students

Cebelli Pfeifer
According to Pew Research, one in five U.S. teens who have heard of ChatGPT have used it for schoolwork. This amounts to 13% of all U.S. teens who have used artificial intelligence in school.

Rapid advancements in technology have sparked continuous discourse about integrating artificial intelligence (AI) into the ever-evolving landscape of education.

At the heart of this discussion lies questions about when AI is ethical to use and when it is not. What distinguishes using AI as a helpful tool that teachers and students should embrace from a tool that undermines creativity and intellect?

On the City College website, one can find an explanation of ChatGPT, one of the most widely used AI tools in educational settings. There is also information on how to use AI detection software, broad warnings on why students should or should not use AI, and other guidance regarding the use of AI in the classroom.

Whether composing a ten-page paper on the civil rights movement, writing a speech built on personal anecdotes or solving calculus equations, it remains evident that the treatment of AI differs from course to course. Faculty and staff are currently grappling with establishing a concrete idea of when AI is encouraged and when it is prohibited in their curricula.

“It’s a really hard balance for teachers right now,” said Elizabeth Imhof, the dean of English, fine arts (humanities), and social science. “We really have to strike [a] careful balance that supports our students to think for themselves, use their minds to discern, and at the same time, use it as a tool. And I think that there really is a danger in it.”

This “danger” Imhof refers to could mean a multitude of things. When it comes to AI usage in educational settings, there are concerns about students not developing the necessary skills they need to succeed independently, such as critical thinking and creativity skills. One of the main concerns, however, is undermining academic integrity.

“So much of the faculty’s effort at the moment is about trying to curb academic dishonesty,” Psychology Department Chair Joshua Ramirez said. “Right now, it’s the wild, wild west that we’re dealing with. But can we just have a policy that says under no circumstances? No.”

While acknowledging that AI systems are not flawless, Ramirez emphasized the importance of embracing AI and equipping students with the necessary skills that could come in handy as AI develops. 

“Students deserve to know how to use this properly,” Ramirez said. “The AIs aren’t perfect. But it can be very helpful. They can be a tool for democratizing education.”

The benefits made possible by AI usage include fast-paced language translation accessibility, personalized learning, and access to resources. These attributes contribute to the discussion around democratizing education by providing high-quality learning resources in the classroom that anyone can access, regardless of one’s background.

“For that student for whom English is not their primary language,” Ramirez said. “Or for that student that doesn’t know how to start off an essay, AI can be helpful in being able to model what writing should look like or what English should sound like.

“At the same time, we also have to teach students that AIs are a start; they don’t know how to think like we do.”

However, it remains to be determined when AI should not be encouraged for use in the classroom. Since the rise of ChatGPT, AI detection software has become popular among educators seeking to combat students using AI to cheat. This software can create problems of its own, such as inaccurate results and false plagiarism flags that can leave students wrongly accused of cheating.

“AIs are not going away, and detection is unreliable,” Ramirez said. “We cannot use that as the test for whether or not students are being academically dishonest or not.”

Whereas plagiarism is concretely known to be a form of cheating, students and teachers struggle to identify the boundary between taking advantage of AI to cheat and using it ethically.

“I do believe some of the time, students very clearly, for many different reasons, are choosing to violate that boundary,” Imhof said. “But I also believe that many students really don’t understand the boundary because it’s confusing.”

The distinction between cheating and legitimate academic practice when using AI is blurred, and in the absence of clear guidelines, students are often left to determine for themselves whether their AI usage is ethical.

“As a student, I think it has the potential to be a helpful tool,” student Paul Dodson said. “We just need to better understand how to use it well and not cheat.”

Whether educational settings choose to embrace or prohibit AI in the classroom, providing clear guidelines on its appropriate implementation may be the best solution for maximizing its benefits, such as democratizing education.

“I think it’s here to stay,” Imhof said. “We need to learn to embrace it and use it to help support learning in the education of our students. I encourage faculty to use it and come up with really strong, solid guidelines about what that means.”
