Faculty Panel Explores Artificial Intelligence’s Role in the Classroom

Honors Student Government and the Artificial Intelligence Society co-host panel to discuss implications of AI in education.

Honors Student Government and the Artificial Intelligence Society hosted a faculty panel about artificial intelligence (AI) and academia Oct. 14, featuring six professors from the College of Arts and Sciences and the Quinlan School of Business.

The discussion began with a definition of artificial intelligence as computer systems that can perform tasks normally requiring human intelligence. The panelists also discussed Loyola's AI policy.

Fourth-year political science and psychology major and president of Honors Student Government Marco Alvarado pitched the idea to the administration.

“AI is rapidly evolving, and I’ve noticed academia is still trying to catch up,” Alvarado said. “That’s what inspired me to create this panel, to have a diverse range of perspectives on AI.”

Fourth-year neuroscience and mathematics major Avery Boland said conversations around AI may be difficult, but that doesn't mean people are incapable of having them.

Boland said that to keep disagreements from devolving into harmful conversation, people should rely on respectfulness, good communication skills and knowing how to approach disagreement.

Associate Director of the Writing Program and English professor Julie Chamberlin said she advocates against an “abstinence-only” approach to AI discussion. 

“When you tell people not to use it, they just end up using it the wrong way,” Chamberlin said. “I’d rather empower students with the knowledge to understand where AI falls short of human creativity and intelligence.”

Fourth-year information systems and analytics major Jillian Rossman is the president of the AI Society. She said in the student organization, she and her fellow officers emphasize nuance when talking about AI.

“Having nuanced discussions is way more important than just saying ‘AI sucks,’” Rossman said. “Obviously, we have a slant towards AI, but we also have a lot of discussions about bias, environmental impact and ethics.” 

Alvarado focused his questions in the panel around how different fields can improve AI usage and how students should use it going forward.

Computer science professor Leo Irakliotis said universities must recognize they’re very conservative when it comes to technology. 

“Institutions are usually one, two, even three generations behind their students in adopting new technology,” Irakliotis said. “Sometimes faculty can be even further behind than the institution itself.” 

Boland said many professors don't see the benefits of AI and tend to take a cautious approach.

“It can be an extremely valuable educational tool, especially beyond college,” Boland said. “A lot of public-school kids have huge class sizes and don’t get individual help.” 

Boland said that when tutoring costs money most families don’t have, ChatGPT is a powerful tool. 

If kids need to hear something in a different way than teachers can offer, ChatGPT can reiterate the message in a phrasing that makes more sense, Boland said.

Rossman said when people talk about AI, it's important to think of communities outside of the U.S.

“There have been multiple studies showing countries in Africa where students jumped multiple grade levels because of AI,” Rossman said. “For people who don’t have basic access to education, AI can make a huge difference.”

First-year political science major Tess Tchorbadjiev said while she does use AI at times, she believes students need to keep relying on their own thinking.

“If we let AI think for us, we lose that independence and those critical thinking skills that are already starting to fade,” Tchorbadjiev said. “The more we depend on AI instead of our own brains, the less capable we become.”

Irakliotis said his biggest fear about AI is people ignoring the issue until it makes them irrelevant. 

Irakliotis said Gutenberg's invention of the printing press made monasteries irrelevant, despite their monopolization of knowledge just months before. He said there's still a window before irrelevance to advocate for future generations.

Data centers use water to dissipate heat and can rely on upwards of five million gallons of water a day, The Associated Press reported.

“These models can’t do everything people claim they can, and that has consequences — environmental, ethical and practical,” Chamberlin said. “The energy they consume isn’t sustainable.”

Tchorbadjiev said to solve the issue of environmental harms, there needs to be a focus on sustainability. 

While it's possible to reduce AI's negative environmental impact, doing so is costly and demands major investment, meaning the potential benefits aren't always worth the cost, according to Tchorbadjiev.

Chamberlin said there needs to be more collaboration between the tech industry and educators. 

“If there had been more collaboration early on, we could have created tools that supported learning rather than disrupted it,” Chamberlin said. “The ship has sailed — AI is here, and we need to deal with it.”
