SHREVEPORT – Generative artificial intelligence and its impact on higher education.
These two worlds have been on a collision course since ChatGPT exploded in popularity in late 2022.
But LSU professor Dr. Andrew Schwarz, who visited LSUS as the keynote speaker for the university’s Campus Kickoff on Thursday, believes generative artificial intelligence and higher education don’t have to be at odds.
Schwarz, a professor in the Stephenson Department of Entrepreneurship and Information Systems, told assembled LSUS faculty and staff that AI can be a tool to enhance learning and research by students and faculty.
“I asked ChatGPT to create a bibliography on technology adoption, and I wanted to see if it could point me to the seminal works on this topic,” Schwarz said. “It did produce some seminal works, but it also pointed me toward works that I had never heard of, some that happened to be really good.
“ChatGPT pointed me in directions of research that I hadn’t thought about before.”
One main pitfall of generative AI in the classroom is students submitting work that isn’t their own.
But Schwarz is using AI prompts as part of his curriculum, shifting away from a traditional model of writing research papers outside of the classroom.
“We’re having students generate an AI prompt, then evaluate what AI is producing,” Schwarz said. “We have them point out where the AI was wrong.
“Having reflective writing assignments in class is one way to incorporate writing and AI in your curriculum. There are ways to use it as an exercise or a tool.”
LSUS has a committee dedicated to developing best practices around AI use.
Some LSUS instructors use AI regularly in their classrooms, including history professors who, like Schwarz, build AI evaluation into assignments as a way to discourage students from turning in AI-generated work.
These assignments require students to have command of the subject while also thinking critically about and analyzing it, which isn’t a strength of generative AI.
AI is a hot topic in the art world, and digital arts professor Jason Mackowiak had students use AI to generate scripts in a video editing class.
“With this being a digital video editing class, I wanted my students to focus on editing skills specifically instead of spending time writing a script or doing a voiceover,” Mackowiak said in a December interview. “We developed a prompt with a few descriptors, and we ran those prompts multiple times and chose the best script for each project.
“We removed parts of the script that weren’t actual dialogue and then used AI software to generate voiceovers.”
Schwarz said he understands the balancing act faculty must consider when thinking about AI.
“Some may be OK with using AI to brainstorm or to structure tasks,” Schwarz said. “Whatever is decided for each specific class or department, setting boundaries in the syllabus about AI use and being transparent about when and how AI is used seems to be a fair practice.”
But Schwarz, who called himself a techno-optimist, knows there are downsides to AI use beyond student plagiarism.
AI models are trained on different data sources, some of which include posts from social media sites like Reddit and X (formerly Twitter). Some AI models also don’t disclose their data sources, which can lead to bias and inaccuracy.
“The technology itself is neutral, it’s how it’s used that is the issue,” Schwarz said. “We need to think about diversity, equity, inclusion, and bias in relation to what data sets a model is trained on, especially models that don’t declare their data sets.
“But doing things that universities already love to do – things like research, education, outreach, collaboration, critical thinking – these are things that can help universities become more resilient in a generative AI world.”