
Ask the Professor: Is ChatGPT an issue or a tool?

Instructional Designers Kathy Hanselman and Angie Chase discuss the place for AI in higher education

The inside of a computer showing circuit boards and wires


ChatGPT, which stands for Chat Generative Pre-trained Transformer, is a language-based chatbot: the user enters a prompt and the tool returns a detailed response. ChatGPT, and many other programs like it, uses artificial intelligence (AI) technology. AI is being used more and more frequently by both students and professors in colleges and universities.
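For readers curious what that "prompt in, detailed response out" exchange looks like in practice, below is a minimal sketch using OpenAI's Python library. The model name, the sample prompt, and the API key setup are illustrative assumptions rather than details from this article.

```python
# A minimal sketch of the prompt/response interaction described above,
# using OpenAI's Python SDK. Model and prompt are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; any available chat model works
    messages=[
        {
            "role": "user",
            "content": "Suggest an outline for an essay on AI in higher education.",
        }
    ],
)

# The chatbot returns a detailed text response to the prompt
print(response.choices[0].message.content)
```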

While some professors encourage the use of ChatGPT in their classrooms, many others do not and place a strict ban on using AI. Students have been incorporating AI into their coursework for myriad purposes, including getting past writer's block and getting input on how to structure an essay. More nefarious uses include copying and pasting responses from AI tools like ChatGPT verbatim, which could be considered plagiarism and academic dishonesty. Students and professors fall across a wide spectrum on whether or not AI has a place in higher education.

At the University of Nevada, Reno, Kathy Hanselman and Angie Chase are part of a team that has created micro-learning events for faculty, breaking down different questions and topics in 12 minutes. Hanselman and Chase are Instructional Designers with the Office of Digital Learning and have conducted these events, called “Teach in 12,” since the spring semester of 2023. This semester they have focused on technology in teaching, specifically AI, to try to tackle some of the difficult issues surrounding its use by students.

“This year, we began to receive questions about AI and plagiarism from faculty,” Chase said. “We let their concerns guide us in choosing the topic for this fall semester’s Teach in 12 series. From there, we looked back to specific faculty questions that arose in the AI forums we held last spring. Using the information we gleaned along with articles and reports we read throughout the year about the use of AI in Higher Education, we then selected the individual topics accordingly.”

Angie Chase stands in a striped shirt smiling at the camera

“People are very hungry for information in this area,” Hanselman said. “The idea with the Teach in 12 is: here’s a seed of information to get you started and to get you thinking and then you can take it from there, either in your own research and reading or coming and talking with us.”

Some professors worry that misuse of AI will undermine students’ ability to learn the material in their classes: if a student uses AI, or ChatGPT specifically, to do all of their work, they will learn nothing and may be plagiarizing outright. While the concern is understandable, Hanselman says to embrace AI, because it’s not going anywhere.

“Our students right now are going to be working in a world where they will be using AI in their future,” Hanselman said. “It’s just how it is. People are already using it right now. Outlook, which is what most faculty at the University uses for email, auto-fills in emails using predictive text … that is AI. Grammarly is AI. It’s a little more generative AI and that kind of throws people off and scares them, but you’re going to be using it at some point in your future. So, I think that what faculty are looking at now is helping students get a handle on how to use AI responsibly and, you know, use it creatively. Down the road everybody is going to be using it all the time, so we want our students to be able to use it the best.”

Kathy Hanselman stands in a floral shirt smiling at the camera

Generative AI, like ChatGPT and Grammarly, focuses on creating new content and data whereas more traditional AI, like Siri and Alexa, solves specific tasks with predefined rules. While there are many amazing things generative AI can do, like composing music or assisting in the discovery of new drugs and compounds, there are many ways in which it falls short.

“Generative AI can inherit biases from the data it's trained on, which can perpetuate and amplify societal biases and prejudices,” Chase said. “It can be used to create harmful or misleading content, including fake news, deepfakes, and offensive language.”

“They [AI tools] have tons of problems. Looking at ChatGPT specifically with generative AI, while it is getting better, it is not 100% trustworthy,” Hanselman said. “It’s basing its outputs on the language models it has been trained on. So, you see a lot of bias, and it's not that the tool is biased but the language it's been trained on is. There's a lot of racial bias, gender bias and things like that that we see.”

ChatGPT was first released in November 2022 and is already on version 3.5, with version 4.0 available to those who pay for a premium service. ChatGPT is being developed and updated faster than many technologies that have come before it. AI is a constantly expanding field, and no one knows where it will go next.

“I don’t think we should be afraid of AI,” Hanselman said. “I think all of us, students and faculty, should be doing our best to understand how best to work with it and realize that we do have a place in guiding AI. I think it’s important to use it responsibly and for us faculty members to teach our students how to do that.”
