North Dakota higher ed leaders form task force to combat negative effects of artificial intelligence
The North Dakota University System has created a task force to study the subject.
GRAND FORKS — Higher education leaders across the state are convening to tackle the subject of artificial intelligence on campuses, with renewed focus following the release of ChatGPT.
ChatGPT, short for Chat Generative Pre-trained Transformer, is a program released in November 2022 by the research laboratory OpenAI. It allows users to pose questions in natural language, with an algorithm generating responses.
To devise strategies for combating the negative effects of artificial intelligence, such as the potential for academic dishonesty, and to discuss further applications in curriculum development, the State Board of Higher Education has formed a task force consisting of the heads of all NDUS institutions.
Mark Hagerott, chancellor of the North Dakota University System and former professor of cybersecurity at the U.S. Naval Academy, said the State Board of Higher Education has been discussing the implications of artificial intelligence for years. Student data protection and privacy rights have loomed large in these discussions, according to Hagerott.
“Before COVID, the board was considering adopting a digital bill of rights for students, to address issues such as privacy and data protection,” said Hagerott. “We’re reviving those discussions now, with a bill in the Legislature pertaining to the state longitudinal data system. About seven years ago, our data was potentially going to flow out to vendors, and we realized we needed to build our own data set with strong protections. That bill would keep data from our students in the state, using versions of AI that we purchase from companies to conduct our own analytics on.”
Hagerott said the world of artificial intelligence is a nuanced one, in which many applications play a critical role in instruction on NDUS campuses.
“People think of AI as just a black-and-white concept,” said Hagerott. “Under AI, there are two main concepts — artificial specialized intelligence, and the big recent breakthrough of artificial generalized intelligence. We’ve had several applications of artificial specialized intelligence for a number of years. We have a program called the Dakota Digital Academy, a consortium of the NDUS campuses promoting courses on cybersecurity and the role of AI. We had a speaker from Eide Bailly (LLP) tell us they require a digital analytics certificate or minor to get hired as an accountant, because you have to have the tools to operate increasingly powerful computers.”
Other artificial intelligence applications used within the university system include the development of robotics technology in the John D. Odegard School of Aerospace Sciences, and the Grand Farm Research and Education Initiative’s partnership with North Dakota State. According to Hagerott, the partnership explores the role of predictive analytics in developing more precise agricultural practices.
UND President Andrew Armacost said that although ChatGPT is not being used in an official capacity at UND, it has made inroads among students and faculty due to its ability to answer complicated questions.
“ChatGPT quickly generated significant buzz around our campus,” said Armacost. “I’ve experimented with it myself, and its responses can be quite sophisticated.”
While UND is not officially using ChatGPT, Armacost said the university does use other artificial intelligence platforms, such as ChatBot, as a remote question-and-answer service.
“Our website uses ChatBot to answer admissions questions prospective students may have,” said Armacost. “A lot of students prefer it to human interaction, because it removes the element of bias. Students don’t feel like they’re being judged when asking their questions.”
Hagerott said the introduction of ChatGPT raises concerns about academic integrity on NDUS campuses.
“ChatGPT has the potential to affect students and their submission of written work,” said Hagerott. “There are even cases where ChatGPT’s answers passed an MBA exam. I think it’s really a race against the software when it comes to preserving academic integrity. For example, if you have a weak program writing essays, it’s not hard to develop a stronger program to detect it. If ChatGPT keeps getting stronger, faster and better, with millions of people using it to write essays, it begins to outstrip the counter systems.”