I/ITSEC NEWS: Generative AI Pushing Cognitive Shift in Training
ORLANDO — When discussing generative artificial intelligence and language models such as ChatGPT, talking points often focus on job losses and privacy concerns. Experts, however, suggest AI will not steal jobs so much as create different ones — and change the way humans think about accomplishing them.
Particularly with training and education, the rapid evolution of language models both large and small — which constantly learn from data ingested from humans — is prompting instructors across military and academia to rethink how they teach.
Andy Van Schaack, associate professor of the practice of engineering management at Vanderbilt University, said he not only assumes his students are using artificial intelligence, but demands it.
“I actually require that my students use generative AI in their schoolwork because I believe that every single one of them is going to be required by their employers, and by necessity, to use that upon graduation,” he said during a panel discussion at the National Training and Simulation Association’s Interservice/Industry Training, Simulation and Education Conference Nov. 30.
“So why not get started right now?” he added.
In contrast to cautious skeptics of AI’s growing relevance, Van Schaack championed its possibilities, saying there “really isn’t anything that I don’t use generative AI to help me with anymore.”
“I believe that we’re on the threshold of the most significant technological development in the history of civilization,” he said. “And so, I spend hours every day just trying to keep up with what the technology allows us to do.”
While Van Schaack voiced his support for embracing AI for good, concerns arose that an overreliance on its capabilities could stunt critical thinking in students and trainees. The answers differed between academia and the military.
From an academic perspective, one assumption is that “if we allow students to use generative AI, that system will do all the homework for the students,” Van Schaack said. “And sure, they’ll get great papers, but they’re not learning anything.”
This can be avoided by simply changing the assignments and requiring the use of AI, he said.
“So, my students can’t cheat and use AI because I require that they use AI. So when other professors say you can’t use this technology, of course the students will use the technology, and that just turns into one honor code violation after another.”
The trick, he said, is to take specific instructional objectives that were previously achieved through, for example, essays or presentations, and redesign the assignment to arrive at the same objective with the support of AI.
He argued this approach actually requires students to do more of the “kind of thinking that leads to better understanding.”
That remains a challenge to pursue, however, he said. “And that’s essentially [the] project I’m working on right now: [identifying] those specific activities that you can use in your classroom to replace the old activities that can be done entirely by generative AI. So, I have that concern as well.”
From a military perspective, the outlook for critical thinking skills was slightly less optimistic.
Keith Brawner, program manager at the Institute for Creative Technologies — a Defense Department-sponsored University Affiliated Research Center working in collaboration with the Army — said the military’s results-oriented mindset could make it more susceptible to a critical thinking vulnerability when utilizing AI in training.
“Say what you want about the military, but they are results oriented,” he said. “If all you get is a bunch of A-plus papers, and the people aren’t really thinking, but only accomplishing all their objectives, hey, that’s success.”
When blending the worlds of simulation and reality in training, overreliance on the systems trainees learn with, rather than on real-world skills, is something to be concerned about, he said.
“We certainly have an overreliance on some systems, notably things like GPS,” he said. “You can have some amount of a reliance on a system if you can have some dependence on it being available at time of need.”
As an example of a successful balance, Brawner cited his experience working on a submarine hunting and prosecution system that used machine learning components to automatically analyze acoustic signature data.
“That was also used in the real world because it was not as good as the human, but it wasn’t bad at finding submarines,” he said. “Did the students learn an overreliance on the system? No, they learned how to prosecute real submarines and to use the system in order to prosecute submarines.”
There are advantages to infusing training systems with artificial intelligence and machine learning capabilities, as long as they are not relied upon exclusively to accomplish objectives, he said.
The key will come down to how instructors think about and change their approach to education, Van Schaack reiterated — across academia and the military.
Van Schaack is currently working with the Navy on applied cognitive psychology for the classroom — asking naval instructors how to get students to process information in ways that allow them to acquire, retain and ultimately retrieve that information in the future, he said.
“And for me, by bringing generative AI into the activity, you don’t just hand the assignment off to AI and then turn in the results. If you change the assignment correctly, you can get students or Navy and Marine Corps officers to use that information and use AI to process it more deeply.”
“We as teachers, and as trainers, have to change what we’re doing, what we’re asking people,” he continued. “But I think we can ask for more — more complex, deeper thinking than we have ever gotten before. And to me, that’s very exciting as an instructor. [Students] feel like they’re cheating, but I’m getting more, better work out of them and they’re thinking more. It’s kind of a win-win.”
Topics: Training and Simulation