SNA NEWS: Navy Exploring AI for Contested Logistics, Mission Planning
ARLINGTON, Virginia — The Navy has been researching the benefits and pitfalls of artificial intelligence for decades, and while the spotlight has surrounded systems like unmanned ships and flying robots, AI could also bring significant advances in logistics, asset management and scheduling.
Alexandra Landsberg, division director of Mathematics, Computers and Information Science with the Office of Naval Research, redirected the spotlight to a more administrative use for artificial intelligence during a panel discussion at the Surface Navy Association’s 36th National Symposium Jan. 11.
While information processing perks such as analyzing vast amounts of data and creating summaries are known advantages of AI, perhaps a less talked about and less explored potential is in mission planning and contested logistics, she suggested.
With sensors collecting more data than ever before, “now, given the capabilities we have with hardware advances, software advances, we can bring in, take into account, contested logistics,” she said.
Operational problems presented by logistics bring an opportunity for dual-use artificial intelligence, she said, and the Navy can learn from some big commercial players.
“Let’s go to Amazon or FedEx. Amazon has warehouses, there’s robotics in it, they know exactly what products are in there, how to get those products to the delivery people. They know how to optimize the scheduling of all of that.”
Optimizing resources and planning is part of what Landsberg called a critically important goal of the Navy: readiness.
“What if we could go and take those AI approaches in industry from the Amazons or FedExes of the world and modernize our shipyards out there, knowing exactly what parts we have, what the right people are, in the right place at the right time. All of this will come together and really optimize the availability of our fleet.”
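The shipyard problem Landsberg describes — the right parts and the right people in the right place — is at its core a resource-assignment problem. A minimal sketch of that idea, using a brute-force search over a small, entirely hypothetical cost matrix (the costs and labels are illustrative, not Navy data):

```python
# Minimal sketch: assign maintenance crews to ship availabilities so
# total delay is minimized. Brute force over permutations -- fine for
# tiny examples; real schedulers use specialized algorithms.
from itertools import permutations

def best_assignment(cost):
    """Return (min_cost, assignment), where assignment[i] is the
    availability given to crew i."""
    n = len(cost)
    best = (float("inf"), None)
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best[0]:
            best = (total, perm)
    return best

# Rows: crews; columns: availabilities (hypothetical hours of delay).
cost = [
    [9, 2, 7],
    [6, 4, 3],
    [5, 8, 1],
]
print(best_assignment(cost))  # -> (9, (1, 0, 2))
```

The same structure — a cost model plus an optimizer — underlies the Amazon- and FedEx-style scheduling systems she cites, just at vastly larger scale.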
Another area where AI could assist administratively is through large language models such as ChatGPT, she said.
“We want to search large documents. We have a lot of large documents out there. We want to go and produce summaries. We want to accurately produce forms on that,” she said.
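The search-and-summarize workflow she describes can be illustrated even without a large language model. Below is a toy extractive summarizer — sentences scored by word frequency, top-k kept — purely as a stand-in for the shape of the task, not the Navy's actual tooling:

```python
# Toy extractive summarization: score each sentence by the corpus
# frequency of its words, keep the k highest-scoring sentences in
# their original order. Illustrative only; an LLM would generate an
# abstractive summary instead.
import re
from collections import Counter

def summarize(text, k=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
        reverse=True,
    )
    top = set(scored[:k])
    return " ".join(s for s in sentences if s in top)

doc = ("The fleet needs parts. The fleet needs parts on time. "
       "Unrelated trivia rarely matters.")
print(summarize(doc, k=2))
```

Even this crude scorer surfaces the sentences that repeat the document's dominant terms, which is the behavior one wants from document search over large archives.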
But a much-discussed challenge of utilizing generative AI within the Defense Department is security and trust.
“The challenge is that the ChatGPTs of the world are developed in the open, and information goes back there,” she said.
That means that in order to take advantage of generative AI, the Navy needs to develop its own secure environments and ensure that the data its models are trained on is secure, she said.
Ensuring trust and understanding means AI must work in tandem with humans, she added. Trust requires understanding, and understanding requires training.
“All of this really depends on the human. It’s really the human and an AI system working in tandem,” she said. “It’s ensuring that the operator out there trusts and understands the benefits of these AI recommendations, but also understands the limitations of these recommendations out there.”
Testing, evaluation, verification and validation are needed, she said. In addition to experimentation and simulation, applications need to “go test in the real world. That’s absolutely critical.”
“There’s a lot of solid mathematics and research that can give you guarantees, and that’s an aspect of this. And let’s not forget, the researchers can help the Navy provide guarantees of services.” With guarantees, the Navy can then take the experimentation to scale, she said.
Regardless of what AI is utilized for — from unmanned systems to logistics and planning — it must reach across a spectrum of science and technology, basic research and technology demonstrations and experimentation, she said. But it can’t get stalled in experiments — it needs to become operational and it needs to be scaled up to the fleet.
“And that’s where we come together,” she said. It has to go beyond scientists in their labs. “We have to be out there in the fleet testing it out early and often. And then we have to be able to scale up. What does it require? It requires Navy personnel, both on the military and civilian side, to understand AI.”
Landsberg said the AI ecosystems will require a partnership between military, industry and academia “to be able to provide some rigorous measures and guarantees to us on AI. So this is really where I see the AI ecosystem going and how all of us need to work together.”
Topics: Robotics and Autonomous Systems