“We also know there is heavy investment by our near-peer adversaries in artificial intelligence and autonomy in general. We know that when you couple autonomy and AI with systems like low-cost attritables, that can increase capability significantly and be a force multiplier for our Air Force, and so the 2023 goal line is our attempt at bringing something to bear in a relatively quick timeframe to show that we can bring that kind of capability to the fight,” Tran said.
Matt Duquette, an engineer from AFRL’s Aerospace Systems Directorate, brings to the effort a background in UAV control, autonomy, and modeling and simulation of UAVs, especially teams of UAVs. He is assisting the Skyborg program in formulating its approach to the autonomy system and some of the behaviors the UAVs will have.
“Skyborg is a vessel for AI technologies that could range from rather simple algorithms to fly the aircraft and control them in airspace to the introduction of more complicated levels of AI to accomplish certain tasks or sub-tasks of the mission,” said Duquette.
This builds on much of AFRL’s foundational work with AI, demonstrated in programs such as Have Raider and the Automatic Ground and Air Collision Avoidance Systems, which proved that levels of autonomy in high-performance aircraft are not only possible, but also practical.
“Part of our autonomy development is building assurance into the system. You can either build assurance by using formal methods or approaches where at design time, as you develop these autonomous capabilities, you guarantee certain behaviors, or a more practical approach is to assess the capabilities of these behaviors at run time, meaning while they’re running on the aircraft. So, those are the capabilities that we’re interested in looking at from the experimentation level to see what type of assurance you need in the system so you can mix high and low criticality.”
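The run-time assurance approach described in the quote above is commonly implemented as a "simplex"-style monitor: an untrusted autonomous controller runs freely while a trusted checker evaluates its commands at run time and switches to a simple fallback controller when a check fails. The sketch below is purely illustrative; the function names, command fields, and envelope limits are hypothetical and are not drawn from the Skyborg program.

```python
# Minimal simplex-style run-time assurance sketch (illustrative only;
# names and limits are hypothetical, not from the Skyborg program).

def within_envelope(cmd, bank_limit_deg=60.0, pitch_limit_deg=25.0):
    """Run-time check: is the commanded attitude inside the safe envelope?"""
    return (abs(cmd["bank_deg"]) <= bank_limit_deg
            and abs(cmd["pitch_deg"]) <= pitch_limit_deg)

def safe_fallback(state):
    """Trusted, simple recovery controller: wings level, gentle climb."""
    return {"bank_deg": 0.0, "pitch_deg": 5.0}

def run_time_assured_step(ai_controller, state):
    """Use the AI command only if it passes the run-time check; else fall back."""
    cmd = ai_controller(state)
    if within_envelope(cmd):
        return cmd
    return safe_fallback(state)

# An overly aggressive (untrusted) command is overridden at run time.
aggressive = lambda state: {"bank_deg": 85.0, "pitch_deg": 10.0}
print(run_time_assured_step(aggressive, state={}))
```

The design choice this illustrates is the one Duquette contrasts with formal methods: rather than proving the AI controller safe at design time, only the small monitor and fallback need to be trusted, which is what lets high- and low-criticality software mix on the same aircraft.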
Patrick Berry, from AFRL’s Sensors Directorate, is supporting the Skyborg program by conducting modeling, simulation and analysis. “We’re looking at a range of vehicle performance parameters – mission analysis will help us determine what the final outcome is, and the responses to the CRFI [Capability Request for Information] will help us understand what the performance is of currently available systems and whether those will meet the needs or not,” he said.
“Everything from keeping up with combat platforms to slower platforms for sensing. There will be a range of possibilities there,” he said.
Although Skyborg is not scheduled for any particular type of aircraft platform at this time, Tran said the CRFI emphasizes the importance of an open systems architecture, with modularity not only in sensing capabilities but also in the overall mission systems and in the autonomy associated with the platform’s mission capability.
“We’ve partnered with the 412th Test Wing at Edwards Air Force Base, California, and specifically an organization called the Emerging Technologies Combined Test Force, and we’re working with them beginning with small, fast-moving UAVs to test the current state of the art in AI and autonomy in those airplanes and the ability for them to autonomously team and collaborate in flight,” said Tran.
“Machine learning has progressed greatly over the last few years and we’re very inspired by those results and excited by things that are going on in the gaming industry for instance,” said Maj. Ryan Carr, from AFRL’s Aerospace Systems Directorate.
“We expect that that technology will continue to mature fairly rapidly. What we really need to understand is, ‘How do you take that and do something like bring it to the real world and fly with it for example?’ The thing we’re trying to get at early on is how to do that safely. We’re talking about run-time assurance, working hand in hand with the flight test community who have a very long record of safe flight testing. That’s really what we want to focus our attention on in this early period,” Carr said.
“We want to do this in a way that builds trust in the system as you go along so that when you get to that EOC, you will have established a baseline of trust so that operational users will believe what the system will do or believe it’s safe. It’s not just that end state capability, it’s the trust as you go along,” he added.
Before operational AI innovation can occur, the Air Force must field an autonomous system that meets an immediate operational need and can serve as an iterative platform to facilitate complex AI development, prototyping, experimentation, and fielding, and that system is Skyborg, the CRFI says.