Avionics Digital Edition

Artificial Intelligence Efforts for Military Drones

Companies are embarking on efforts to embed artificial intelligence (AI) on military unmanned aircraft systems (UAS), though the timing of full and effective autonomy for such drones is uncertain.

“While it is easy to imagine a future with significant embedded AI, i.e. online learning and autonomous decision making, deployed on large swarms of UAS, there are significant challenges to adopting non-deterministic learning algorithms on unmanned systems operating in real-world situations in collaboration with our customers,” said James McGrew, the chief technology officer for technology planning and the integration team lead for Boeing’s Insitu, Inc. “As such, we are leveraging machine learning and edge processing techniques to develop tools to enhance our ‘family of systems’ in ways that enhance operation without handing control over to ‘Skynet.’”

Through edge processing, drones can analyze sensor data onboard rather than sending it to the cloud, and thus may achieve greater performance, information security, and autonomy.
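
To illustrate why keeping processing at the edge matters, the Python sketch below compares the downlink needed to stream raw video against the downlink needed to transmit only onboard detection results. Every number in it — frame size, frame rate, detection record size — is an illustrative assumption, not a figure from Insitu or any other vendor.

    # Hypothetical downlink comparison: streaming raw video vs. transmitting
    # only onboard detection results. All numbers are illustrative assumptions.

    RAW_FRAME_BYTES = 1920 * 1080 * 3   # one uncompressed 1080p RGB frame
    FPS = 30                            # assumed sensor frame rate
    DETECTION_BYTES = 64                # assumed size of one serialized detection
    DETECTIONS_PER_FRAME = 4            # assumed average targets per frame

    def downlink_bps(bytes_per_frame: float, fps: int = FPS) -> float:
        """Sustained downlink rate, in bits per second, for a per-frame payload."""
        return bytes_per_frame * 8 * fps

    raw_link = downlink_bps(RAW_FRAME_BYTES)
    edge_link = downlink_bps(DETECTION_BYTES * DETECTIONS_PER_FRAME)

    print(f"raw video downlink:  {raw_link / 1e6:8.1f} Mbit/s")
    print(f"edge-processed link: {edge_link / 1e3:8.1f} kbit/s")
    print(f"reduction factor:    {raw_link / edge_link:8.0f}x")

Under these assumptions the edge-processed link is roughly four orders of magnitude lighter than the raw stream, which is the bandwidth half of the performance and security argument in miniature.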

Last year, Insitu announced a new extended-range satellite communications kit for the company’s Integrator drones and an Alticam-14 (AC-14) enhanced intelligence, surveillance and reconnaissance (ISR) turret with telescoping video imaging capability able to identify people from the air. The AC-14 payload has a number of other ISR features, including the ability of an operator to monitor electro-optical, infrared, and short-wave infrared video streams simultaneously. Insitu has also worked on an optional laser designator upgrade for the AC-14 that could allow Boeing Apaches to team with Insitu ScanEagle drones, letting Apache crews remain out of enemy reach while firing Hellfire missiles.

The Insitu Alticam 14. Insitu

AI is to aid in analyzing the data that the AC-14 provides for the Integrator drones.

“It is within this [Integrator] family of systems that we plan to leverage the power of processing and mature AI techniques,” McGrew said. “The Hood Tech Vision AC-14 Imager payload uses embedded processing onboard for image stabilization and target tracking. Our Insitu Common Open-mission Management Command and Control (ICOMC2) and Inexa GCS software suites use processing on the ground to assist flight planning and sensor control. And our TacitView/Catalina suite leverages server- and cloud-based processing of live and archived full-motion video to extract information from sensor data. Our teams are working hard to bring more value to each part of our system – air, ground and cloud.”
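
McGrew does not detail the AC-14’s onboard algorithms, but a common pattern for the image stabilization he mentions is to track sparse features between frames and warp out the estimated camera motion. The Python sketch below shows that pattern using OpenCV; the library choice, tuning values, and input file are assumptions for illustration, not Insitu’s implementation.

    # A minimal sketch of frame-to-frame video stabilization of the kind an
    # ISR payload might run onboard. OpenCV, the tuning values, and the input
    # file are illustrative assumptions, not the AC-14's processing chain.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("isr_clip.mp4")      # hypothetical recorded ISR clip
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Track sparse corner features from the previous frame into this one.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=30)
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good_old = pts[status.flatten() == 1]
        good_new = nxt[status.flatten() == 1]

        # Estimate the rigid camera motion between frames and warp it out.
        m, _ = cv2.estimateAffinePartial2D(good_new, good_old)
        if m is None:                           # estimation can fail on bland frames
            m = np.eye(2, 3, dtype=np.float32)
        h, w = frame.shape[:2]
        stabilized = cv2.warpAffine(frame, m, (w, h))

        cv2.imshow("stabilized", stabilized)
        if cv2.waitKey(1) == 27:                # Esc to quit
            break
        prev_gray = gray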

A number of countries are pursuing the development and production of semi-autonomous and autonomous unmanned combat aerial vehicles (UCAVs) to serve as complements to manned fighter aircraft. The Royal Australian Air Force plans to fly a Loyal Wingman prototype early this year: a reduced radar cross section (RCS) UCAV that would fly at high-subsonic or low-supersonic speeds and support manned fighters with its weapons and data sharing. Part of that effort is Boeing's 38-foot-long Airpower Teaming System (ATS), which flew for the first time last November.

Boeing's 15-strong autonomy test bed fleet is testing out teaming, mission system and AI capabilities. The Queensland Government is supporting Boeing's autonomous systems technology development through its Advance Queensland partnership. Boeing Australia

Enabled by AI, ATS is a "modular and highly customizable aircraft with fighter-like flight capabilities," according to Boeing, which envisions ATS as enabling manned-unmanned teaming in which the unmanned system could "complement and support a specific threat-based mission."

The United Kingdom is also looking at such a manned-unmanned concept through its Project Mosquito, and the French and German governments are examining the concept as well for the Future Combat Air System (FCAS) program. The concepts may involve a swarm of lighter UCAVs working in tandem with fighter aircraft.

The United States Air Force, for its part, is undertaking a Skyborg initiative that uses AI to control low-cost UAS to aid manned aircraft, such as the Lockheed Martin F-35. One key participant in the initiative is the 30-foot Kratos XQ-58A Valkyrie, which has flown three times and has an advertised cruise speed of more than Mach 0.7. The Valkyrie is designed for rail take-offs and parachute landings and thus does not require a runway or aircraft carrier. Kratos said that its drones are also made with affordability in mind, as they cost between $1 million and $3 million per copy.

Also in the U.S., the Defense Advanced Research Projects Agency (DARPA) is engaged in UAS AI efforts, including the Offensive Swarm-Enabled Tactics (OFFSET) program to equip soldiers fighting in urban areas with swarms of up to 250 UAS and unmanned ground systems. Companies such as Northrop Grumman and Raytheon are taking part in the program. DARPA is also leading a research effort with California-based AeroVironment to study how the military can learn from the mechanics of insect flight to increase UAS autonomy by reducing the computation required for AI.

California-based Aitech said that its A176 Cyclone and A178 Thunder supercomputers can provide substantial AI processing for military drones, as the systems are ruggedized to military specifications and use NVIDIA general-purpose computing on graphics processing units (GPGPU) for parallel processing.

Aitech's A178 Thunder supercomputer can enable AI for military drones. Aitech

“Nowadays, there are many companies developing military drones, with almost every avionics manufacturer having established a drone team or department,” said Dan Mor, the GPGPU and graphics product line manager for Aitech.

“The demand for military unmanned aerial vehicle (UAV) programs is growing continuously, meaning drones are also in demand,” he said. “These systems typically include some type of image or graphic processing, for example, starting from capturing the images and video and ending up with complex real-time mapping and navigation systems. The typical processes for image classification, location and segmentation include but are not limited to pattern or object recognition and identifying classes; locating and extracting image coordinates, finding where in the video those objects are located; and locating object boundary lines, curves, etc. in images and video.”
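
As a concrete illustration of how the classification and location steps Mor lists often combine in practice, the Python sketch below runs a pretrained object detector over a single frame and prints class labels with bounding-box coordinates. The torchvision model, the input file, and the confidence threshold are assumptions for illustration, not Aitech’s software stack.

    # A minimal sketch of the classify-and-locate pipeline Mor describes,
    # using a pretrained detector. torchvision, its COCO weights, and the
    # input file are illustrative assumptions, not Aitech's software stack.
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor
    from PIL import Image

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    model.eval()

    image = Image.open("frame.jpg").convert("RGB")   # hypothetical video frame
    with torch.no_grad():
        # The detector returns, per image, tensors of boxes, labels and scores.
        out = model([to_tensor(image)])[0]

    for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
        if score > 0.5:                              # assumed confidence threshold
            x1, y1, x2, y2 = box.tolist()
            print(f"class {label.item()}: ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), "
                  f"score {score:.2f}")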

Mor said that “image classification, image location and image segmentation applications are perfect candidates for deploying NVIDIA deep-learning inference networks, since they can benefit from hundreds of parallel CUDA processor cores calculations.”

CUDA is a parallel computing platform and programming model developed by NVIDIA for general-purpose computing on its GPUs.
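
In practice, putting those parallel cores to work from application code often amounts to moving model weights and data into GPU memory and letting a framework schedule the kernels. The Python sketch below shows that pattern; PyTorch and the toy network are illustrative assumptions, since Aitech does not name a specific framework.

    # A minimal sketch of offloading inference onto CUDA cores. PyTorch and
    # the toy network are illustrative assumptions; Aitech does not name a
    # specific framework.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A toy convolutional stage standing in for a deployed inference network.
    net = torch.nn.Sequential(
        torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
        torch.nn.ReLU(),
        torch.nn.AdaptiveAvgPool2d(1),
        torch.nn.Flatten(),
        torch.nn.Linear(16, 10),
    ).to(device)                                  # weights copied into GPU memory

    frames = torch.rand(8, 3, 224, 224, device=device)  # batch of synthetic frames
    with torch.no_grad():
        scores = net(frames)       # each layer runs across the CUDA cores in parallel
    print(scores.shape)            # torch.Size([8, 10])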

“The military drone market is always looking for small form factor and size, weight and power (SWaP) optimized systems, so high-performance, compact GPGPU-based systems offer an ideal set of characteristics for these types of applications,” Mor said.

While the timeline for nations and groups fielding ubiquitous drone swarms is uncertain, the United States military is taking no chances and is testing high-energy lasers and high-powered microwave weapons to defeat enemy UAS that may pose a threat to U.S. bases and forces. Last fall, for example, the United States Air Force awarded Raytheon a more than $16 million contract to develop the Phaser high-powered microwave weapon to eliminate such swarms.

The service is testing the weapon and other directed-energy weapons in the wake of attacks by multiple drones against Saudi oil facilities last September, attacks that Yemen's Houthi movement said it conducted in response to Saudi intervention in the Yemeni civil war.

Mercury Systems, Inc., a Massachusetts-based aerospace and defense embedded systems supplier, said that it recently introduced the Ensemble Series HDS6603B and HDS6605 next-generation, rugged OpenVPX server blades, which may serve UAS AI applications.

“Powered by Intel Xeon Scalable processors, these high-performance embedded edge computing (HPEEC) server blades are ideally suited to embedded ML and AI applications at the tactical edge,” according to the company. “When combined with our GPU co-processing engines and PCIe switch modules they produce a truly composable data center processing environment for embedding in platforms deployed in the harshest environments.”

Mercury Systems executives see a number of emerging AI trends, based on the wish list of the company’s military customers.

Such trends include the following:

  • Rugged, embedded processing power to augment EO/IR sensors.
  • Next generation radar and electronic warfare systems with cognitive/AI processing capability for assessing previously un-encountered threats.
  • Multi-functional apertures with sensor fusion and increased processing power to perform more sophisticated on-board sensor processing.
  • Lower size, weight and power (SWaP) for longer, further, higher missions and for use of AI on smaller platforms.
  • Interoperability, scalability and affordability of low-risk modular open system computing approaches/architectures that leverage technology reuse.
  • Mission-critical effector systems (e.g. avionics and vetronics) with flight-safety certification for deterministic critical system operation.

“Overall, we are seeing digital convergence,” Mercury Systems said. “Multiple platform sensors and their processing chains are being recomposed around a greatly reduced number of processing nodes. This is a model taken from the commercial domain: smart cars and urban air mobility.”

Beyond the use of AI for ISR, AI may enable autonomous UAS that serve as weapons, and for that latter use the obstacles are not solely technological. As dozens of defense companies seek to use AI to develop lethal autonomous weapon systems (LAWS), humanitarian groups seek to build international support for a treaty to ban them.

LAWS, so-called "killer robots," would rely on AI to remove the human from targeting decisions.

The United Nations Convention on Conventional Weapons (CCW) has been discussing concerns about LAWS and is to start devising a "normative and operational framework" for such weapons at meetings in Geneva on June 22-26 and August 10-14. But humanitarian groups are frustrated that the CCW has not progressed further in its work on the issue and has not produced a legally binding instrument to stop or significantly restrict LAWS.

In a message to CCW's Group of Government Experts convened for a meeting on emerging LAWS technologies last March, U.N. Secretary General António Guterres wrote that "autonomous machines with the power and discretion to select targets and take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law."

"This reflects what I see as the prevailing sentiment across the world," Guterres wrote. "I know of no state or armed force in favor of fully autonomous weapon systems empowered to take human life."

Regardless of humanitarian groups’ efforts, though, it is possible that AI and machine learning (ML) may one day find their way into almost all advanced military drones, according to Insitu’s McGrew.

“AI and ML are generic terms for a wide variety of data processing, control, and optimization techniques applicable to almost any industry or system,” McGrew said. “What was considered ‘AI’ in the past is considered a software application today. I’d say the possibilities span the full spectrum from Group 1 UAS, for example small fixed-wing and multi-copter UAS, through Group 5 UAS, such as Boeing’s Airpower Teaming System. UAS will continue to adopt more advanced technology, leading to further autonomy and allowing human operators to provide more high-level input and supervisory control.”