Artificial intelligence and embedded computing for unmanned vehicles

The two most prevalent terms in military and civilian technology represented little more than science fiction a generation ago. But today, unmanned vehicles and artificial intelligence (AI) command center stage in any discussion of future military requirements for platforms, tactics, techniques, and procedures.

Unmanned vehicles, in the form of unmanned aerial vehicles (UAVs), arrived on the scene first, but how the military wants to use them and other platforms — unmanned ground vehicles (UGVs), unmanned surface vehicles (USVs), unmanned underwater vehicles (UUVs), and unmanned space vehicles — in the future had to wait for at least rudimentary AI.

Each of those platforms operates in its own unique environment, requiring specific AI capabilities to make autonomous operation practical. Sandeep Neema, program manager in the U.S. Defense Advanced Research Projects Agency (DARPA) Information Innovation Office (I2O) in Arlington, Va., says some of the most difficult unmanned technology challenges involve UUVs.

“While each evaluation environment is distinctive, undersea environments present a unique set of challenges,” Neema explains. “In these environments, things move much more slowly, missions can take longer due to harsh environmental conditions, and the limits of physics and navigation/sensing/communications issues exacerbate the challenges. Advanced autonomy could significantly aid operations in the underwater domain.”

Smaller, faster processors and enhanced onboard memory have expanded the capabilities of embedded computing greatly across the range of unmanned vehicles, but especially on smaller platforms like hand-launched UAVs, UUVs, and UGVs operating underground.

“Big data processing is increasingly being deployed in edge applications for autonomy, quick reaction capability, and untethered cognitive functionality remote from fixed resources,” explains John Bratton, director of product marketing at Mercury Systems in Andover, Mass. “Nowhere is this more pronounced than in the rapidly emerging and well-funded autonomous platform domain.”

Scaling the data center across smart fog and edge layers requires the servers that compose those layers to become smaller and resilient to both harsh environments and tampering. Distributed deployment requires servers to be miniaturized while remaining well cooled; protected from hostile environments and conditions; secure and resilient to reverse engineering, tampering, and cyber threats; trusted across hardware, software, middleware, and other intellectual property; deterministic for mission- and safety-critical effector control; and affordable through leverage of the best commercial intellectual property, independent research and development, and manufacturing capabilities.
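That requirement set can be read as a deployment checklist for an edge node. The short Python sketch below is purely illustrative and is not drawn from the article or from any vendor's software; the EdgeNodeProfile structure, its field names, and the deployment_gaps helper are hypothetical, intended only to show how such attributes might be captured and audited programmatically.

# Illustrative sketch only: a hypothetical checklist mirroring the edge-server
# attributes described above (size, cooling, environmental protection,
# security, trust, determinism, affordability). All names are invented.
from dataclasses import dataclass, fields

@dataclass
class EdgeNodeProfile:
    miniaturized: bool                 # small enough for distributed deployment
    well_cooled: bool                  # adequate thermal management at the edge
    environmentally_hardened: bool     # protected from hostile environments and conditions
    tamper_and_cyber_resilient: bool   # resists reverse engineering, tampering, cyber threats
    trusted_supply_chain: bool         # trusted hardware, software, middleware, and IP
    deterministic_control: bool        # predictable timing for mission-/safety-critical effector control
    leverages_commercial_ip: bool      # affordability via commercial IP, IR&D, manufacturing

def deployment_gaps(profile: EdgeNodeProfile) -> list[str]:
    """Return the names of any checklist requirements the candidate node does not meet."""
    return [f.name for f in fields(profile) if not getattr(profile, f.name)]

if __name__ == "__main__":
    candidate = EdgeNodeProfile(
        miniaturized=True, well_cooled=True, environmentally_hardened=True,
        tamper_and_cyber_resilient=False, trusted_supply_chain=True,
        deterministic_control=True, leverages_commercial_ip=True,
    )
    print("Unmet requirements:", deployment_gaps(candidate))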
