Sonardyne and SeeByte Advance UMS Navigation and Autonomy in Challenging Environments

02.06.2021
Maritime defence technology company Sonardyne and uncrewed maritime systems (UMS) software experts SeeByte have been awarded UK Defence Science and Technology Laboratory (Dstl) funding to enhance and extend the future operational capability of autonomous and remotely operated systems in challenging battlespace domains.


Image Caption: Project Wilton Iver AUVs. Photo from SeeByte.
The collaboration is the second phase of the UK Defence and Security Accelerator's (DASA) Autonomy in Challenging Environments competition and builds on the work both organisations undertook in Phase 1.

Sonardyne's advanced underwater positioning systems will be teamed with SeeByte's adaptive, communication-aware robotic behaviours, developed for its Neptune autonomy system, allowing the UMS to operate in highly complex, variable and communications-limited environments. Automatic target recognition imagery snippets will be transferred acoustically using SeeByte's novel semantic compression software.

As part of the project, Sonardyne and SeeByte will be using surface and underwater assets from Project Wilton, a recently formed maritime autonomous systems (MAS) team based out of HM Naval Base Clyde. The collaboration will culminate in a series of in-water demonstrations at Project Wilton facilities in the UK.

Sonardyne will install a Mini-Ranger 2 underwater positioning system onboard Project Wilton's ARCIMS uncrewed surface vessel (USV) and fit AvTrak 6 Nano telemetry and tracking transceivers to the team's Iver 3 autonomous underwater vehicles (AUVs), all managed by SeeByte's autonomous networked acoustic communications system.

In addition, Sonardyne's SPRINT-Nav instrument will be integrated with the ARCIMS USV to provide an independent navigation reference in GNSS-denied environments.

This project will enable optimal uncrewed underwater vehicle (UUV) distribution for improved subsea communications and navigation in a range of challenging environments.

Ioseba Tena, Head of Defence at Sonardyne said, “Collaborative autonomy is part of the maritime defence road map. We need to enable more robots and have fewer operators in the underwater battlespace. Working alongside leaders in autonomy development like SeeByte, to make that vision a reality, as part of the Autonomy in Challenging Environments competition, is a significant step towards that goal.”

Andrea Munafo, Technical Program Manager at SeeByte said, “UMS operate in challenging environments and they need to be robust against faltering communications and navigation. Partnering with Sonardyne makes it possible for our autonomous systems to consider both during real-time execution and hence to improve the effectiveness of future underwater missions.”

DASA’s Autonomy in Challenging Environments competition is funded through the UK Ministry of Defence’s Chief Scientific Adviser’s Research Programme’s Autonomy Incubator project. Awards are made by DASA on behalf of Dstl.

The Autonomy Incubator project aims to identify and develop underpinning research and technologies to support the development and fielding of unmanned systems across defence. This work can be matured through the wider Dstl Autonomy Programme and other research and development programmes.

Dstl delivers high-impact science and technology for the UK's defence, security and prosperity. Dstl is an Executive Agency of the MOD with around 4,000 staff working across four sites: Porton Down, near Salisbury; Portsdown West, near Portsmouth; Fort Halstead, near Sevenoaks; and Alverstoke, near Gosport.

Location: UK
