
FEATURE AUTONOMOUS NAVIGATION
Greensea IQ's new IQNS edge processing and inertial navigation system has been paired with the company's defense-focused EOD Edge software package. (Image courtesy Greensea IQ)

EvoLogics has introduced a new series of its Quadroin AUV that includes an AI object recognition module. (Image credit: Submaris and EvoLogics)
Its miniaturization extends to its georeferencing capability. Where other surface systems require the triangulation of three acoustic positioning systems (USBLs), Advanced Navigation has packed them into a single unit. Combined with the sensor fusion designed into the inertial navigation system, this means greater navigational accuracy, says senior software engineer Dr Alec McGregor.
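The principle behind that accuracy gain can be sketched in a few lines of Python: a dead-reckoned inertial estimate drifts over time, and each acoustic fix pulls it back toward ground truth. The sketch below is a deliberately simplified illustration with invented names and gains, not Advanced Navigation's algorithm.

# Minimal sketch: blend a drifting dead-reckoned (x, y) estimate with an
# occasional acoustic (USBL-style) position fix. Names and the blending
# gain are assumptions for illustration only.
def fuse_position(ins_estimate, usbl_fix=None, gain=0.3):
    """Return the INS estimate, nudged toward the acoustic fix when one arrives."""
    if usbl_fix is None:
        return ins_estimate
    x, y = ins_estimate
    fx, fy = usbl_fix
    return (x + gain * (fx - x), y + gain * (fy - y))

# Dead reckoning adds a small error each step; a fix every fifth step bounds the drift.
estimate = (0.0, 0.0)
for step in range(1, 11):
    estimate = (estimate[0] + 1.0, estimate[1] + 0.05)    # predicted motion (with drift)
    fix = (float(step), 0.0) if step % 5 == 0 else None   # intermittent acoustic fix
    estimate = fuse_position(estimate, fix)
    print(step, estimate)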
Water Linked, a company that was part of the NATO DIANA accelerator program, has developed a 3D sonar system, the 3D-15, that provides imagery in low-visibility environments in real time – something 2D sonar cannot achieve without combining data from multiple scans. "Each acoustic 'ping' generates a complete 3D image without the need for scanning, motion compensation or post-processing," says Johnny Broeders, Marketing Specialist. "When moving around, an AUV, ROV, USV, boat or diver can immediately create a single 90x40 degree clear 3D point cloud of what is seen underwater. This is especially useful in underwater terrains where you can't pick up anything with a camera. With our API you can then merge the sonar data with position information and third-party software to create an extensive map of the underwater area. The sonar output also has the potential to increase autonomous navigation capabilities underwater and is a big step in achieving full spatial awareness."
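Broeders' point about merging per-ping point clouds with position data is, at its core, a frame transformation: rotate each sonar-frame point by the vehicle's heading, translate by its position, and accumulate the results into one map. The Python sketch below illustrates that step only; the function and data layout are assumptions, not Water Linked's actual API.

# Hedged sketch: georeference per-ping 3D sonar points into a common map
# frame using the vehicle's 2D position and heading.
import math

def georeference(points, vehicle_xy, heading_rad):
    """Rotate sonar-frame (x, y, z) points by heading and translate by position."""
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    vx, vy = vehicle_xy
    return [(vx + cos_h * x - sin_h * y,
             vy + sin_h * x + cos_h * y,
             z) for x, y, z in points]

# Accumulate two pings taken from different poses into one map point cloud.
map_cloud = []
map_cloud += georeference([(5.0, 1.0, -2.0)], (0.0, 0.0), 0.0)
map_cloud += georeference([(5.0, 1.0, -2.0)], (2.0, 0.0), math.pi / 2)
print(map_cloud)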
EvoLogics has introduced a new series of its Quadroin AUV that includes an AI object recognition module based on input from forward-looking side-scan sonar and two full-HD underwater cameras. Francisco Bustamante, Director Sales and Operations, says: "New processing capabilities have increased to the point that the vehicle's options can now include real-time object collision avoidance that recognizes the objects and navigates around them. And while before a vehicle would collect data which could be reviewed only after the mission had been completed, onboard AI processing can nowadays detect objects of interest in real time and reduce the amount of required data transmitted to the point that the operator is notified in real-time about the findings."
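The bandwidth saving Bustamante describes comes from transmitting compact detection summaries rather than raw sensor data. A toy version of that filtering step is sketched below; the detector output format and threshold are invented for illustration, not EvoLogics' implementation.

# Illustrative sketch: keep only confident onboard detections and pack them
# into short text messages suitable for a low-bandwidth acoustic uplink.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float
    range_m: float
    bearing_deg: float

def summarize_for_uplink(detections, min_confidence=0.8):
    """Filter detections and format compact operator notifications."""
    return [
        f"{d.label}:{d.confidence:.2f}@{d.range_m:.0f}m/{d.bearing_deg:.0f}deg"
        for d in detections
        if d.confidence >= min_confidence
    ]

frame_detections = [
    Detection("pipeline", 0.93, 42.0, 310.0),
    Detection("shadow", 0.41, 18.0, 95.0),   # low confidence, never transmitted
]
print(summarize_for_uplink(frame_detections))  # ['pipeline:0.93@42m/310deg']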
Greensea IQ's new IQNS edge processing and inertial navigation system has been paired with the company's defense-focused EOD Edge software package to provide enhanced autonomy and platform stability for applications such as automatic target recognition. It has been field tested on a VideoRay Defender ROV, and David Pearson, Technical Solutions Architect, says the software can perform advanced onboard analytics that enable the IQNS to process data from the forward-looking sonar to detect and classify features in real time. For example, it can identify and track potential obstacles and map them to aid navigation.

"We're also looking ahead to future applications of edge processing. For instance, we're developing generative AI tools that dynamically prioritize tasks and adapt missions in real time based on environmental changes or new operator goals." Additionally, the company is looking to use edge AI to support operations with vehicle-mounted manipulators for precise intervention tasks, such as object retrieval or repairs.

Dr Alexander Philips, Head of Marine Autonomous and Robotic Systems (MARS) at the National Oceanography Centre, sees machine learning as bringing more efficient mission planning, moving away from pre-defining waypoints to goal-based planning and high-level objectives. "There are a couple of ways we can put more advanced algorithms into operations. We can put algorithms on board the vehicle, for example if you want the vehicle to react to what it is measuring and the sensors that it has on board. That's within the confines of what's on board the vehicle.

"But we could also do what we're calling server-side autonomy. This is looking at a much bigger set of external data sets, such as weather and ocean forecast data, ship AIS data and even satellite data, to inform where your vehicle needs to go. The algorithms do their work onshore and send the information that's needed to the AUV.

"In the future, combining these approaches could be really powerful."
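A rough sketch of that server-side loop: a shore-based planner scores candidate goals against forecast and traffic data, then uplinks only the chosen goal to the vehicle. The data sources and cost terms below are placeholders, not the National Oceanography Centre's software.

# Hedged sketch of server-side autonomy: heavy planning runs onshore against
# large external data sets; only a compact goal update is sent to the AUV.
def plan_onshore(candidate_goals, current_forecast, ais_tracks):
    """Pick the goal with the lowest combined cost from ocean-current forecast
    penalties and proximity to surface traffic (AIS)."""
    def cost(goal):
        current_penalty = current_forecast.get(goal, 1.0)
        traffic_penalty = sum(1.0 for track in ais_tracks if track == goal)
        return current_penalty + traffic_penalty
    return min(candidate_goals, key=cost)

def uplink_to_auv(goal):
    """Stand-in for sending a compact goal update over a satellite or acoustic link."""
    print(f"new goal for AUV: {goal}")

goals = ["site_A", "site_B", "site_C"]
forecast = {"site_A": 0.2, "site_B": 0.9, "site_C": 0.4}   # strong currents at B
ais = ["site_C"]                                            # ship traffic near C
uplink_to_auv(plan_onshore(goals, forecast, ais))           # -> site_A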