TECH OFFER

Autonomous UAV System Using Multi-Modality Sensor Fusion

KEY INFORMATION

TECHNOLOGY CATEGORY:
Infocomm - Robotics & Automation
Infocomm - Smart Cities
TECHNOLOGY READINESS LEVEL (TRL):
LOCATION:
Singapore
ID NUMBER:
TO174540

TECHNOLOGY OVERVIEW

The proposed technology is a fully integrated navigation solution for urban structure inspection. The system is built on visual-inertial-ranging-lidar (VIRAL) fusion-based simultaneous localization and mapping (SLAM). The VIRAL SLAM algorithm works both indoors and outdoors, in all lighting conditions. The team has developed prototype devices that integrate all the necessary sensors and software and are compatible with popular drones.

In urban navigation, GPS signals are prone to multipath effects, and drones relying on them are vulnerable to perception noise that can lead to crashes. The proposed system effectively addresses these problems and can achieve millimetre-level accuracy.

The target users are drone or robotics service providers, as well as third-party developers creating new onboard devices for autonomous operations.

Possible partnership models include technology licensing, research collaboration, and contracts for further development of the technology.

TECHNOLOGY FEATURES & SPECIFICATIONS

The technology offer uses a tightly-coupled, multi-modal simultaneous localization and mapping (SLAM) framework that integrates an extensive set of sensors: an IMU, cameras, multiple lidars, and ultra-wideband (UWB) range measurements; hence it is referred to as VIRAL (visual-inertial-ranging-lidar) SLAM. Achieving such a comprehensive sensor fusion system requires tackling several challenges, including data synchronization, multi-threaded programming, bundle adjustment (BA), and conflicting coordinate frames between UWB and the onboard sensors, so as to ensure real-time localization and smooth updates to the state estimates.
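The data-synchronization challenge can be illustrated with a minimal sketch (the function names and the tolerance value are illustrative, not taken from the actual system): each lidar sweep is paired with the camera frame and UWB range whose timestamps fall closest to it, and sweeps with no close-enough partner are dropped.

```python
from bisect import bisect_left


def nearest(timestamps, t):
    """Index of the entry in a sorted, non-empty timestamp list closest to t."""
    i = bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t))


def synchronize(lidar_ts, camera_ts, uwb_ts, tolerance=0.02):
    """Bundle each lidar sweep with the nearest camera frame and UWB range.

    Sweeps whose nearest partner in either stream lies more than `tolerance`
    seconds away are discarded rather than matched to stale data.
    """
    bundles = []
    for t in lidar_ts:
        ci = nearest(camera_ts, t)
        ui = nearest(uwb_ts, t)
        if abs(camera_ts[ci] - t) <= tolerance and abs(uwb_ts[ui] - t) <= tolerance:
            bundles.append((t, camera_ts[ci], uwb_ts[ui]))
    return bundles
```

A real pipeline would additionally interpolate the high-rate IMU stream between bundles rather than pick a single nearest sample, but the pairing-by-tolerance idea is the same.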

To this end, a two-stage approach was proposed. In the first stage, lidar, camera, and IMU data on a local sliding window are processed in a core odometry thread. From this local graph, new keyframes are evaluated for admission to a global map. Visual feature-based loop closure is also performed to supplement the global factor graph with loop constraints. When the global factor graph satisfies a spatial-diversity condition, the BA process is triggered, which updates the coordinate transform between the UWB and onboard SLAM frames. The system then seamlessly transitions to the second stage, in which all sensors are tightly integrated in the odometry thread. The capability of the system has been demonstrated in several experiments on high-fidelity graphical-physical simulation and on public datasets.
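The two-stage transition can be sketched as a simple state machine. This is a deliberate simplification under stated assumptions: positions are 2D, the spatial-diversity condition is taken to be the diagonal of the keyframe bounding box, and the UWB-to-SLAM alignment is translation-only. The actual system estimates a full coordinate transform via bundle adjustment.

```python
import math


class TwoStageFusion:
    """Illustrative sketch of the two-stage scheme: stage 1 runs odometry
    without UWB; once the keyframes are spatially diverse enough, an
    alignment between the UWB frame and the SLAM frame is estimated and
    the system switches to stage 2 (tight integration of all sensors)."""

    def __init__(self, diversity_threshold=5.0):
        self.threshold = diversity_threshold
        self.keyframes = []      # keyframe positions in the SLAM frame
        self.uwb_fixes = []      # matching positions in the UWB frame
        self.stage = 1
        self.uwb_offset = None   # estimated SLAM-to-UWB translation

    def spatial_diversity(self):
        """Diagonal of the bounding box spanned by the keyframes."""
        if len(self.keyframes) < 2:
            return 0.0
        xs = [p[0] for p in self.keyframes]
        ys = [p[1] for p in self.keyframes]
        return math.hypot(max(xs) - min(xs), max(ys) - min(ys))

    def add_keyframe(self, slam_pos, uwb_pos):
        self.keyframes.append(slam_pos)
        self.uwb_fixes.append(uwb_pos)
        if self.stage == 1 and self.spatial_diversity() >= self.threshold:
            # Least-squares translation between the two frames
            # (stand-in for the full BA-based transform estimation).
            n = len(self.keyframes)
            dx = sum(u[0] - s[0] for s, u in zip(self.keyframes, self.uwb_fixes)) / n
            dy = sum(u[1] - s[1] for s, u in zip(self.keyframes, self.uwb_fixes)) / n
            self.uwb_offset = (dx, dy)
            self.stage = 2
```

The point of the spatial-diversity gate is that the frame alignment is only well-conditioned once the trajectory has covered enough ground; triggering BA too early would lock in a poorly observed transform.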

POTENTIAL APPLICATIONS

The primary application is inspection, for example of aircraft, ships, cranes, or building façades.

The technology can also be used in other applications such as urban food delivery missions, warehouse asset checking, urban courier services, mapping & surveying, disaster zone navigation/mapping, or even navigation inside mining environments.

UNIQUE VALUE PROPOSITION

Taking inspection as an example, traditional approaches typically:

  • Have a professional engineer (PE) assess defects from photos taken at ground level, which often yields many false negatives;
  • Employ rope-access workers to descend over the roof ledge for close-up inspection, which increases cost, time, and risk;
  • Use a manually piloted drone for closer inspection, which carries higher risk when the drone holds position by GPS/IMU fusion and avoids obstacles using inaccurate stereo vision.

The proposed solution simplifies the inspection process by:

  • Using active lidar-based perception for ground-truth-level depth accuracy, significantly increasing the safety margin.
  • Using VIRAL SLAM to provide accurate localization and position holding based on the fusion output; this reduces the burden on the pilot and enables fully autonomous operation, lowering cost and pilot time.