Direct visual SLAM is the state of the art in 3D computer vision. Compared with traditional feature-based SLAM, this Direct Sparse Odometry approach works reliably in featureless, repetitive and complex outdoor environments.
The technology offer presents software based on direct visual SLAM. It turns 2D images from off-the-shelf cameras into a precise understanding of position and the surrounding 3D environment, enabling accurate, robust and safe navigation of autonomous robots as well as advanced spatial intelligence applications. It does not require expensive sensors such as LiDAR, and it is more independent and reliable than Global Navigation Satellite Systems (GNSS) or HD maps.
Besides cameras, the input from IMUs, GNSS, wheel odometry or LiDAR may be optionally fused for increased reliability or to address edge cases of particular applications.
The technology consumes sensor information and provides input for planning and control systems of autonomous robots or vehicles. The real-time positioning, 3D point clouds and geo data are also suitable for fleet and smart city applications.
- ROS 1 & 2 supported; other interfaces can be customized
- GNSS-independent positioning indoors and outdoors
- Weather and season independent visual positioning
- AI features to address edge cases, e.g., dynamic object masking for operation in crowded and dynamic environments
- Wheel odometry and other sensor fusion extensions optional
- Up to 100 Hz output rate of 6DoF poses; other outputs available: point cloud, confidence, velocities and more
- Accurate vertical as well as horizontal positioning
- Large-scale mapping supported by a cloud service for centralized map creation, maintenance and distribution to all devices
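As a minimal sketch of how a downstream planning or fleet component might consume the 6DoF pose output described above: the `Pose6DoF` container and `yaw_deg` helper below are illustrative assumptions, not part of the product's actual API, which follows ROS message conventions instead.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose6DoF:
    """Hypothetical 6DoF pose: position in metres plus a unit quaternion (w, x, y, z)."""
    x: float
    y: float
    z: float
    qw: float
    qx: float
    qy: float
    qz: float


def yaw_deg(p: Pose6DoF) -> float:
    """Heading about the vertical (z) axis, in degrees, extracted from the quaternion."""
    siny_cosp = 2.0 * (p.qw * p.qz + p.qx * p.qy)
    cosy_cosp = 1.0 - 2.0 * (p.qy * p.qy + p.qz * p.qz)
    return math.degrees(math.atan2(siny_cosp, cosy_cosp))


# Example: a pose rotated 90 degrees about the vertical axis.
pose = Pose6DoF(1.0, 2.0, 0.0,
                math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(round(yaw_deg(pose), 1))  # 90.0
```

In a ROS deployment the same information would arrive as a stamped pose message on a topic; the helper simply shows that both horizontal position and heading are directly usable from a single 6DoF output.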
The technology is available as a hardware reference system and software for development.
The positioning and mapping software is suitable across multiple applications. It has been optimized specifically for autonomous outdoor robots and material handling equipment in the logistics sector.
Automotive:
- Autonomous Valet Parking: Parking vehicles in garages and other GNSS-denied environments without available HD maps
- Augmented Reality Navigation: 6DoF position combined with HD maps to create detailed visual guidance instructions for drivers
- Advanced Driver Assistance Systems (ADAS) & Autonomous Driving (AD): Higher-level ADAS and fully autonomous driving supported by software modules ranging from VIO dead reckoning to HD positioning within HD maps
Material handling and robotics:
- Material Handling and Logistics: Automation of equipment in gated areas such as logistics yards, ports, airports or factories
- Service Robots & Last-mile Delivery: Autonomous last-mile delivery or security robots that need to navigate GNSS-denied tunnels and urban canyons in highly dynamic scenes
Fleet and smart city:
- Fleets: Tracking and management of mobility, logistics, garbage collection and other fleets with centimetre- or lane-level accuracy for optimized routing or coordination with other services
- Geo Info Systems: Collecting high-accuracy position, 3D and image data for digital twins enabling deep insights into cities, assets and traffic conditions