3D Vision for Autonomous Robots & Industry 4.0


Infocomm - Artificial Intelligence
Infocomm - Enterprise & Productivity


With breakthroughs in 3D vision-based localisation for GPS-denied environments and expertise in sensor fusion, this technology offer presents a promising sensing solution for advanced robotics and automation in Industry 4.0. The solution is built upon a proprietary vision-based sensor with edge computing capabilities, providing robust calibration algorithms and controls across multi-sensor deployments.

The sensing technology enables robots to:

  1. Navigate in GPS-denied environments, across indoor and outdoor spaces
  2. Self-localise and track moving objects using vision
  3. Identify objects and take volumetric measurements in 3D

For Industry 4.0, the following areas can be supported:

  1. 3D visual data acquisition with simple onboard processing, transmitting only the required information to the main processing board
  2. Smart CCTV monitoring & image / object identification
  3. Depth perception and measurements for logistics
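To make the depth-based volumetric measurement above concrete, here is a minimal sketch of how box volumes could be estimated from a calibrated depth map against a known floor plane. The function name, parameters, and calibration constants are hypothetical illustrations, not part of the offered API:

```python
import numpy as np

def estimate_volume(depth_map, floor_depth, pixel_area_at_1m):
    """Estimate the volume (m^3) of objects standing on a known floor plane.

    depth_map:        2D array of per-pixel depths in metres from the sensor
    floor_depth:      depth in metres of the empty floor plane
    pixel_area_at_1m: real-world area (m^2) one pixel covers at 1 m range
                      (a hypothetical calibration constant)
    """
    # Per-pixel object height: how far each surface point sits above the floor
    height = np.clip(floor_depth - depth_map, 0.0, None)
    # A pixel's real-world footprint grows with the square of its range
    pixel_area = pixel_area_at_1m * depth_map ** 2
    # Sum height x footprint over all pixels to integrate the volume
    return float(np.sum(height * pixel_area))
```

For example, a 2x2-pixel box top 0.5 m above a 2 m floor, seen at 1.5 m range with a 0.01 m² per-pixel footprint at 1 m, integrates to 4 × 0.5 × 0.01 × 1.5² = 0.045 m³.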

The technology owner is currently looking for operators that would like to transform their business operations through the use of autonomous robots, and to deploy a complete visual monitoring system to optimise their operations on the ground. The technology owner is also looking to work with system integrators (e.g. robot makers) to adopt and integrate the technology. The system is fully customizable and configurable based on the user’s requirements.


The solution can provide:

  • 360-degree omnidirectional coverage
  • Wide-angle perception coverage of 178 degrees per sensor
  • Flexibility in sensor placement and connectivity options
  • Faster response due to low-latency onboard processing
  • Embedded Visual Simultaneous Localisation and Mapping (VSLAM) enables autonomous indoor operation without the need for the Global Positioning System (GPS)
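As a toy illustration of the pose tracking at the heart of VSLAM, the sketch below composes relative motion estimates from successive camera frames into a global pose, assuming planar 2D motion and omitting mapping and loop closure. The function and its parameters are illustrative assumptions, not the offered middleware:

```python
import math

def integrate_odometry(pose, delta):
    """Compose a body-frame motion increment (dx, dy, dtheta) onto a
    global pose (x, y, theta) — the accumulation step of a visual
    odometry / VSLAM front-end (mapping and loop closure omitted)."""
    x, y, theta = pose
    dx, dy, dtheta = delta
    # Rotate the body-frame increment into the world frame, then translate
    x += dx * math.cos(theta) - dy * math.sin(theta)
    y += dx * math.sin(theta) + dy * math.cos(theta)
    theta = (theta + dtheta) % (2 * math.pi)
    return (x, y, theta)
```

Starting from the origin, moving 1 m forward while turning 90 degrees and then moving 1 m forward again leaves the robot at roughly (1, 1) facing along the y-axis.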


The core technology is versatile, so its application potential is wide and spans several industries.

The primary areas of focus are the following:

  • Autonomous Mobile Robots (AMRs)
    • Warehousing
    • Delivery Robots
    • Service Robots
    • Collaborative Robots
  • Static Infrastructures
    • Inventory warehouse tracking
    • Anomaly Detection
    • Volumetric and object identification analysis
    • Consumer behavioral tracking in retail stores

Another key market would be AI companies conducting analytics using acquired visual data for activities such as manufacturing fault detection and inspection.

Other markets considered include defence science systems, supply chain & logistics, general security systems, and others that require a blend of surveillance and information-processing capabilities. The aim is to create an entire vision-based data monitoring and acquisition ecosystem on the ground, with autonomous robots using the monitoring visuals acquired from the sensors to assist with repetitive manual work. The CAGR is expected to be at least 6.1%, with the potential for the market to grow beyond USD 13 billion by 2025 as the technology is adopted by more companies.

Unique Value Proposition

  1. Save time and cost required for R&D, as ready-to-use hardware & middleware APIs are provided. New APIs and hardware can be further customized and developed based on the user’s requirements.
  2. Vision-based sensing for robots
    a. No pre-mapping required, reducing deployment time
    b. Able to operate indoors & outdoors
  3. Industry 4.0
    a. Data acquisition & processing at the edge, optimizing data flow
    b. Smart, collaborative communication between infrastructure & moving parts (e.g. mobile robots), building a smart factory with an autonomous ecosystem
  4. Improve overall operational efficiency with a range of capabilities enabled by vision
  5. Safety and return on investment in machinery are enhanced, as robots are truly autonomous with minimal to zero human intervention required