The company utilises machine vision, machine learning, and augmented reality (AR) to give real-time feedback on whether a process or component has been completed correctly. This is achieved by first modelling, through the algorithms, a process performed by an expert, and then setting up a monitoring workflow tied to the production line. Alternatively, verification can be performed directly against a computer-aided design (CAD) model.
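As a rough illustration of CAD-based verification, a measured part can be compared against nominal dimensions and tolerances taken from the CAD model. This is only a minimal sketch with hypothetical feature names and values; the company's actual algorithms are proprietary and not described in the source.

```python
# Hypothetical sketch: check measured feature dimensions against CAD nominals.
# Feature names, nominal values, and tolerances are illustrative only.

def verify_against_cad(nominals, measurements):
    """Return per-feature pass/fail given
    nominals: {feature: (nominal_value, tolerance)} and
    measurements: {feature: measured_value}."""
    results = {}
    for name, (nominal, tolerance) in nominals.items():
        measured = measurements.get(name)
        if measured is None:
            results[name] = "missing"
        elif abs(measured - nominal) <= tolerance:
            results[name] = "pass"
        else:
            results[name] = "fail"
    return results

cad = {"hole_diameter_mm": (10.0, 0.05), "flange_width_mm": (25.0, 0.1)}
scan = {"hole_diameter_mm": 10.03, "flange_width_mm": 25.2}
print(verify_against_cad(cad, scan))  # hole within tolerance, flange out of tolerance
```

In a real system the `scan` values would come from the vision pipeline rather than being typed in by hand, and the pass/fail result would drive the AR feedback shown to the worker.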
At the end of the process, an automated report can be generated containing the build results, a record of who worked on the build, the conditions under which it was built, and the inspection results. The company build their own cameras and software, so the system can be customised to required specifications.
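The fields such a report might carry can be sketched as a small data structure. The field names and schema here are assumptions for illustration; the company's actual report format is not described in the source.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BuildReport:
    # Illustrative fields only; the real report schema is proprietary.
    build_id: str
    operator: str
    conditions: dict          # e.g. station ID, ambient temperature
    inspection_results: dict  # step name -> "pass" / "fail"

    def passed(self) -> bool:
        """True only if every inspected step passed."""
        return all(v == "pass" for v in self.inspection_results.values())

report = BuildReport(
    build_id="B-1042",
    operator="operator_7",
    conditions={"station": "A3", "temp_c": 21.5},
    inspection_results={"weld_seam": "pass", "torque_check": "pass"},
)
print(json.dumps(asdict(report), indent=2))  # serialisable for archiving
```

Serialising the report to JSON (or a similar format) would allow the build record to be archived alongside the inspection imagery.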
The company have also developed their own AR indoor and outdoor navigation system. By combining this system with the platform described above, it is possible to guide a user to a specific location (e.g. where a particular part is stored); to track, monitor, inspect, and verify a process; and to follow material and people flows, real-time production updates, and live inventory levels.
The technology consists of hardware and software. The hardware comprises proprietary cameras customised for the tasks required: a combination of standard cameras, infrared cameras for depth sensing, or even thermographic cameras, depending on the task at hand and the accuracy needed. The company also provide servers for industry clients with on-premise requirements.
The software is composed of machine learning algorithms and the interfaces required for communication between the different software and hardware components: cameras, sensors, phones, tablets, laptops, headsets, servers, CAD software, and so on. This is what enables real-time verification with AR, as well as AR indoor and outdoor navigation and tracking.
The company will provide guidance on how to carry out procedures through AR.
With this technology platform, monitoring and inspection workflows can be automated, so it can immediately be ascertained whether workers have completed tasks correctly.
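One way to picture such an automated monitoring workflow is as a comparison between the sequence of steps the vision system detects and the expected sequence for the procedure. The step names and the simple in-order check below are hypothetical; the company's actual monitoring logic is not described in the source.

```python
# Hypothetical sketch: the vision system emits detected step events, and the
# monitor flags any step that arrives out of the expected order.

EXPECTED_STEPS = ["place_part", "fasten_bolts", "apply_sealant", "final_check"]

def monitor(detected_steps):
    """Compare detected step events against the expected workflow,
    returning (step, feedback) pairs suitable for real-time AR display."""
    feedback = []
    for i, step in enumerate(detected_steps):
        expected = EXPECTED_STEPS[i] if i < len(EXPECTED_STEPS) else None
        if step == expected:
            feedback.append((step, "ok"))
        else:
            feedback.append((step, f"expected '{expected}'"))
    return feedback

print(monitor(["place_part", "apply_sealant"]))
# second event deviates: sealant was applied before the bolts were fastened
```

In practice the feedback would be rendered as AR overlays at the workstation, prompting the worker to correct the deviation before the build continues.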
This platform can also function as a knowledge transfer technology, offering users tutelage in skills, tools, and capabilities they did not previously have. Users will be able to switch between different camera feeds to see the environment in different ways, such as infrared, thermographic, or a standard camera view, with each perspective augmented with information that helps users make better decisions.