This technology offer is a small, secure, low-power and cost-effective hardware module with a dedicated convolutional neural network (CNN) accelerator. The accelerator supports a wide range of pre-trained artificial intelligence (AI) models, allowing edge processing at the IoT sensor node itself. As such, real-time AI inference can be performed without the need for high-bandwidth communications to the cloud. Simple API calls through one of several communication interfaces make adding AI capability to IoT nodes an easy task.
At the heart of this module is a dual-core 64-bit processor with a floating point unit (FPU) and a dedicated CNN accelerator that delivers 0.23 TOPS at under 300 mW (module only). Because security is a priority for IoT use cases, the module uses hardware-based security to protect communications with cloud providers such as Amazon Web Services and Google Cloud Platform. The module measures 33 mm x 17.8 mm, and a development kit is available.
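To give a feel for how the "simple API calls" mentioned above might look from a host application, the sketch below sends an image over a serial link to the module and reads back an inference result. This is illustrative only: the command format, model name, serial port, and baud rate are assumptions rather than the module's actual API, and the example assumes the pyserial package is available.

```python
# Hypothetical sketch only: the module's real command set and wire protocol
# are not specified in this offer. This assumes a UART interface, the
# pyserial package, and an invented JSON command format for illustration.
import json

import serial  # pip install pyserial


def classify_image(port: str, image_bytes: bytes) -> dict:
    """Send an image frame to the AI module and return its inference result."""
    with serial.Serial(port, baudrate=115200, timeout=5) as link:
        # Invented request header: which model to run and the payload length.
        header = json.dumps({"cmd": "infer", "model": "mobilenet_v2",
                             "length": len(image_bytes)}).encode() + b"\n"
        link.write(header)
        link.write(image_bytes)

        # Invented response format: one JSON line with a label and a score.
        response = link.readline()
        return json.loads(response)


if __name__ == "__main__":
    with open("sample.jpg", "rb") as f:
        result = classify_image("/dev/ttyUSB0", f.read())
    print(result)  # e.g. {"label": "person", "score": 0.91}
```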
This technology is targeted at edge inferencing for IoT nodes, such as:
According to MarketsAndMarkets, the overall edge AI hardware market is expected to grow at a CAGR of 20.64% over the 2019-2024 forecast period. Key drivers include the growing demand for edge computing in IoT and for dedicated AI processors that enable on-device image analytics.
The following are the customer benefits: