Using Aizip's Visual Wake Words model, Maxim's MAX78000 neural-network microcontroller detects people at just 0.7 mJ per inference.
Using Aizip Inc.’s Visual Wake Words (VWW) model, Maxim Integrated Products Inc.’s MAX78000 neural-network microcontroller detects people at just 0.7 mJ per inference, less than 1/100th the energy of conventional software solutions, making it the most economical and efficient IoT person-detection solution available.
The MAX78000 low-power, neural-network-accelerated microcontroller executes AI inferences at less than 1/100th the energy of conventional software solutions, dramatically extending runtime for battery-powered edge AI applications such as building energy management and smart security cameras. The mixed-precision VWW network is part of the Aizip Intelligent Vision Deep Neural Network (AIV DNN) series for image and video applications and was developed with Aizip’s proprietary design automation tools to achieve greater than 85% human-presence detection accuracy.
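The practical impact of a 100x energy reduction can be illustrated with a back-of-envelope calculation. The battery capacity below is an illustrative assumption (a common 3 V, 225 mAh coin cell), not a figure from the release:

```python
# Back-of-envelope: inferences per battery charge at 0.7 mJ per inference
# vs. a conventional ~70 mJ software inference (the 100x figure).
# Battery capacity is an illustrative assumption, not from the source.
CR2032_CAPACITY_J = 3.0 * 0.225 * 3600  # 3 V x 225 mAh coin cell, ~2430 J


def inferences_per_charge(energy_per_inference_mj: float) -> int:
    """Inferences one full charge supports, ignoring all other current draw."""
    return int(CR2032_CAPACITY_J / (energy_per_inference_mj * 1e-3))


max78000 = inferences_per_charge(0.7)   # roughly 3.5 million inferences
software = inferences_per_charge(70.0)  # roughly 35 thousand inferences
print(f"MAX78000: {max78000:,}  conventional: {software:,}  "
      f"ratio: {max78000 // software}x")
```

Under these assumptions the accelerated inference runs millions of detections on a single coin cell, versus tens of thousands for a conventional software implementation.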
“The combination of Maxim Integrated’s ultra-low-power chip solutions and Aizip’s compact AI models is an important development that will enable many novel and exciting applications in the IoT world,” said Professor Bruno Olshausen of UC Berkeley, a widely recognized expert in neural computation and neural-network models who also serves as an advisor to Aizip, a company focused on AI for Internet of Things (IoT) applications.
“The MAX78000 architecture, toolchain, and example code and models made it easy to get started and hit our accuracy, latency, and power targets on schedule,” said Yuan Lu, Co-Founder and President, Aizip.
Robert Muchsel, Maxim Integrated Fellow and architect of the MAX78000 microcontroller, noted, “Aizip was quick to exploit our per-layer quantization capability to reduce weight storage and achieve a compact, energy-efficient model for human detection. I look forward to working with them on future projects.”
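The per-layer quantization mentioned above can be sketched in a few lines: each layer gets its own bit width and scale factor, so small, sensitive layers can keep 8 bits while larger layers drop to 4, shrinking total weight storage. This is a minimal illustrative sketch using symmetric uniform quantization; the layer names, shapes, and bit widths are hypothetical, not Aizip's actual model:

```python
import numpy as np

def quantize_layer(weights: np.ndarray, bits: int):
    """Symmetric uniform quantization of one layer's weights.

    Returns the integer codes and the single per-layer scale factor.
    """
    qmax = 2 ** (bits - 1) - 1                  # e.g. 127 for 8-bit, 7 for 4-bit
    scale = np.max(np.abs(weights)) / qmax      # one scale per layer
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

# Hypothetical two-layer model: per-layer bit widths chosen for illustration.
rng = np.random.default_rng(0)
layers = {
    "conv1": (rng.standard_normal((3, 3, 8)), 8),   # small layer, 8-bit
    "conv2": (rng.standard_normal((3, 3, 64)), 4),  # larger layer, 4-bit
}

fp32_bits = sum(w.size * 32 for w, _ in layers.values())
quant_bits = sum(w.size * b for w, b in layers.values())
print(f"weight storage: {fp32_bits} -> {quant_bits} bits "
      f"({fp32_bits / quant_bits:.1f}x smaller)")
```

Letting the large layer drop to 4 bits is what drives the savings: in this sketch, storage shrinks by roughly 7x versus 32-bit floats, while the small first layer keeps full 8-bit precision.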