Analog Compute: Key to the Next Era of AI

Article By : Tim Vehling, Mythic

Analog computing architectures offer power advantages for AI at the edge in industrial robots, security cameras and more.

As AI applications become more popular in a growing number of industries, the need for more compute resources, more model storage capacity and, at the same time, lower power consumption is becoming increasingly important. Today’s digital processors used for AI applications struggle to deliver these challenging requirements, especially for large machine learning models running at the edge. Analog compute offers an innovative solution, enabling companies to get more performance at lower power consumption in a small form factor that’s also cost efficient.

The computational speed and power efficiency of analog compared to digital have been promising for a long time. Historically, there have been a number of hurdles to developing analog systems, including the size and cost of analog processors. Recent approaches have shown that pairing analog compute with non-volatile memory (NVM) like flash memory – a combination called analog compute in-memory (CIM) – can eliminate these hurdles.

Tim Vehling (Source: Mythic)

Unlike digital computing systems, which rely on power-hungry, high-throughput DRAM, analog CIM systems can take advantage of the incredible density of flash memory for both data storage and computing. This eliminates the high power consumption that comes with accessing and maintaining data in DRAM in a digital computing system. With the analog CIM approach, processors can perform arithmetic operations inside NVM cells by manipulating and combining small electrical currents across the entire memory bank in a fast and low-power manner.

So, while digital processing systems struggle with increasing deep-learning workloads and higher power consumption, analog CIM can perform real-time processing, even with multiple large, complex deep neural networks at a fraction of the power of a digital processing system.

Significant power advantages are gained by being able to perform massively parallel vector-matrix multiplication and addition operations inside flash memory arrays. Tiny electrical currents are steered through flash memory arrays that store neural network weights, and the matrix multiplication results are accumulated through a series of analog-to-digital converters. By leveraging analog compute for inference operations, the power overhead of DRAM access and digital compute can be eliminated, and a large drop in total AI inference processing power consumption can be achieved.
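The current-steering multiply-accumulate described above can be sketched in software. The following is a minimal, illustrative model only: weights become flash-cell conductances, inputs become voltages, each bit line sums currents by Kirchhoff's current law, and an ADC digitizes the result. The array size, ADC resolution and current range are hypothetical assumptions, not Mythic's actual design.

```python
# Illustrative model of analog compute-in-memory (CIM) matrix multiplication.
# Neural-network weights are stored as flash-cell conductances (G); input
# activations are applied as voltages (V). By Ohm's and Kirchhoff's laws, each
# column's bit-line current is the dot product I_j = sum_i V_i * G[i][j], so a
# single "read" of the array performs a full vector-matrix multiply. All
# parameter values here (ADC bits, full-scale current) are hypothetical.

def cim_matvec(voltages, conductances, adc_bits=8, i_max=1.0):
    """Simulate one analog vector-matrix multiply with ADC quantization."""
    n_rows = len(conductances)
    n_cols = len(conductances[0])
    codes = []
    for j in range(n_cols):
        # Kirchhoff's current law: currents injected onto a shared
        # bit line simply add, accumulating the dot product for free.
        i_col = sum(voltages[i] * conductances[i][j] for i in range(n_rows))
        # The ADC clamps and digitizes the accumulated current.
        levels = 2 ** adc_bits - 1
        codes.append(round(max(0.0, min(i_col, i_max)) / i_max * levels))
    return codes

# Example: 3 input voltages against a 3x2 array of stored conductances.
v = [0.2, 0.5, 0.1]
g = [[0.4, 0.1],
     [0.3, 0.6],
     [0.2, 0.2]]
print(cim_matvec(v, g))  # one array read = one full vector-matrix multiply
```

Note that every column is computed in the same array read, which is where the massive parallelism of the physical array comes from; the software loop over columns exists only because code is sequential.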

There are also many second-order, system-level effects that deliver a large drop in power; for example, when power consumption is cut by up to 10x with analog computing, the thermal management system can be dramatically simpler, with no need for active cooling.

Cost, latency advantages

Analog computing systems also offer cost advantages, since processors with embedded NVM can be manufactured in mature semiconductor process nodes. These process nodes are typically lower cost and have far broader supply chain availability compared to bleeding-edge nodes, where all the capacity is consumed by a handful of companies. Furthermore, the analog CIM approach makes it possible to use a single flash transistor simultaneously for storing neural network weights and performing multiply and accumulate operations. This provides very dense storage of neural network weights and high-performance AI processing in a single chip, without the added cost of external DRAM and its associated components.

Another benefit of analog CIM systems is that they can be extremely fast, since they do not suffer the latency of data propagating through digital logic gates and on-chip memory, or of being written to and read from external DRAM. Massively parallel matrix operations can be performed on-chip in a fraction of the time they take in a digital processing system. This speed makes analog CIM systems ideal for computing-intensive AI workloads such as video analytics applications for object detection, classification, pose estimation, segmentation and depth estimation.

There is huge demand for faster processing in the industrial sector where robots running computer vision applications are used to improve productivity and safety. Drones are another market where analog CIM will drive new types of capabilities. Traditionally it’s been challenging to equip drones with high-definition cameras for computer vision applications that require running complex AI networks locally to provide immediate and relevant information to the control station. Processors that use analog compute make it possible to process these workloads locally while also being extremely power-efficient – enabling drones to make longer trips.

Thanks to these capabilities, we'll increasingly see drones used for agricultural monitoring, inspecting critical infrastructure such as power lines, and assessing fire damage.

Security cameras and surveillance solutions are also ideal for analog CIM processors. In legacy systems, cameras capture images of people and objects and send the video streams to a central video processing system – whether that is on-premises or in the cloud – for visual analysis; this is where the issues of privacy and data security come into play. A better alternative is to have cameras that use trained AI algorithms to detect specific sequences – accidents, crimes, or other events – and send only the metadata of the analysis, or just the footage of potential security incidents, for review. Enabling video security systems to process most data at the edge can help allay privacy concerns while still protecting the public, whether it's for traffic monitoring, incident detection or another critical security application.
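The edge-filtering pattern above can be made concrete with a small sketch: the camera runs detection locally and uploads only a compact metadata record, flagging footage for transmission only when an incident label appears. The event labels and record fields here are hypothetical illustrations, not any vendor's actual schema.

```python
# Illustrative sketch of edge filtering for a smart security camera: the
# device analyzes frames locally and sends only metadata upstream, keeping
# raw video on-device unless an incident is detected. Labels and the record
# format are hypothetical assumptions for this example.

import json

INCIDENT_LABELS = {"accident", "intrusion"}  # events that warrant sending footage

def summarize_frame(detections, timestamp):
    """Turn per-frame detections into a small metadata record for upload."""
    labels = [d["label"] for d in detections]
    return {
        "ts": timestamp,
        "counts": {lbl: labels.count(lbl) for lbl in set(labels)},
        # Raw pixels stay on the camera unless an incident label is present.
        "send_footage": any(lbl in INCIDENT_LABELS for lbl in labels),
    }

# Example: two people detected, one flagged as an intrusion event.
frame = [{"label": "person"}, {"label": "person"}, {"label": "intrusion"}]
record = summarize_frame(frame, timestamp=1700000000)
print(json.dumps(record, sort_keys=True))
```

A few bytes of JSON replace a continuous video stream, which is what makes the privacy and bandwidth argument work: the central system sees what happened, not everyone who walked by.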

All in all, analog compute is an ideal approach for AI processing because of its ability to operate at much lower power while achieving low latency in a small form factor. The power efficiency of analog compute technology will enable product designers to unlock new AI applications, even in small edge devices, for years to come.

This article was originally published on EE Times.

Tim Vehling is senior vice president for product and business development at Mythic.

