AI and the Real Capacity Crisis in Chip Design

Article By: Stelios Diamantidis, Synopsys

It's the dawn of the self-designing chip. AI-based chip design can relieve some of the strain on engineering teams and maximize productivity.

Chip industry veterans are used to the cyclical nature of semiconductor supply and demand, but the ongoing chip shortage has been particularly tough for many. Supply chain disruptions will likely persist in the coming years and the semiconductor sector is unlikely to return to old norms.

There's a more pressing crisis on the horizon, however, that will bring the semiconductor industry to its next turning point: a shortage of engineering throughput that will persist unless we optimize the chip design process itself.

Persistent chip shortages appear to be due to relatively short-term economic factors. But thinking about chip design in a different way could open new opportunities for advances in chip production. Disruptions in semiconductor design certainly didn't start the global chip shortage, but they are doing their part to exacerbate the crisis.

Talent crisis


Supply-demand economics dictate that when a supply shortage occurs, demand will quickly drive new investment to fill the supply gap. Why is that not the case with the current supply crisis?

Part of the reason is that every chip is specifically designed and optimized to the specifications of a fabrication process. Photomasks, used to produce a pattern on a substrate, are extremely rigid and can't easily be redesigned to accommodate different specifications. Think analog vinyl records: Once music is engraved onto the lacquer disc, you can't modify the vinyl to play a different tune.

Optimizing a design for multiple target processes is uneconomical at best and practically infeasible: it not only requires doubling the workforce within a design team but also demands access to scarce specialized expertise. This is where the chip industry is hitting another wall.

A critical shortage of technical talent is catching up with the chip industry and everything it powers, all while short-staffed teams face ever more complicated challenges. The chips that enable seemingly everything in our everyday lives, from mobile devices, home appliances and cars to the industrial equipment that manufacturers use to create these machines, are growing in complexity.

These ever-advancing technologies put increasing pressure on chip designers to keep pace with the demands of consumers, market leaders, corporate competition and stakeholders, stretching the limits of what, and how quickly, manufacturers can produce. The predicted end of Moore's Law looms, and a new AI-based approach could be what's needed to avoid a reckoning.

AI for chip optimization

AI is enabling new technology realms such as autonomous vehicles and smart devices. Many of these AI-powered applications require power-hungry, complex chips. The level of research, experimentation and management required to design these devices is surpassing human capabilities.

As customers continue to reveal how overloaded and under-resourced engineering teams are, it's crucial that we find ways to free them of the tasks that could be handled by AI.

Engineers continue to develop AI-based tools that operate independently, analyze huge data streams and learn from experience. The same engineers who build autonomous vehicles, smart devices and machine-learning systems that process data at lightning speed are now relying on the very technologies they produce: It's the dawn of the self-designing chip. Autonomous chip design can remove some of the strain on engineering teams and maximize productivity.

Transitioning chip design from a manual to an automated process isn’t a new concept. Since the 1980s, engineers have used software tools to deploy automation and improve chip design. EDA tools are a necessity for modern chip design, but the appetite for bigger and better technologies — which in turn requires bigger and better chips — continues to grow. Design engineers are learning that AI can help them fill that demand.

Take, for example, digital implementation, one of the most complicated aspects of the IC design process. Place-and-route tools have largely kept pace with advancing process technologies, determining where to place logic and IP blocks as well as how to route the traces and interconnects.

The inputs to place-and-route tools span a vast search space of potential solutions covering functionality (macro-architecture), form (micro-architecture) and fit (silicon technology). Processing and analyzing all of this data manually requires extensive time and resources. AI could significantly reduce the load by discovering new ways to optimize a design.
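
To get a feel for the scale of that search space, consider a back-of-the-envelope sketch in Python. The knob names and value counts below are invented for illustration; real flows expose hundreds of parameters, which only makes the arithmetic worse:

```python
# Hypothetical place-and-route knobs and candidate-value counts.
# These names and numbers are illustrative, not from any real tool.
from math import prod

design_knobs = {
    "macro_placement_seeds": 50,   # candidate macro arrangements
    "target_clock_periods": 20,    # candidate clock-period targets
    "routing_layer_plans": 12,     # metal-stack usage strategies
    "cell_density_targets": 15,    # placement utilization settings
    "vt_mix_strategies": 8,        # threshold-voltage cell mixes
}

space_size = prod(design_knobs.values())
print(f"Candidate configurations: {space_size:,}")  # 1,440,000

# Assuming one full place-and-route run takes 8 hours, a single
# machine covers 21 configurations per week; exhaustive evaluation
# would take tens of thousands of weeks.
runs_per_week = 7 * 24 // 8
print(f"Weeks to evaluate exhaustively: {space_size / runs_per_week:,.0f}")
```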

One approach is design space optimization (DSO), which uses ML models to evaluate candidate implementations and steer the search toward better design solutions. Prior to DSO, human engineers relied on design space exploration, the process of manually combing through terabytes of data from a variety of inputs.

This manual effort is strenuous, requiring exhaustive experimentation that is often hindered by human limitations. The data needed to optimize a design are available, but there is too much of it, and it is too complex, for engineers to handle. They end up partitioning the data to make it manageable, thereby restricting design potential.
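
To make the idea concrete, here is a minimal sketch of a DSO-style loop in Python. It is a generic sample-evaluate-refine strategy, not Synopsys' actual DSO.ai algorithm: ppa_cost() stands in for a full place-and-route run, and the two knobs (a hypothetical placement density and clock period) are invented for illustration:

```python
import random

def ppa_cost(density: float, clock_ps: float) -> float:
    """Toy stand-in for a full place-and-route evaluation; lower is
    better. A real run would measure power, timing and area."""
    return (density - 0.72) ** 2 + 0.5 * (clock_ps - 480) ** 2 / 1e4

def optimize(budget: int = 60):
    # Start from a random configuration of the two hypothetical knobs.
    best = (random.uniform(0.5, 0.9), random.uniform(300.0, 700.0))
    best_cost = ppa_cost(best[0], best[1])
    spread = 1.0  # sampling radius, tightened as the search learns
    for _ in range(budget):
        # Sample near the current best rather than across the whole
        # space, mimicking how a learned model focuses costly trials.
        candidate = (best[0] + random.gauss(0, 0.05 * spread),
                     best[1] + random.gauss(0, 50.0 * spread))
        cost = ppa_cost(candidate[0], candidate[1])
        if cost < best_cost:
            best, best_cost = candidate, cost
            spread *= 0.9  # concentrate future samples around a win
    return best, best_cost

(density, clock_ps), cost = optimize()
print(f"best density={density:.2f}, clock={clock_ps:.0f} ps, cost={cost:.4f}")
```

Even this crude loop spends its fixed budget of trials far more productively than a uniform sweep, which is the essential promise of DSO: learn where in the space the good solutions live, then spend expensive tool runs only there.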

AI can use reinforcement learning to generate robust optimizations and continuously analyze results for further improvement at a much higher rate than human designers. As it learns from experience and expands its abilities, AI becomes the engineering team's catalyst for earlier tape-outs that meet power, performance and area (PPA) goals.
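
As a rough illustration of the reinforcement-learning framing, the sketch below uses an epsilon-greedy bandit agent that learns, from noisy rewards, which of a few hypothetical flow strategies pays off best. The strategy names and reward model are invented stand-ins for real place-and-route outcomes, not any vendor's method:

```python
import random

STRATEGIES = ["timing_first", "power_first", "congestion_first"]
# Hidden "true" quality of each strategy; the agent never sees this.
TRUE_REWARD = {"timing_first": 0.55, "power_first": 0.70,
               "congestion_first": 0.40}

def run_flow(strategy: str) -> float:
    """Stand-in for a noisy P&R run; returns a quality score."""
    return TRUE_REWARD[strategy] + random.gauss(0, 0.1)

value = {s: 0.0 for s in STRATEGIES}  # estimated reward per strategy
count = {s: 0 for s in STRATEGIES}
epsilon = 0.2                         # exploration rate

for trial in range(200):
    # Explore occasionally; otherwise exploit the current best estimate.
    if random.random() < epsilon:
        strategy = random.choice(STRATEGIES)
    else:
        strategy = max(value, key=value.get)
    reward = run_flow(strategy)
    count[strategy] += 1
    # Incremental mean update: learn a little from every result.
    value[strategy] += (reward - value[strategy]) / count[strategy]

print({s: round(v, 2) for s, v in value.items()})
```

Production systems replace this toy bandit with far richer states, actions and reward models, but the feedback loop is the same: act, observe the result, update, and act better next time.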

AI is emerging as an engineer's ideal assistant, capable of accomplishing tasks, generating and analyzing design data and producing results faster than humans. That frees up engineers to dedicate more of their time to value-added work: improving chip efficiency, discovering bugs and differentiating designs. Leveraging AI, design teams can also focus on reducing power leakage and enhancing chip performance.

AI implementation has already begun, leading to record-breaking design productivity. Tasks that would normally take an entire team months to complete can be accomplished by individual engineers within weeks, thanks to AI.

Science fiction? Well, Samsung has already announced silicon designed by AI with the help of Synopsys' AI-based DSO.ai system. Japan's Renesas Electronics also achieved formerly unattainable PPA, using AI tools to meet timing constraints weeks ahead of schedule while boosting maximum frequency by hundreds of megahertz.

Synopsys DSO.ai optimization process. (Source: Moor Insights & Strategy)

AI can improve more than just speed. A North American integrated device manufacturer was able to achieve a 10-percent improvement in total power at the SoC level in just weeks by replacing manual methods with AI tools.

Which brings us back to the chip shortage. What if AI were able to help bridge the productivity gap, allowing design teams to optimize content not only for different market needs but also across different process technologies? Could today's rigid photomasks become easily retargeted by AI to meet the needs of a dynamic global economy that is itself driven by AI applications? Could trained AI assistants remaster creative silicon content, much as records were transformed into multi-track digital media?

For companies large and small, the potential benefits include lower workforce requirements and relief from rising chip design demands and the prevailing talent shortage. As automation spreads, we're becoming increasingly comfortable handing our keys, metaphorical and tangible alike, to AI. In a bit of poetic symmetry, AI for chip design could be the key to future chip innovation.

This article was originally published on EE Times.

Stelios Diamantidis is the senior director of Synopsys AI Solutions.
