LAS VEGAS — Where do IoT, AI and Quantum Computing intersect? The short answer is that they meet where data is growing exponentially. The long answer is... well, it’s complicated.

Last week, a panel at the Design Automation Conference (DAC) drilled deep into this difficult topic. Participating were the winners of the “2019 Under 40 Innovation Awards” — young engineers and researchers working on next-generation technologies.

The panelists, all eager for the dawn of a new era in design automation, called upon the electronic design automation (EDA) industry for more sophisticated tools to help them advance the Internet of Things (IoT), artificial intelligence (AI) and even quantum computing.

But given the slowdown of Moore’s Law, what more can design automation contribute to electronic design? How exactly does EDA connect with AI?

Vijay Raghunathan, professor of electrical and computer engineering at Purdue University, explained: “One of the things EDA did for designs in general is to convert what was basically dark art [i.e. electronic design] into a very structured science.”

Raghunathan continued: “If you talk to AI researchers today, you realize the design of these algorithms and neural networks is really a dark art. One of the most interesting things going forward is to see if EDA can bring the same level of rigorous structure to the design of neural networks and design of AI algorithms.” He added, “To me, that is one of the most interesting aspects that connects the world of EDA with the world of AI.”

2019 Under 40 Innovation Awards Panel at DAC
From left to right: Huichu Liu, a staff research scientist at Intel; Vijay Raghunathan, professor at Purdue Univ.; Robert Wille, professor at Johannes Kepler Univ.; Rasit Onur Topaloglu, senior hardware developer and program manager at IBM

New-generation researchers also expect EDA tool vendors to jump into quantum computing — not in a few years, but today.

Robert Wille, a professor at Johannes Kepler University in Linz, Austria, said, “Everybody knows about Moore’s Law and the design gaps we’ve experienced over the decades with conventional computing technology.” Despite the widespread belief that the era of quantum computing is still far away, Wille stressed: “It makes absolute sense for the design automation industry to start developing efficient and sophisticated EDA tools for quantum computing right now.”

What is quantum computing for?
The panel, consisting of winners of this year’s Under-40 Innovation Awards, included Huichu Liu, a staff research scientist at Intel, Rasit Onur Topaloglu, senior hardware developer and program manager at IBM, Wille, and Raghunathan.

Yunji Chen, professor at the Institute of Computing Technology, Chinese Academy of Sciences, was another honoree, but couldn’t participate because he was not able to get a U.S. visa in time.


Given their diverse backgrounds, the panelists covered a lot of ground, touching on topics that ranged from IoT to AI and quantum computing.

For the layman, quantum computing remains an impenetrable mystery. Asked why the world needs quantum computing, Wille said, “First, it’s important to understand that the quantum computer is not replacing the conventional computer we know.”

Instead, he said, it will be one of many computing technologies of the future. The tech world has high hopes for quantum computers, which show promise, for example, in accelerating searches over huge data sets.

Robert Wille

“But the first killer app for quantum computing is what’s known as Shor’s algorithm — a quantum algorithm — developed years ago,” Wille explained. “The algorithm will help make factorization much more efficient. This is still considered the holy grail of quantum computing, as [many believe] this will change the entire world [in theory].” But in reality, “This is still the furthest away.”
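Wille’s point can be made concrete. Shor’s algorithm reduces factoring to finding the period of modular exponentiation; that period-finding step is what a quantum computer accelerates exponentially. The minimal sketch below (our illustration, not from the panel; the tiny inputs N = 15, a = 7 are chosen purely for demonstration) simulates the core classically:

```python
# Sketch of the number-theoretic core of Shor's algorithm.
# A quantum computer finds the period r of f(x) = a^x mod N
# exponentially faster; here we find it by brute force for a tiny N.
from math import gcd

def classical_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N): the step a quantum
    computer speeds up via the quantum Fourier transform."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_factor_demo(N, a):
    """Given the period r, recover nontrivial factors of N classically."""
    assert gcd(a, N) == 1           # a must be coprime to N
    r = classical_period(a, N)
    if r % 2:
        return None                 # odd period: retry with another a
    candidate = pow(a, r // 2, N)
    p, q = gcd(candidate - 1, N), gcd(candidate + 1, N)
    return (p, q) if 1 < p < N else None

print(shor_factor_demo(15, 7))      # period of 7 mod 15 is 4 -> (3, 5)
```

On real quantum hardware, `classical_period` is the only piece that changes: it is replaced by quantum period finding, while the surrounding reduction stays classical. The exponential blowup of the brute-force loop for cryptographically sized N is exactly why this remains, in Wille’s words, “the furthest away.”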

In recent years, as big industry players such as IBM, Google and Microsoft Research have jumped into the fray, the quantum computing community has seen the emergence of new commercial applications. For example, quantum computers can be used “for simulating climate change, solving optimization problems … or… quantum chemistry is a huge topic,” said Wille.

So, although the quantum computer’s big goal of factorization is still far away, “We see today a variety of applications where [the use of] a quantum computer may be beneficial,” said Wille.

Vijay Raghunathan

As promising as this prospect might sound, Purdue’s Raghunathan cautioned that there are big challenges in quantum computing. “From an outsider’s perspective, I see there is a class of very hard problems — computationally hard problems — where quantum computing holds a lot of promise.” Solving optimization problems by computing in an exponential space is one example, Raghunathan said. “But the challenge is, how do you really extract true benefits [of quantum computing] for a class of really widespread hard computational problems which seem to be all over the place?”

Infrastructure for quantum computing
Obviously, none of the young innovators expects the EDA industry to stand still. Stressing the need for automation tools for quantum computers, Wille said, “We need them to find out what’s possible to fabricate and what’s possible to design.” He added, “We need to be prepared for the day when quantum computing becomes scalable.”

IBM’s Rasit Onur Topaloglu, another young innovation award winner, noted, “We also thought about this problem at IBM…we asked, when do we need automation tools to design quantum computing?”

Rasit Onur Topaloglu

He said, “We’ve concluded that up until 200 qubits, maybe we can still do it manually.” He added, “I am not going to project when we will reach 200 qubits, but we already have an 80-qubit architecture.” Although there is at least a seven-year gap between academic research and a product, he concluded that research on design automation tools for quantum computing needs to start today. “When the idea [of quantum computing] takes off, we need those tools in the industry right away.”

Where AI and IoT meet
Everyone on the panel agreed that AI and IoT are inseparable. Raghunathan stressed, “In the world of AI, data is the king. He or she who has data rules.” He said, “To me, IoT is one way of solving the problems by being eyes and ears in gathering a lot of data to power AI algorithms that make sense.”

Raghunathan concluded: “IoT isn’t just about gathering data. It’s about drawing value and useful inferences from the data. That’s where AI can really provide the answer.”

Intel’s staff research scientist Huichu Liu, who now works on the AI accelerator at the Movidius group, concurred. “While IoT is the eyes and ears, AI is the brain.” In today’s sensing world, “We see things, we hear things, but we need intelligence to interpret information.” This IoT/AI paradigm allows the industry to develop energy-efficient devices at the edge that can process data, or even reduce response time, she explained. In turn, this will open opportunities to use IoT/AI for “product defect detection, healthcare, agriculture and a lot more,” she added.

Where does quantum computing intersect AI?
OK. We all seem to agree that AI and IoT are a marriage made in heaven.

The panel, however, was less clear on where exactly AI and quantum computing will meet. Wille said, “It is a tough question…I am not so sure where they exactly meet. But my take on that is both AI and quantum computing are technologies to cope with complexity.”

He explained, “Maybe, if we can kind of understand the problem from the problem formulation point of view but suffer from the complexity with respect to search space, maybe quantum computing is the solution. But if I can’t really formalize how to solve the problem right now, maybe AI is the answer.”

So, AI and quantum computing aren’t about “matching” technologies, Wille said, but the issue illustrates a need to understand “what technologies to use for what purposes.”

Raghunathan said, “If one speaks, IoT produces data and AI just listens. That’s a definition of a perfect marriage.”  But quantum computing and AI? He said, “They are perhaps not meant to be. It’s best not to marry the two. They are both very useful technologies, but they are ideally suited for different applications.”

No single tech solution
More significantly, it is time to acknowledge a future in which “there won’t be one single technology to solve problems,” explained Wille. “There will be much more diverse methods and technologies to pick from in solving different problems.”

This is “totally different from what has happened over the past decades,” Wille pointed out.

There were, of course, different computing technologies. But they were somewhat similar, all based on ones and zeros. In contrast, we’re approaching a future in which we must figure out ways to apply different methods and technologies to different problems. Wille said, “We already have specialized circuits, quantum computing, DNA computing, silicon photonics and many more devices… this whole thing will get more and more diverse. That’s where the future of technologies will be.”

IBM’s Topaloglu divided AI into two phases — training AI models and using AI models to make predictions. He pointed out that the training stage is where all the complexities reside. “This is the area where people are hoping that quantum computing could help,” he said.

No one is betting, however, on quantum computers to accelerate the AI training process exponentially. “I think this type of speedup will not be exponential. Rather, it will be a three or four times speedup, if we have the right size machine with reasonable error controls.”
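Topaloglu’s caution matches what theory says about one of the best-known quantum primitives: Grover’s unstructured search offers a quadratic, not exponential, reduction in query count. A quick back-of-the-envelope comparison (our illustration, not from the panel):

```python
# Compare classical vs. Grover query counts for unstructured search.
import math

def classical_queries(n):
    """Expected lookups to find one marked item among n, classically."""
    return n / 2

def grover_queries(n):
    """Optimal quantum query count for the same task, ~ (pi/4) * sqrt(n)."""
    return (math.pi / 4) * math.sqrt(n)

for n in (10**6, 10**12):
    print(f"n={n:>14}: classical ~{classical_queries(n):.0f}, "
          f"quantum ~{grover_queries(n):.0f}")
```

The quadratic advantage grows dramatic at large n, but it remains polynomial; whether any such advantage survives error-correction overhead on realistic AI training workloads is exactly the open question behind Topaloglu’s modest speedup estimate.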

Huichu Liu

Intel’s Liu, on the other hand, sees AI and quantum computing as two technologies complementary to each other. “These days, when we are thinking of AI, we want to solve our problems closer to the human level. In contrast, the good thing about quantum computing – from what I gather by reading books – is that it can solve the complex problems beyond the human level.” Liu remains optimistic. “If we can divide the problems into different applications, maybe we can find more opportunities” for both AI and quantum computing.

Bigger problems on the horizon?
As the electronics industry explores AI, quantum computing and other fields, worries are growing about an arms race in new technologies among different geographical regions.

Asked if they have encountered any “Oh My God” moments when learning about advanced technology developments in different parts of the world, Purdue’s Raghunathan cited his surprise at the amount of resources either governments or industry consortia are throwing into such areas as AI and machine learning. The amount of investment in AI fields in both Europe and China dwarfs the capital coming into U.S. institutions. “It’s just a matter of time for other geographical regions to reap the fruits and leapfrog us,” he noted. 

For Wille, coming from Europe, the geographical differences are less of an issue. In his experience, collaborations between different geographical regions are already happening. “The bigger challenge for us is collaboration between disciplines. In working on quantum computing, we need to talk much more to physicists, mathematicians, and people working on theory. Geographical collaboration is quite OK right now. But collaborations between communities of different disciplines are much harder to establish.”

Interdisciplinary research collaborations are tough, agreed IBM’s Topaloglu. “Until we have pervasive solutions,” which quantum computing is nowhere near, remote collaborations usually don’t work, he said. “We need to co-locate people with different disciplines in order to improve collaborations.”

That’s exactly what IBM is doing, according to Topaloglu. “We have physicists sitting next to mathematicians, and next to engineers in the same location.” That, he argued, is how innovation and progress happen; otherwise things move much more slowly, because “people get into silos and share what they developed in their own silos… but it is not going to be as good as what could have been developed by being co-located.”