Startup Mythic Working on Low-Power Inference Processor

By Rick Merritt

Aims to map neural networks into NOR memory arrays

AUSTIN, Texas — Wedged between a coffee shop and a hair salon in a gentrifying suburb here, a couple dozen engineers are exploring a new direction in computing. Startup Mythic aims to map neural networks into NOR memory arrays, calculating and storing results in ways that shave power consumption by perhaps two orders of magnitude.
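The idea can be sketched numerically. In the following toy illustration (my own sketch under simplified assumptions, not Mythic's actual design), each flash cell stores a weight as a conductance; applying input activations as voltages makes each column of the array sum currents, which by Ohm's and Kirchhoff's laws is exactly the multiply-accumulate a neural-network layer performs, with no weight ever leaving the memory array.

```python
import numpy as np

# Toy model of compute-in-memory: conductances play the role of stored
# weights, voltages the role of input activations. The array dimensions
# and values here are arbitrary, chosen only for illustration.
rng = np.random.default_rng(0)
conductances = rng.uniform(0.0, 1.0, size=(4, 3))  # 4 inputs x 3 outputs
voltages = np.array([0.2, 0.5, 0.1, 0.9])          # input activations

# Each column current is sum(G * V): a matrix-vector product computed
# "in place" inside the memory array rather than in a separate ALU.
currents = voltages @ conductances
print(currents)  # one analog partial sum per output neuron
```

Because the weights stay put, the energy normally spent shuttling them between DRAM and a processor is avoided, which is the source of the power savings the article describes.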

If it works, the startup could leapfrog digital processors and cores from the likes of Intel, established IP providers, and a handful of well-heeled startups in China. They all aim to fill sockets in next-generation surveillance cameras, drones, factory gear — all sorts of embedded systems trying to hop on the bandwagon to artificial intelligence, including someday self-driving cars.

“We had known from grad school that mixed-signal processing was a great fit for this app,” said David Fick, who launched the company with a colleague at the University of Michigan. “You need to store a lot of weights, and flash memory, with its adjustable threshold voltage in every transistor, is very appealing.”

The flash arrays essentially eliminate the need for moving data in and out of external memory, slashing power consumption. Mentors David Blaauw and Dennis Sylvester “had started some flash research and we had some expertise, so we could pretty easily spin up a project,” said Fick of his work with co-founder Mike Henry.

But executing on the decades-old concept of an analog processor in memory is tricky. “You have to account for many analog effects — mismatch, noise, temperature — and the memory cells add a similar number of significant effects of their own,” he said.
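Those effects can be made concrete with a small simulation (again my own sketch, with made-up error magnitudes, not Mythic's characterization data): per-cell mismatch and read noise perturb the stored conductances, so the analog dot product deviates slightly from the ideal digital answer.

```python
import numpy as np

# Illustrative error model: +/-2% cell-to-cell mismatch and small additive
# read noise, both Gaussian. These magnitudes are assumptions for the demo.
rng = np.random.default_rng(1)
weights = rng.uniform(0.0, 1.0, size=(256,))
inputs = rng.uniform(0.0, 1.0, size=(256,))

ideal = np.dot(inputs, weights)                        # exact digital result

mismatch = rng.normal(1.0, 0.02, size=weights.shape)   # multiplicative variation
noise = rng.normal(0.0, 0.01, size=weights.shape)      # additive read noise
analog = np.dot(inputs, weights * mismatch + noise)    # perturbed analog result

rel_error = abs(analog - ideal) / ideal
print(f"ideal={ideal:.3f}  analog={analog:.3f}  error={rel_error:.2%}")
```

In a real design, co-optimizing the circuits and the neural network together — for example, training the network to tolerate this class of error — is what makes the approach viable, which is the co-design point Fick makes below.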

Unlike digital computers with well-defined memory, processing, and storage subsystems, analog computers for machine learning are essentially one big integrated behemoth.

“You need to co-design everything simultaneously, so you need people who understand overlapping areas — device experts and neural-network experts who understand each other’s fields,” said Fick. “We’ve been a lot more successful at that than others, with a great team that can do the whole stack.”

Indeed, the company snagged its first big investment, a $50 million Series B, in part because it pulled together a diverse team of director-level experts. They include an analog specialist from Texas Instruments, a flash design director from Microchip, and a physical design expert from Netronome.

Dave Fick in his Austin office with his dog, Ellie, the startup’s unofficial director of emotional support. (Images: EE Times)

Showing stepwise progress with a series of prototype tapeouts also won over investors. Fick gives a lot of credit to VLSI work in college.

“When you design a chip as a grad student, you have to do memory, synthesis, DRC, verification … all by hand. If you go straight to industry, you never see the whole process, so a lot of startups coming out of academia are more successful going to production.”

Both co-founders have been geeks from the start. Fick’s first job in high school was as a web developer. As a grad student, he had a string of internships at companies including AMD, IBM, and Intel. For fun, Henry used to enter speed-programming competitions.

Big and small rivals and software hurdles

These days, the duo has large and small competitors. As many as 40 established and startup IP and chip providers are said to be shipping or planning some form of client AI accelerators.

The list of rivals includes multiple well-staffed and funded startups in China. Horizon Robotics, one of the most promising, is already shipping low-power client AI accelerators using a more traditional digital architecture.

Startup Syntiant is, like Mythic, developing a processor-in-memory architecture using flash. It has a team of several former Broadcom engineering managers and backing from Intel Capital. IBM Research is exploring an accelerator for machine learning based on resistive RAM, but Fick thinks that Big Blue is taking the wrong approach.

“They are trying to get the perfect memory to make everything easy, but we are getting ahead by co-designing everything … even if they found the ideal memory, there’s always a less-perfect one that could be lower-power or faster,” he said.

Novel parallel processors, historically, have failed because they were too hard to program. The emerging processor-in-memory chips face the same problem in spades because machine learning itself requires a new and still-evolving programming model.

Mythic provides what it calls a development platform that acts like a compiler, turning a neural net described in TensorFlow into machine language for the chip. It uses PCI Express to link to its chip, providing “hints about how to get extra performance out of the chip and some optimized example networks for common applications,” said Fick.

Customers who want to use a framework other than TensorFlow apparently will need to translate their work using ONNX, one of a handful of emerging tools for converting models among AI software frameworks.

Fick is well aware of the software hurdles that his customers face.

“In order to get into this space, you need to hire some deep-learning scientists, and those people are very expensive because they are in short supply … [creating data sets and neural nets and training them] is time-consuming and expensive … that limits who takes the plunge and invests in this space.”

The good news is that the Mythic chip’s memory array should be able to handle a greater variety of convolutional and recurrent neural nets than rivals. And its performance boost could open doors for running more complex models in power-constrained systems at the edge.

The startup has taped out several test chips to date.

Mythic has a couple of big partners on its side. Lockheed Martin hopes to use the chips in future drones. Fujitsu supplies the startup’s flash technology.

So far, two interesting applications seem to be out of range. Smart speakers with silicon budgets of a few dollars are too cost-conscious for Mythic’s targets. Self-driving cars have automotive-grade requirements, which the startup cannot afford to take on right now.

Mythic aims to have 40-nm silicon available by the end of the year. That’s the node that the embedded flash cells are designed in, and it fits the startup’s low-cost target.

Fick notes that the flash cells have been qualified for 28 nm, the next logical step for the company. Beyond that, foundries are working on embedded MRAM and ReRAM cells.

“Nothing prevents us from going to the smallest nodes — we get benefits from scaling,” he said.

But if Mythic is successful, it will not be because of Moore’s law and the kinds of digital processors that it made popular. It will be because it got traction taking computing in a new direction.

— Rick Merritt, Silicon Valley Bureau Chief, EE Times
