Micron Invests $100 Million in AI

Article By : Rick Merritt

A California coming-out party not quite what you'd expect: a $100 million venture fund

SAN FRANCISCO — Micron Technology announced a $100 million venture fund with a focus on AI at a coming-out party for its new chief executive here. Sanjay Mehrotra, former CEO and co-founder of SanDisk, vowed to accelerate new product introductions at the company best known as one of the survivors of consolidation in commodity DRAMs.

It’s a challenging time for the memory and storage giant, which trails larger rivals Samsung, SK Hynix, and Toshiba. Breakout products such as the Hybrid Memory Cube and the Automata processor failed to gain significant traction, and the 3D XPoint memories co-designed with Intel have hit delays, with Micron’s revenues from them now pushed out to 2020.

On the positive side, the company hopes to sample 3D XPoint devices late next year. It will also enter the market for high-bandwidth memory stacks, initially with HBM2 in 2019.

In its core business, Micron has a DRAM-like product in the lab that it aims to sample in 2021. It has a process technology roadmap with three new nodes ramping into production. In addition, it is privately showing customers software that accelerates NAND storage performance by 3x to 5x.

At the event here, executives made the case that the rise of deep learning will help drive demand for memory and storage. Its new fund aims to help it ride the wave with future investments in hardware, software, and services.

“This is not just about the money, but technology partnerships as well,” said Sumit Sadana, Micron’s chief business officer and a former executive at SanDisk, Freescale, and IBM. “We have some emerging memory technologies not yet in production that we will seek partners to help bring to market.”

Micron researchers are exploring the kinds of processor-on-memory architectures that startups such as Mythic and Syntiant hope to pioneer. They are also bullish on so-called neuromorphic chips that use a mesh of synapse-like cores that companies such as BrainChip are pursuing.

It’s unclear when or how Micron will be able to turn the concepts into products.

“We have a lot of innovative work in memory and future architectures, some of them are focused on deep learning,” said Sadana. “We have grants from the U.S. government to investigate frontiers of processing in memory and deep-learning acceleration, but a lot of the work is extremely sensitive and confidential.”

It’s still early days for machine learning, said speakers from Amazon and Microsoft.

For example, developers are working to enable Amazon’s Alexa to understand context as well as multiple commands in a sentence, said Prem Natarajan, who heads up natural language work for the web giant. Microsoft recently released a service to let companies create their own voice assistants and aims to enable them to someday engage in realistic conversations, said Lili Cheng, who leads Microsoft’s AI research.

AI training could require 7x more DRAM and 2x more NAND in future servers. (Image: Micron)
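The 7x DRAM figure above is Micron’s projection, not something derived here, but a back-of-envelope sketch helps show why training inflates memory demand relative to inference: training must hold gradients and optimizer state alongside the weights. All numbers below are illustrative assumptions, not Micron’s data.

```python
# Rough memory estimate for training vs. inference of one model.
# All figures are illustrative assumptions, not Micron's projections.
PARAMS = 1_000_000_000   # assumed 1B-parameter model
BYTES_FP32 = 4           # 32-bit floats

weights = PARAMS * BYTES_FP32        # inference needs only the weights
gradients = PARAMS * BYTES_FP32      # training also stores a gradient per weight
optimizer = PARAMS * BYTES_FP32 * 2  # e.g. Adam keeps two moment buffers

inference_gb = weights / 1e9
training_gb = (weights + gradients + optimizer) / 1e9
print(f"inference ~{inference_gb:.0f} GB, training ~{training_gb:.0f} GB")
```

Under these assumptions training alone quadruples the per-model footprint, before counting activations or batch data, which is the kind of multiplier behind projections like the one in the chart.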

Micron executives remain bullish on the company’s core DRAM roadmap as well as the outlook for 3D XPoint in both main memory and storage.

The company expects to use 3D XPoint both in dense, fast main-memory products and in storage that delivers lower latency than Intel’s current Optane drives. Micron’s stated plan of sampling in late 2019 for revenues in 2020 suggests that it is waiting for a second-generation 3D XPoint device that it is co-developing with Intel.

Executives declined to provide any product details or market projections. “Our belief is that [3D XPoint] becomes a meaningful part of the data center hierarchy that will cannibalize a nibble on either side [of DRAM and NAND], but the aggregate market grows quite a bit,” said Tom Eby, who runs Micron’s compute and networking group.

“3D XPoint is still in its infancy compared to DRAM and NAND,” said Jeff VerHeul, who oversees Micron’s development of 3D XPoint and other non-volatile memories. “It will evolve and become more efficient.”

In its core DRAM sector, “we see at least three technology nodes beyond the one we are working in now … we have more visibility than we have had in the last decade,” said Scott DeBoer, head of tech development at Micron. That said, “the cost reduction and bit-density increase per node is slowing and is not at the pace of the last 20 years.”

Micron’s DRAM roadmap does not require extreme ultraviolet lithography; the company already uses double- and quadruple-patterning. Advanced circuit designs enable operation with less charge stored in capacitors that are taller than those used in the past, but otherwise the overall architecture is a conventional one, DeBoer added.

Micron is already shipping DRAMs made in its sub-20-nm 1x process and expects first revenue from parts made in its 1y process next quarter, said Eby. The transition to 16-Gbit chips from today’s mainstream 8-Gbit designs is near, but he would not comment on the viability of a 32-Gbit design.

— Rick Merritt, Silicon Valley Bureau Chief, EE Times
