Intel talks data center strategy with new CPUs and Optane
TORONTO — Only months after announcing it would slowly wind down its 3D XPoint collaboration with Micron Technology, Intel Corp. has outlined where it sees the persistent memory delivering the most benefits.
Its latest data center strategy includes two new members of its Xeon processor family. The Xeon E-2100 processor is available immediately, while its Cascade Lake advanced performance processor will be released in the first half of next year.
The E-2100 processor is aimed at small- and medium-sized businesses and cloud service providers to support workloads on entry-level servers, as well as across all computing segments for sensitive workloads that need enhanced data protection. Cascade Lake, however, is a new class of scalable Xeon processor, said Lisa Spelman, vice president and general manager of Intel Xeon products and data center marketing.
Cascade Lake is a multi-chip package that delivers up to 48 cores per CPU and 12 DDR4 memory channels per socket. She said it’s specifically architected for high-performance computing (HPC), artificial intelligence (AI) and infrastructure-as-a-service (IaaS) workloads.
But Intel is also re-imagining the memory/storage hierarchy to handle these workloads, and that’s where Optane comes into play. Spelman places Intel’s Optane DC persistent memory just below DRAM, the hottest tier. Below the persistent memory is Intel’s Optane SSD, followed by its own 3D NAND SSD, with spinning disk sitting at the bottom.
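The tiering Spelman describes can be sketched as a read path that falls through from the fastest tier to the slowest. The tier names follow the article; the latencies and the `read` helper are invented purely for illustration:

```python
# Illustrative model of the tiered memory/storage hierarchy described above:
# a lookup falls through from DRAM to Optane persistent memory to Optane SSD
# to 3D NAND SSD to spinning disk. Latencies are made-up placeholder values.

TIERS = [
    ("DRAM",                 0.0001),  # hypothetical access time in ms
    ("Optane DC persistent", 0.001),
    ("Optane SSD",           0.01),
    ("3D NAND SSD",          0.1),
    ("Spinning disk",        10.0),
]

def read(key, tier_contents):
    """Return (tier_name, latency_ms) for the first tier holding `key`."""
    for name, latency in TIERS:
        if key in tier_contents.get(name, set()):
            return name, latency
    raise KeyError(key)

# Hot data lives high in the hierarchy; cold data only on disk.
contents = {"DRAM": {"hot"}, "Spinning disk": {"hot", "cold"}}
print(read("hot", contents))
print(read("cold", contents))
```

The point of the model is simply that each inserted tier catches some accesses before they reach slower media below it.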
“We really think this is a game changer for the entire application hierarchy and how applications are delivered,” Spelman said.
Intel’s Optane DC DIMM works in two modes: App Direct mode lets applications use the native persistence, while Memory mode adds more volatile capacity and behaves like DRAM.
Prior to the Xeon reveal, Intel outlined its Optane DC persistent memory beta program for OEMs and cloud service providers. Spelman sees the technology working in combination with Xeon scalable processors through two special operating modes. App Direct mode lets applications already tuned for the technology take full advantage of the product’s native persistence and larger capacity.
“It allows your application to select the persistent value of memory and put the most important data into the persistent socket of memory,” she said. In this mode, the data remains stored even when the system is powered down, which means the restart of a large database could drop from hours to minutes to even just seconds. “It’s not sitting in volatile memory, it’s all just there,” she said.
In Memory mode, applications running in a supported operating system or virtual environment can use Optane as volatile memory — just like DRAM — and take advantage of the additional system capacity made possible by module sizes of up to 512 GB, without needing to rewrite software. And like DRAM, the data is gone once the system is powered down.
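App Direct mode exposes persistent memory to software much as a memory-mapped file does: ordinary stores become durable, so the data is "all just there" after a restart. The sketch below uses a regular file and Python's `mmap` as a stand-in for that programming model; real App Direct development goes through dedicated interfaces such as Intel's PMDK, which is not shown here, and the file path is invented:

```python
# Stand-in for the App Direct programming model: write through a memory
# mapping, flush, and the data survives "power loss" (here, closing the
# mapping). An ordinary file substitutes for a persistent-memory region.
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "pmem_region")

# Create and size the backing "persistent" region.
with open(path, "wb") as f:
    f.truncate(4096)

# Map it and store data through the mapping, as an App Direct app would.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as m:
        m[0:5] = b"hello"
        m.flush()  # analogous to forcing stores to the persistence domain

# After everything is closed (a simulated restart), the data is still there.
with open(path, "rb") as f:
    print(f.read(5))
```

This is the contrast with Memory mode: there, the same hardware behaves like ordinary DRAM and the contents would be gone on power-down.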
The Optane DIMMs have been shipping since August and are DDR4 compatible, which means vendors don’t need to redesign systems to adopt persistent memory, said Spelman.
Jim Handy, principal with research firm Objective Analysis, has always been bullish on the Optane DIMMs. “The ones who are really bullish about it are the guys who are making storage area networks because those guys are already buying these wretchedly expensive NVDIMM-Ns,” Handy told EE Times.
NVDIMM-Ns cost significantly more than a standard DRAM DIMM of the same capacity; they use flash to shuffle the data into if there’s a power failure, combined with either a large capacitor or a battery to keep everything alive. “What Optane is bringing to the party is that it’s going to sell for less than a DRAM DIMM of the same capacity and doesn’t need the external capacitor or battery, which also makes it a more reliable solution as well as taking up less room inside the cabinet,” Handy said.
Handy expects Intel will keep the cost of Optane below regular DRAM pricing as part of its larger product offering because eventually the persistent memory technology will be profitable. It comes down to scale, he said. Optane unit shipments will have to be within an order of magnitude of the volume of the DRAM market for Intel to get the cost down to the point where it can sell it for less than DRAM and still make a profit.
The persistent memory jointly developed by Intel and Micron appears to be on a course similar to that of NAND flash — hybrid storage arrays would put “hot data” on the faster media while other, less important data was relegated to spinning disk. Intel’s approach is similar in that Optane is used when it makes the most sense. While it’s possible that Optane could be the only persistent memory used in a system, it would be more expensive than using it in combination with flash or a hard drive, said Handy.
Regardless of whether it’s branded 3D XPoint or Optane, he sees it as another entry in a hierarchy that’s been expanding over the years, and still believes hard disks will be an appreciable share of the market two decades from now. “New layers are getting inserted into what I call the memory/storage hierarchy,” he said. It used to be just DRAM and hard disk drive until a cache was inserted in between — then followed another caching layer. Flash added yet another potential layer. “Now it’s going to be DRAM and XPoint and an SSD and a hard drive,” Handy said.
Although there was a lot of hype and speculation around 3D XPoint when it was first announced, Handy always had the attitude that it was simply another layer. “It’s an exciting layer though.” And although he’s bullish on Optane DIMMs, he doesn’t see as much potential for Optane SSDs because there’s no interface fast enough to take advantage of the persistent memory’s speed. “Even NVMe is too slow and bogs down the speed of Optane.”
—Gary Hilson is a general contributing editor with a focus on memory and flash technologies for EE Times.