Microsoft has given a look inside its HoloLens and the Holographic Processing Unit that drives it at the Imec Technology Forum in Brussels, Belgium. The HPU is among an emerging class of specialty accelerators.

In late March, Microsoft started shipping a developer's version of HoloLens, its novel augmented reality goggles. The release generated a flood of teardowns, but until now they have lacked commentary from the headset's designers.

“We have shown HoloLens for the last 18 months, focusing usually on the experience and the software—this is the first time we will talk about the hardware,” said Ilan Spillinger, corporate vice president of HoloLens and silicon at Microsoft.

The HPU at the core of the headset is essentially a sensor data fusion processor. It takes inputs from an array of sensors on the HoloLens, including four environmental cameras, a miniaturised Kinect depth camera and an inertial measurement unit. It accelerates the algorithms that track the user's environment, movements and gestures and that display holographic images.
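Microsoft has not published the HPU's algorithms, so as a rough illustration only, here is what the simplest form of sensor fusion looks like: a complementary filter that blends a fast but drift-prone IMU estimate of head orientation with a slower, drift-free camera estimate. The function name and weighting are hypothetical, not anything from the HoloLens.

```python
# Toy complementary filter -- an illustrative sketch of sensor fusion,
# NOT Microsoft's actual (unpublished) HPU algorithm.
def fuse(imu_angle, camera_angle, alpha=0.98):
    """Blend a fast, drift-prone IMU heading estimate with a slower,
    drift-free camera-derived estimate (both in degrees).

    alpha weights the IMU heavily because it updates far more often;
    the small camera contribution slowly corrects accumulated drift.
    """
    return alpha * imu_angle + (1.0 - alpha) * camera_angle

# The IMU reads 10.5 degrees (it has drifted); the camera says 10.0.
fused = fuse(10.5, 10.0)
print(round(fused, 3))  # weighted estimate, pulled back toward the camera
```

A real tracking pipeline would fuse full 6-degree-of-freedom poses with a Kalman-style filter; the point here is only the principle of combining complementary sensors.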

The 28nm HPU is at heart a highly customised DSP array that consumes less than 10W at peak. It includes an undisclosed number of Tensilica DSP cores optimised with hundreds of HoloLens-specific instructions.

Each core is customised for a particular function and subset of instructions. In what sounds like a non-von Neumann architecture, each typically has its own unique organisation of associated memory units.

It accelerates “new style algorithms that need special local memories and a unique memory architecture, not a typical level 1-2-3 cache,” he said.

Figure 1: The HoloLens HPU rides a sleek motherboard tailored to fit the headset. (Images: EE Times)

The headset is powered by a 14nm Intel Cherry Trail SoC with embedded graphics running Windows 10. The two-sided motherboard also carries 64GB of flash storage and 2GB of external memory, split evenly between the HPU and the Cherry Trail SoC.

Spillinger would not comment on the road map for the HPU except to say he “sees opportunity for running algorithms we didn’t think about.”

The HPU fits roughly into the same category as the new accelerator Google announced last week for its data centres, as well as one in the works at a start-up.

Spillinger called on semiconductor engineers to pave a road to higher performance, lower power chips to help him build lighter, cheaper headsets packing more sensors and features.

The HoloLens chief started his career at Intel working on Centrino, its first dedicated notebook platform. He then moved to IBM, where he designed InfiniBand and Power chips and helped Microsoft and Nintendo develop ASICs for the Xbox 360 and Wii consoles.

In late 2007, Spillinger joined Microsoft and started work on the Kinect. The project later merged with efforts by other engineers to develop an augmented reality headset and the HoloLens project was born.

Bellying up to the sensor bar

[Image: The HoloLens sensor bar. (EE Times)]

The HoloLens sensor bar (above) packs four environmental cameras for tracking head movements and the gestures used to control the display. The depth sensor is a Kinect scaled down to a fraction of its original size and power consumption. It supports a short-range mode for tracking gestures within a metre and a long-range mode for mapping the room. A 2Mpixel high-definition video camera captures what the user sees.

A look at the optics

[Image: The HoloLens optics subassembly. (EE Times)]

The optics subassembly (above) includes an inertial measurement unit built to Microsoft's tailored specifications. The optics use gratings developed and manufactured by Microsoft. They support a wide range of inter-pupil distances and adjust for users wearing contact lenses or glasses.

An LCoS display supports 2.3 million points of light, with enough accuracy that users can project and read even fine text in a Web browser.

From prototype to finished product

[Image: An early HoloLens prototype. (EE Times)]

An early prototype (above) used a full-sized Kinect depth sensor and various sub-assemblies eventually redesigned into a sleeker consumer form factor that packs a hefty bill of materials (below).

[Image: The HoloLens bill of materials. (EE Times)]