Invisible haptics targets augmented, virtual reality

Article By : Julien Happich

Ultrahaptics is keen to have an impact in AR/VR, although the company admits there are limitations imposed by the position of the actuators.

Start-up Ultrahaptics has recently introduced a development platform for companies to evaluate its ultrasound-based 3D mid-air haptics technology. EETimes Europe caught up with the company's CTO and co-founder Tom Carter to get an insight into the technology's latest developments.

__Figure 1:__ *Ultrasonic beam forming for haptics.*

The development platform features 192 piezoelectric ultrasound transducers arranged as a near-complete 14 x 14 grid, forming an active matrix approximately 14cm x 14cm and about 10mm thick. When driven by proprietary beam-forming algorithms that apply controlled time delays between adjacent transducers, the phase-shifted emissions combine so that peaks of ultrasound pressure, shaped almost arbitrarily, can be felt several centimetres above the array's surface.
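The focusing principle described above can be sketched numerically. Everything in this sketch — the full 14 x 14 grid, the element pitch derived from the board size, and the 40 kHz carrier — is an illustrative assumption, not Ultrahaptics' published design:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
FREQ = 40_000.0         # assumed 40 kHz ultrasound carrier

# Hypothetical full 14 x 14 grid (196 positions) on a 14cm x 14cm board;
# the real array has 192 transducers, so 4 positions are unpopulated.
pitch = 0.14 / 14
xs = (np.arange(14) - 6.5) * pitch
grid = np.array([(x, y, 0.0) for x in xs for y in xs])  # transducer centres, z = 0

def focus_phases(focal_point):
    """Per-transducer phase offsets (radians) that make all emissions
    arrive in phase at focal_point, creating a pressure peak there."""
    dists = np.linalg.norm(grid - np.asarray(focal_point), axis=1)
    delays = dists / SPEED_OF_SOUND  # acoustic time of flight per transducer
    # Fire the farthest transducers first so every wavefront coincides:
    return 2 * np.pi * FREQ * (delays.max() - delays)

# Focus a pressure peak 10cm above the centre of the array
phases = focus_phases((0.0, 0.0, 0.10))
```

In this scheme the transducers closest to the focal point get the largest phase advance; steering the peak to a new position is just a recomputation of the delay set.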

The invisible shapes created out of ultrasound can be felt by hand and accurately interpreted as virtual buttons, switches, dials or any other virtual object programmed by Ultrahaptics.

As he anticipated about 18 months ago, Carter confirmed the technology has been adopted by a number of OEMs for their products. "Some consumer products have been designed with our technology, they are finished and pretty much ready to go, but it is really up to the companies to make it public. It is at the whim of their product marketing departments, probably in the next few weeks," Carter said.

The CTO wouldn't even let us know what sort of products Ultrahaptics is getting into, but judging from the size and thickness of the current solution, most probably large-sized items such as white goods.

"The size of the hardware is a factor that determines which products we can apply our technology to," conceded Carter. "The smaller we can make it, the wider the market of course, but for now you won't find it in laptops or mobiles; there isn't enough space for it."

__Figure 2:__ *Ultrahaptics demonstration with visual feedback.*

"But we've made a lot of progress and we think we can shrink our technology further by an order of magnitude; with proprietary piezoelectric devices we could reach a solution about 1mm thick, and cheaper too," he noted.

"In the last year, we've greatly improved the refresh rate of our haptic interface. We've gone from between 100 and 200 frames per second to 10,000 frames per second, which means we can create more complex shapes," Carter said.

A higher refresh rate means that the ultrasound signal can be multiplexed even more to create a richer feel with more "touch points" for a more pleasing and tangible interface.
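One way to read the multiplexing claim is as a time-budget calculation: if the array can retarget its focal point 10,000 times per second, it can cycle through several touch points round-robin while each point is still refreshed far faster than the skin can notice. The figures below are an illustrative sketch, not the company's actual scheduler:

```python
UPDATE_RATE = 10_000  # focal-point updates per second, as quoted by Carter

def per_point_rate(n_points):
    """Effective refresh rate each touch point receives when the array
    cycles through n_points focal points in a simple round-robin."""
    return UPDATE_RATE / n_points

# e.g. eight simultaneous touch points would each still refresh at 1,250 Hz,
# well above the ~200 Hz region where vibrotactile sensitivity peaks
rate_for_eight = per_point_rate(8)
```

At the old 100-200 updates per second, even a handful of multiplexed points would have dropped each one into a perceptibly flickering range, which is why the refresh-rate jump enables richer shapes.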

"We've designed a lot of new sensations, and we are working on a software tool that would let users browse through libraries of effects and try or even edit the haptic effects to adjust them for their products," the CTO said.

In the lab, the company is even looking at creating textures on top of the shapes it emulates.

"We have the complete underlying technology to create textures; we are experimenting with modulating the frequency to get vibrations on the skin. Our biggest challenge in realising texture was the refresh rate: now we are able to focus 'touch points' down to 4.3mm in diameter (from 8.6mm previously) and we can get the change of vibration at the right speed," Carter said.

Beyond white goods, kitchen appliances and possibly, the automotive cockpit, Ultrahaptics is keen to have an impact in augmented and virtual reality, although the company admits there are limitations imposed by the position of the actuators.

"I see our technology as most useful in the workplace: if you're doing CAD modelling, you could have haptic augmented reality in front of your computer, built into your desk. Here there's no need to wear special gloves; you could see the 3D model designed by a co-worker through AR glasses and interact with it physically," Carter said.

"This would also work well in virtual reality scenarios if you sat in front of a desk; the visuals are really good and we could add the textures in haptics. For example, you could feel the grain of wood and interact with objects more naturally," the CTO concluded.

This article first appeared on EETimes Europe.