While competitors focus solely on cars, NVIDIA targets autonomous machines in general
TOKYO — Most AI platform suppliers have been obsessed lately with autonomous vehicles. This week, Nvidia escalated the obsession by spreading the epidemic to “autonomous machines.”
At Nvidia’s GPU Technology Conference held here, CEO Jensen Huang wound up and pitched Nvidia AGX, a series of embedded AI high-performance computers built around Nvidia’s new Xavier processors, for a host of robotic and autonomous machines.
Phil Magney, founder and principal advisor at VSI Labs, called Nvidia “shrewd” to extend the reach of the architecture, since most competitors are focusing exclusively on automated cars. “As we know, there are lots of human-driven machines out there where removing the operator is the goal. Nvidia’s new partners in Japan have their bases covered with these announcements.”
Indeed, Huang announced that Yamaha Motor Co. has selected Nvidia Jetson AGX Xavier to develop the system for autonomous machines that will include unmanned agriculture vehicles, marine products and “last-mile” vehicles.
Among an industry chorus singing Nvidia’s robotic tune and committing to the Jetson AGX Xavier system are FANUC, Komatsu, Musashi Seimitsu and Kawada Technologies.
Magney added, “Even factory automation is covered with FANUC’s plan to apply AGX Xavier to its factory automation solutions.”
Partnerships with these big names in Japan lend Nvidia not only credibility but also momentum in robotics and AI applications and development.
The key idea Nvidia is promoting is that “All autonomous machines run on desired input with associated desired output,” said Magney. “So, if you have the data, you can train your automated machine to do almost anything.”
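Magney’s point can be illustrated with a toy supervised-learning loop: given pairs of desired inputs and desired outputs, the machine fits a function that maps one to the other. This is a hypothetical, minimal sketch in plain Python, unrelated to any Nvidia API; the data and function names are invented for illustration.

```python
# Toy illustration of "desired input with associated desired output":
# fit w and b so that w * x + b reproduces each (input, output) pair.
# Plain gradient descent on squared error; all names are hypothetical.

def train(pairs, lr=0.01, epochs=2000):
    """Learn a linear map from (x, y) training pairs."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in pairs:
            err = (w * x + b) - y   # prediction minus desired output
            w -= lr * err * x       # gradient step on the weight
            b -= lr * err           # gradient step on the bias
    return w, b

# Desired inputs paired with desired outputs (here, y = 2x + 1).
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = train(data)
```

With enough such pairs, the same loop (scaled up to deep networks on GPUs) is what lets an automated machine learn almost any input-to-output behavior.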
What’s so special about AGX Xavier?
Nvidia is bringing more computing power to the new AGX Xavier platform.
The big difference with the new platform is that it’s “based on the Xavier SoC, while the previous Drive PX was based on the Parker SoC,” said Magney. “In addition, the DevKit comes in single or dual SoC versions (Pegasus).”
The DevKits are “end-to-end solutions,” he noted, “as they come bundled with SDKs such as DRIVE Software 1.0 and DRIVE IX for driver monitoring applications.”
The edge for Nvidia’s efforts to build a large eco-system comes from its core competency: Nvidia understands, and actually provides, solutions that enable both learning and inference.
At GTC here, Huang unveiled the Tesla T4, which he described as “a universal inference accelerator” based on Nvidia’s new Turing GPU architecture. Nvidia’s T4 GPU and TensorRT software promise to process the queries that power such services faster than any competing platform, up to 40x faster than CPUs alone, the company said.
To help companies supporting deep learning-powered web services with GPUs, Huang also introduced the TensorRT hyperscale platform. This allows a GPU-powered server to run multiple deep learning models and frameworks concurrently.
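The idea behind running multiple models concurrently on one server can be sketched abstractly. This is an illustrative sketch only, not the TensorRT API: the “models” here are stand-in Python functions, and the dispatcher names are hypothetical.

```python
# Conceptual sketch of concurrent model serving: several stand-in
# "models" answer queries in parallel on one server process, as a
# hyperscale inference platform would across GPU contexts.
# Not the TensorRT API; all names below are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def image_classifier(query):
    return f"class for {query}"

def speech_recognizer(query):
    return f"transcript for {query}"

models = {"vision": image_classifier, "speech": speech_recognizer}

def serve(requests):
    """Dispatch (model_name, query) requests concurrently."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(models[name], q) for name, q in requests]
        return [f.result() for f in futures]  # results in submit order

results = serve([("vision", "cat.jpg"), ("speech", "hello.wav")])
```

The point is the scheduling pattern, not the arithmetic: one server multiplexes heterogeneous models over shared hardware instead of dedicating a machine to each.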
Nvidia’s efforts in developing these AI infrastructure building blocks are paying off nicely as Nvidia expands its eco-system for development. Magney said, “You get the hardware, software tools, access to libraries, and network training tools. Of all the suppliers, Nvidia’s advantage is in the diversity of the DevKit as no one has this collection of hardware, software and development support.”
Nvidia’s deal with Yamaha caught the attention of industry analysts like Magney. He called the announcement interesting “because they will develop solutions for various off-road vehicles, commercial or recreation.” He speculated that Yamaha might even apply it to boats.
“For manufacturers of recreational and commercial ground vehicles, the development of automated platforms is absolutely necessary,” Magney said. “For a company like Yamaha (who must support a variety of different platforms) it is better to unify around a common architecture that is scalable and can be applied to different vehicle platforms.”
Nvidia also boasted that its DRIVE AGX platform integrates key sensor manufacturers. “Sony’s 8-megapixel automotive camera, Panasonic’s depth-sensing camera, and automotive electronics supplier Omron’s 3D lidar sensor” can now operate seamlessly with the Nvidia DRIVE platform, Nvidia claimed.
This offers “greater diversity for various types and brands of sensors,” Magney noted.
The big takeaway is that Nvidia now has “driver support for these devices within its SDK.” He said, “This is one of the key elements of a development kit — driver support for various types and brands of sensors.” Building drivers takes time and effort. By supporting these devices, Nvidia saves developers time otherwise spent on writing device drivers, Magney observed.
— Junko Yoshida, Global Co-Editor-In-Chief, AspenCore Media, Chief International Correspondent, EE Times