Without consistent interoperability APIs, automakers cannot deploy sensor assemblies from multiple vendors. The gap also stymies development of a well-designed system that can survive successive generations of hardware.
Carmakers are adding sensors to vehicles in a push to offer drivers a slew of driver-assist and highly automated features. OEMs and Tier 1s, however, still lack standard APIs to link the sensors and applications that run on a vehicle’s compute system.
Without an interoperable API, carmakers cannot replace sensors supplied by one vendor with new ones from another; instead, they must rewrite code from scratch, because their vehicle application software cannot correctly control the streams generated by the new sensors.
The Khronos Group, together with the European Machine Vision Association, revealed Monday the launch of an “exploratory group” to investigate requirements for embedded camera and sensor API standards. As an open consortium, Khronos is known for its chops in graphics and compute interoperability standards. The European Machine Vision Association (EMVA) has decades of experience in machine vision technology.
Application software developers need access to, and control over, the real-time stream generated by sensors so that they could, for example, double the speed of inference software or more easily impose their own image recognition on a system, explained Neil Trevett, president of the Khronos Group. The new initiative plans to solve “not just the issue of portability, but the increased functionality” demanded by highly automated vehicles.
Trevett touts the group as “one of the first industry forums by which both the hardware and the software community can come together and actually figure out whether we can do better than we have done.”
No pre-determined outcome
The operative word is “exploratory.” Just like a politician considering a run for higher office launches an exploratory committee to assess constituent interests, the two groups are setting up the Exploratory Group to gauge the industry’s appetite for open, vendor-independent royalty-free API standards to control embedded cameras and sensors.
In an interview with EE Times, Trevett said, “We don’t have a predetermined outcome.” The group might conclude that such an API is not needed. If so, so be it. Either Khronos or EMVA — or even a different organization — could initiate standardization efforts. Participants in the Exploratory Group might instead suggest an open source project.
“We’re opening the exploratory group at zero cost to anyone who’s interested,” stressed Trevett. “We want to genuinely discover what the real needs of the industry are. We’re putting requirements and use cases first.”
Where does the API sit?
The API Trevett envisions is “an API that sits between the application and the sensor/camera/ISP hardware.”
Software developers want to enable applications that directly control an image stream generated by sensors for downstream processing, explained Trevett. They are looking for fine-grained programmable control over, for example, the lens, the exposure, or anything else on a per-frame basis, he added.
Their goal is “to do increasingly sophisticated processing for environmental understanding,” Trevett explained. A stream of images generated by sensor assembly – which includes cameras, radars or lidars – would feed into a DSP or GPU or another kind of hardware accelerator controlled by compute APIs such as OpenCL or Vulkan, he added.
Without the ability to generate the right set of images in real time, with sufficient control over various parameters across a number of different sensors, the accelerator hardware has a much tougher job doing the downstream processing, he explained.
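The kind of per-frame control Trevett describes can be sketched in a few lines. The interface below is purely hypothetical — no such standard API exists yet, which is the whole point of the exploratory group — and the sensor is mocked, but it shows the shape of what application developers are asking for: setting exposure, gain, and focus per frame before each capture, then streaming the results to a downstream stage.

```python
from dataclasses import dataclass

# Hypothetical per-frame control block: the application sets these
# parameters for each individual frame before the sensor captures it.
@dataclass
class FrameControl:
    exposure_us: int      # exposure time in microseconds
    gain: float           # analog gain
    focus_position: int   # lens focus actuator position

class MockSensor:
    """Stand-in for a camera/sensor module behind a vendor-neutral API."""
    def capture(self, ctrl: FrameControl) -> dict:
        # A real implementation would program the hardware; here we
        # just echo the settings back alongside a fake pixel buffer.
        return {"settings": ctrl, "pixels": b"\x00" * 16}

def stream(sensor, controls):
    """Drive the sensor frame by frame, yielding each frame for a
    downstream accelerator stage (a DSP, GPU, or NPU in practice)."""
    for ctrl in controls:
        yield sensor.capture(ctrl)

# Example: bracket exposure across three consecutive frames, the sort
# of per-frame control an application needs for HDR or inference tuning.
controls = [FrameControl(exposure_us=e, gain=1.0, focus_position=50)
            for e in (1000, 4000, 16000)]
frames = list(stream(MockSensor(), controls))
exposures = [f["settings"].exposure_us for f in frames]
print(exposures)  # [1000, 4000, 16000]
```

In a real system, the frames yielded by `stream` would be handed to an accelerator driven by a compute API such as OpenCL or Vulkan, as Trevett describes.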
Today, “everyone is stuck with proprietary APIs,” said Trevett. “If I am at an Embedded Vision Summit, for example, people chase me around the hall and ask, ‘why doesn’t Khronos have a camera API?’”
More integration already happening
Today, more chip vendors are working to integrate more smarts closer to sensors. Recogni, for example, boasted recently that the company plans to plug a very high-performance inference engine into the sensor module.
Given that SoCs will continue to lift the integration curve, Egil Juliussen, a veteran auto industry analyst, asked, “Won’t this initiative of developing APIs between a camera module and compute hardware become a moot point in five years?”
Not necessarily, according to Trevett. He said this initiative doesn’t seek to standardize the hardware interface between a sensor and a compute system.
For system vendors, connectivity at the hardware level — plugging a camera or sensor module into the compute system — is only a first step. The next step, far more important to carmakers developing next-generation vehicles or machine vision engineers designing factory floor solutions, is to write applications that make their systems operate safely and accurately.
To do that, software developers must get down to the sensor and use the sequence of frames generated by the camera, explained Trevett. “The traditional camera module vendor will give you a very bare-bones API to control whatever there is on their module. And very often, because those modules are typically designed by hardware vendors, the APIs they are using are very hardware-centric. It’s a level of API where you have to poke and read registers.” If there is close-to-sensor processing, like an ISP, module vendors will often hide it and render an image with just one or two high-level functions.
Imagine a system integrator who deployed a camera module tightly coupled with an inference engine in a Toyota. What if that integrator were asked to deploy the same solution in a Mercedes? They would have to rewrite everything. What if the module vendor upgraded the solution to a much higher-resolution sensor that changes all the registers? Once again, they would have to start from scratch.
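The portability problem Trevett describes is, at heart, an adapter problem. The sketch below is a toy illustration with invented register addresses and class names, not any real vendor's interface: two hypothetical modules expose the same capability (exposure control) at different register addresses, and a thin vendor-neutral layer lets identical application code drive either one. A standard API would, in effect, fix the shape of that neutral layer.

```python
class VendorARegisters:
    """Hardware-centric interface of the kind Trevett describes:
    the application pokes registers directly. Addresses are invented."""
    EXPOSURE_REG = 0x3500
    def __init__(self):
        self.regs = {}
    def write(self, addr, value):
        self.regs[addr] = value

class VendorBRegisters:
    """A different module: same concept, different register layout."""
    EXPOSURE_REG = 0x0202
    def __init__(self):
        self.regs = {}
    def write(self, addr, value):
        self.regs[addr] = value

class CameraControl:
    """Hypothetical vendor-neutral layer: the application calls
    set_exposure() and never sees register addresses. Swapping modules
    means swapping this adapter, not rewriting application code."""
    def __init__(self, device, exposure_reg):
        self.device = device
        self.exposure_reg = exposure_reg
    def set_exposure(self, microseconds):
        self.device.write(self.exposure_reg, microseconds)

# The same application code runs against either module.
for dev_cls in (VendorARegisters, VendorBRegisters):
    dev = dev_cls()
    cam = CameraControl(dev, dev_cls.EXPOSURE_REG)
    cam.set_exposure(8000)
    print(dev.regs)  # the exposure value lands in that vendor's register
```

Without an agreed standard, every integrator writes (and rewrites) a layer like `CameraControl` themselves, for every module and every vehicle platform.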
This issue involves both generation-to-generation and platform-to-platform changes, Trevett noted.
To recap, without a consistent interoperability API, automakers cannot deploy solutions from multiple vendors. It also stymies development of a well-designed system that can survive successive generations of hardware, Trevett stressed.
How proprietary is an image sensor interface?
Phil Magney, founder and president of VSI Labs, told EE Times, “The output of the vision sensor can be raw whereby you can do what you want with it, even though raw formats are not standardized. Most sensors convert the raw data into JPEG or TIFF formats using proprietary software. Many sensors are sold in this format.”
“On the other hand,” he added, “Mobileye adds a ton of IP to the sensor to create the module, and in that module resides processing IP and software that is necessary to support their feature detectors, classifiers and even control logic.” In Magney’s opinion, “Mobileye is really a system, not just a sensor. OEMs and developers don’t love this approach because it is closed, and you cannot get to the raw output to apply your own algorithms.”
Magney, however, added, “But if you want ADAS and don’t have the skills to build out the solution, you go with Mobileye and you can bring your solution to market faster. This is what gave Mobileye so much penetration into production ADAS over the past ten years, or so.”
The times are changing, though. Magney added, “OEMs, tier-ones and developers want to do their own thing. So, the closed approach is less than ideal. The aim of this working group is to move toward standardization to remove some of the limiting factors.”
Hard lessons learned
The Khronos Group’s Trevett was open about the fact that not every Khronos initiative to set API standards has succeeded the way OpenGL, Vulkan, or OpenCL have.
Between 2013 and 2015, Khronos worked on what it calls “the OpenKCAM API standard” for camera control used in mobile phones. The idea then was to create “an open, royalty-free standard for advanced, low-level control of mobile and embedded cameras and sensors.”
The group had hopes of creating a standardized way to control a camera in a mobile phone, but it never launched the spec, said Trevett, because Apple and Google ultimately created proprietary APIs that came to dominate the mobile market. With “no demand left for OpenKCAM,” the Khronos group abandoned the project.
Calling this a lesson learned, Trevett said, “I’m hopeful that we won’t suffer the same fate” with embedded camera and sensor API standards this time around.
One reason for optimism is that the embedded space for machine vision and automotive is more diverse. “You have systems with eight cameras, you have systems that mix visual and lidar and radar, and you have real time requirements between all these multiple sensors,” Trevett pointed out. “It’s a much bigger problem space.”
“The Google camera API doesn’t solve any of that stuff,” he added.
Another reason this effort might succeed is that no one platform vendor can decide a result with the stroke of a pen, he added. “There are hundreds of players wanting to integrate hardware and software and use sensors in various ways. So, there is no big benign dictator to say, okay, this is what everyone’s going to use.”
While the OpenKCAM spec was never publicly released, the Khronos group noted on its web page, “Elements of the OpenKCAM design may be reused if they help meet the requirements of an agreed Scope of Work produced by the Embedded Camera API Exploratory group.”