Arms Race for FPGA Security Never Stops

Article By : Stefano Lovati

Ensuring the highest possible level of security in highly integrated programmable devices is a persistent concern.

Ensuring the highest possible level of security in highly integrated programmable devices is a persistent concern, given the exponential growth of FPGA applications in the data center, aerospace and defense, automotive, industrial, and telecommunications sectors. In an escalating “arms race” between chipmakers and hackers, industry introductions of increasingly refined and robust security mechanisms in programmable devices prompt attackers to shift strategy, hunting for remaining component weaknesses and devising ways to exploit them.

EE Times Europe recently spoke with two leading figures in the development and production of programmable devices to learn more about device security: AMD-Xilinx executives Manuel Uhm, director of silicon marketing, and Jason Moore, senior director of core markets engineering. Below, we summarize our conversation with Moore and Uhm on the main types of attacks and the countermeasures introduced by manufacturers, then follow up with a Q&A section in which the executives offer further insights.

Securing FPGA devices

The techniques used to compromise FPGA security and data integrity include reverse engineering, side-channel attacks (such as differential power analysis [DPA]), thermal laser stimulation, bitstream alteration, readback attacks, physical probing, and environmental attacks. Designers have sought to protect devices, while providing confidentiality, integrity, and authenticity, through secure boot (using a hardware root of trust) and encryption, said Moore. In response, however, attackers change strategy: if the bitstream is encrypted, they go after the key.

“The state of the art today is to store the key in an encrypted form,” said Moore. “The common approach followed by manufacturers is to make things harder for the adversary, [but that] translates into higher costs and higher complexity.”

A hardware root of trust embeds a public key and a signing configuration in the device. The part will then accept only files that have been signed by the legitimate owner. The authenticity scheme also provides cryptographically strong integrity: if even a single bit of the image is altered, verification fails and the device refuses to load it.
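
To make that mechanism concrete, the sketch below shows what such a boot-time authenticity check amounts to in software, using Python's cryptography package. The RSA/SHA-256 choice, function names, and key handling are illustrative assumptions, not AMD-Xilinx's on-chip implementation, which runs in silicon.

```python
# Minimal sketch of signature-based image authentication (illustrative only).
# Assumes an RSA public key embedded as the root of trust; the actual
# on-chip scheme, padding, and hash are vendor-specific.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def load_image_if_authentic(image: bytes, signature: bytes, pubkey_pem: bytes) -> bool:
    """Return True only if the image was signed by the legitimate owner."""
    public_key = serialization.load_pem_public_key(pubkey_pem)
    try:
        # Verification hashes the full image, so a single flipped bit fails.
        public_key.verify(
            signature,
            image,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True   # accept and configure the device
    except InvalidSignature:
        return False  # reject the image entirely
```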

To get the key or affect the part’s operation, adversaries may attempt fault injection. Manufacturers therefore try to guard against a range of potential faults, including voltage glitching, electromagnetic faults, and clock glitching. Faults can even be created with a laser.

“You will see vendors do different things to help provide a level of robustness in the system that performs the authentication,” said Moore. “Many vendors add redundancy to their devices, adding a level of resistance against fault attacks.” The cost and complexity of the redundancy scheme will increase with the number of targeted fault types.
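
As a rough software analogy of that redundancy, a security-critical comparison can be evaluated several times and the results voted on, so that a single injected fault cannot flip the outcome. Real devices implement this in hardened logic; the helper below is purely illustrative.

```python
# Software analogy of redundant evaluation with majority voting
# (real devices implement this in hardened circuitry, not Python).
import hmac

def redundant_compare(expected: bytes, measured: bytes, copies: int = 3) -> bool:
    """Evaluate a security-critical comparison several times and vote,
    so a single glitch-induced wrong result cannot flip the decision."""
    votes = sum(1 for _ in range(copies)
                if hmac.compare_digest(expected, measured))
    # Require a strict majority of identical results to pass.
    return votes * 2 > copies
```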

Uhm noted that AMD-Xilinx’s 16-nm and 7-nm devices “have a number of security features in hardware that were not available previously, providing extra safety and security at the chip level. This can give us an advantage over devices that may not have as many security features built in. In addition, the programmable logic can be used for custom security features that can run orders of magnitude faster than in software.”

Environmental attacks against FPGAs are common, Moore said, because manufacturers of ICs generally don’t test their devices outside of the datasheet ranges, nor do they guarantee correct operation outside of those ranges. Consequently, adversaries like to operate an IC outside of its specified operating range, trying to force the IC to fail in a way that creates a vulnerability.

“The way to avoid that is to integrate specific circuits, like temperature and voltage monitoring, into the device,” said Moore. “From inside the part, you can determine the value of those environmental parameters, and if they go out of the range, the chip can take actions” such as erasing the key or issuing a reset.
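
A minimal sketch of such a monitoring loop, assuming hypothetical sensor-read and key-zeroize hooks and placeholder thresholds rather than any device's actual datasheet limits, might look like this:

```python
# Illustrative environmental-monitor loop; sensor access and responses
# are hypothetical stand-ins for what runs inside the device.
VDD_RANGE_V = (0.85, 0.95)     # placeholder supply limits, not a real datasheet
TEMP_RANGE_C = (-40.0, 100.0)  # placeholder junction-temperature limits

def check_environment(read_vdd, read_temp, zeroize_key, reset_device):
    """Erase the key and reset if voltage or temperature leaves its window."""
    vdd, temp = read_vdd(), read_temp()
    out_of_range = not (VDD_RANGE_V[0] <= vdd <= VDD_RANGE_V[1]) or \
                   not (TEMP_RANGE_C[0] <= temp <= TEMP_RANGE_C[1])
    if out_of_range:
        zeroize_key()   # deny the adversary the decryption key
        reset_device()  # force the part back to a known state
    return not out_of_range
```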

Test and debug ports are another potential source of attack. Indeed, the test and debug circuits are probably the most exploited circuitry in an IC because those ports are necessary; you need to have JTAG and test capabilities to develop, debug, and test a system.

“As soon as you enable security in a chip, you will have to disable the test and debug ports permanently or at least limit the mode in which they can be operated,” said Moore. “That depends on the specific vendor.”

Q&A: Winning the security arms race

EE Times Europe: FPGAs, like many other advanced ICs, have internal voltage and temperature monitoring. Are there other environmental factors, like radiation, that adversaries can exploit to mount an attack?

Jason Moore: Clearly, radiation can affect the device and could become an attack vector. It depends on the adversaries that customers want to protect their systems from and on the environment the device will operate in. If you are building radiation-tolerant or radiation-hardened devices, you don’t have to worry about that specific attack vector, because protection against radiation is already embedded into the device. But if you are designing for commercial use or only for terrestrial radiation, it is up to the integrator whether to address that specific attack [vector] or not. If they do [want to address it], they will need to add countermeasures, such as sensors, to detect the radiation environment.

EETE: Does the security scheme affect device performance during the loading of the FPGA image (boot time)?

Moore: Yes, there is a performance impact, and you will never get away from it. That's because once you sign a bitstream, you need to perform signature verification whenever you load it. That is part of asymmetric cryptography, like RSA, and the signature-verification process takes time. Unsecured FPGA images incur no such verification time when they are loaded. The more signatures you have to verify when loading the configuration image, the longer it takes. [But] you can minimize this time by increasing the bus width or the clock rate of the configuration circuitry.
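
As a back-of-the-envelope illustration of that trade-off, boot time can be modeled as the raw configuration transfer plus one verification per signature. Every number in the sketch below is an assumed example value, not a measured figure for any device.

```python
# Rough boot-time model: transfer time plus per-signature verification.
# Every constant here is an assumed example value, not a device spec.
image_bits      = 100e6    # assumed configuration image size (bits)
bus_width_bits  = 32       # configuration bus width
clock_hz        = 100e6    # configuration clock rate
t_verify_s      = 5e-3     # assumed time per RSA signature verification
num_signatures  = 4        # assumed number of signed partitions

transfer_s = image_bits / (bus_width_bits * clock_hz)
boot_s     = transfer_s + num_signatures * t_verify_s
print(f"transfer ~ {transfer_s*1e3:.1f} ms, total ~ {boot_s*1e3:.1f} ms")
# Widening the bus or raising the clock shrinks the transfer term;
# the verification term scales with the number of signatures.
```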

EETE: Are the protection features you’ve mentioned available to the user as a whole?

Moore: For a given family, there is a common set of security features that are available as a whole. This is true in our devices as well as others’. You only have to enable them. With every [device] generation, vendors make a conscious effort to continually enhance the security of the IC. It takes a couple of years to design and manufacture an FPGA, and once you have built and shipped it, you have to ensure a very long lifetime. On a software product, you can perform updates to [address] security vulnerabilities, but [not] on the hardware side.

IC manufacturers have a finite amount of time to design security into a product, [but] adversaries have unlimited time to find [vulnerabilities to] exploit. So we are always in this kind of losing proposition. As a manufacturer, we have to always look ahead [to the next device generation] and design security by trying to predict new potential forms of attack.

EETE: Suppose I am a developer and I want to encrypt the FPGA image. What do I do?

Moore: We provide development tools to our customers to create and encrypt their bitstream or configuration images in their environment, so they will never have to release the keys.
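
The underlying idea, encrypting the image locally so the key never leaves the developer's environment, can be sketched with authenticated AES-GCM as follows; this stands in for, and is not, the vendor's actual tool flow or key format.

```python
# Illustrative local encryption of a configuration image with AES-256-GCM.
# The real vendor tool flow and key handling differ; this only shows that
# encryption can happen entirely in the developer's environment.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_image(image_path: str, key: bytes) -> bytes:
    """Encrypt and authenticate a bitstream file; the key stays local."""
    with open(image_path, "rb") as f:
        plaintext = f.read()
    nonce = os.urandom(12)                      # unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                   # store nonce alongside data

# key = AESGCM.generate_key(bit_length=256)     # generated and kept on-site
# encrypted = encrypt_image("design.bit", key)  # "design.bit" is hypothetical
```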

Manuel Uhm: We don’t want to publicly give too much detail about the security technology inside our programmable devices … because we don’t want to give insight to black hats or malicious folks. We sell to a number of markets that rely upon security, such as aerospace and defense, automotive, and industrial applications, so we are very sensitive to these matters. We are in touch with the most current landscape, which can be advantageous compared with vendors that may not sell into such target markets.

Moore: The industry in general is doing a pretty good job against these kinds of attacks, improving things generation after generation. That's why providing security for programmable devices is a kind of arms race.

Additionally, security is becoming ubiquitous. It’s no longer [just] a matter of aerospace and defense [requirements]; security is becoming paramount also in the automotive, industrial, and data center markets.

This article was originally published on EE Times Europe.
