Understanding PCB/FPGA power architecture (Part 1)
One of the more interesting aspects of a design to examine is its power architecture. How do we go about powering the FPGA (and the other devices) on the board? Normally, there will be an intermediate voltage, supplied by an AC/DC convertor or another form of DC supply, from which the rest of the system is powered. The first stage of the design is to specify this interface correctly in terms of the design's voltage and current requirements. Of the two, determining the intermediate voltage is the easier task; the required current must also account for any inefficiencies in the downstream convertor(s).
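As a rough illustration of that last point, the current demanded from the intermediate rail can be sized from the downstream load power and the conversion efficiency. The load power, efficiency, and input voltage below are hypothetical figures chosen for the sketch, not values from this design:

```python
# Sketch of sizing the intermediate supply current, accounting for
# downstream convertor losses (input power = output power / efficiency).
# All numbers here are illustrative assumptions, not design values.

def required_input_current(load_power_w: float,
                           efficiency: float,
                           input_voltage_v: float) -> float:
    """Current the intermediate rail must deliver for a given load."""
    input_power_w = load_power_w / efficiency   # losses inflate input power
    return input_power_w / input_voltage_v

# Example: 12 W of downstream load, 85% efficient conversion, 5 V input
print(round(required_input_current(12.0, 0.85, 5.0), 3))  # 2.824 (amps)
```

Note that the 15% loss shows up as extra input current; this is why the current specification cannot simply be the sum of the load currents.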
The first step in defining the power architecture is to determine all of the voltage rails and the current drawn by each of them. For example, consider the FPGA-based imaging system as shown below.
In the case of such a system, you may have a number of voltage rails, such as the ones shown in the following table.
For this particular example, let's assume that all of the power supplies have a requirement to be within ±5%. As can be seen from the above table, the highest voltage is 3.465 V, which is the nominal 3.3 V at its maximum acceptable tolerance. Knowing this value allows us to determine the voltage supplied by the AC/DC or other DC supply within the system. The sensible thing to do here is to select a convertor that has an output compatible with the 3.3 V required, thereby saving a conversion stage and increasing the overall efficiency.
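The ±5% tolerance calculation above can be sketched as follows; the 3.3 V rail is taken from the article, while the helper function name is my own:

```python
# Sketch of deriving min/max rail voltages from a symmetric tolerance.
def rail_limits(nominal_v: float, tol: float = 0.05):
    """Return (minimum, maximum) voltage for a rail with +/- tol tolerance."""
    return nominal_v * (1 - tol), nominal_v * (1 + tol)

vmin, vmax = rail_limits(3.3)
print(round(vmin, 3), round(vmax, 3))  # 3.135 3.465
```

The 3.465 V maximum is the value that the upstream supply selection must accommodate.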
The next stage is to determine the power required by each of the rails. This requires that you use power estimation tools such as the Xilinx Power Estimator (XPE) and read the datasheets for the other devices to determine the total amount of power required. I tend to collate all of this in a spreadsheet, as this comes in useful later when we are determining the conversion architectures.
As you can see above, when I calculated the power required by the board, I performed two calculations: one for the nominal power and one for the maximum power. At this point I have not yet determined the worst-case maximum voltage each convertor will actually provide, so I have assumed every rail sits at its maximum tolerance. This matters because it sets the worst-case power the AC/DC convertor must deliver (you should always design to worst-case requirements). While the difference (146.5 mW in the example above) is not large in this case, it could be significant in a larger system.
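The nominal-versus-maximum comparison can be sketched in a few lines. The rail names and currents below are hypothetical placeholders standing in for the spreadsheet values, so the resulting difference will not match the article's 146.5 mW figure:

```python
# Sketch of the nominal vs. worst-case power calculation, assuming
# every rail may sit at its +5% tolerance limit. The rail voltages
# and currents are illustrative assumptions, not the article's data.
TOL = 0.05
rails = {
    # rail name: (nominal voltage V, estimated current A)
    "VCCINT_1V0": (1.0, 2.0),
    "VCCAUX_1V8": (1.8, 0.5),
    "IO_3V3":     (3.3, 1.2),
}

nominal_w = sum(v * i for v, i in rails.values())
worst_w   = sum(v * (1 + TOL) * i for v, i in rails.values())

print(round(nominal_w, 2))            # 6.86 (watts, nominal)
print(round(worst_w - nominal_w, 3))  # 0.343 (watts of extra margin)
```

Designing the upstream supply to the worst-case figure guarantees it never runs out of headroom even when every convertor sits at the top of its tolerance band.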