One of the mysteries -- or at least the cause of considerable confusion -- of successive approximation register (SAR) ADCs is calculating their exact power requirements at the system level. I have found that data sheets can be a bit tricky and very frustrating on this spec.
SAR ADCs provide a low-power means of measuring input signals, and their power consumption very often scales with the sample rate, making for a very efficient measurement system. To calculate the total power consumption of the ADC, however, all supply pins need to be taken into account.
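To see what "power scales with sample rate" looks like in practice, here is a minimal sketch. All numbers (full-speed rate, dynamic power, static floor) are hypothetical placeholders, not taken from any particular data sheet:

```python
# Hypothetical SAR ADC power model: the dynamic portion scales
# linearly with throughput, on top of a small static floor.
# All numbers below are illustrative, not from a real data sheet.

def vdd_power_mw(throughput_sps, full_speed_sps=1e6,
                 dynamic_mw_at_full_speed=4.0, static_mw=0.05):
    """Estimate VDD power at a reduced sample rate."""
    return static_mw + dynamic_mw_at_full_speed * (throughput_sps / full_speed_sps)

# At 1/10th the sample rate, the dynamic power drops to ~1/10th:
print(round(vdd_power_mw(100e3), 3))  # 0.45 (mW)
```

This is why duty-cycling a SAR (converting in bursts, then idling) is such an effective power-saving technique: the dynamic term tracks the average throughput.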
Typically, a SAR converter has three potential power-consuming rails: the VDD supply, the reference input, and the digital interface IO supply.
The VDD supply provides power to the analog circuitry and ADC core.
On SARs that require an external reference, the reference input is a switched-capacitor input that draws charge current during the SAR conversion bit trials. This can be a significant source of power consumption, depending on the ADC throughput rate and the size of the internal capacitor DAC. The higher the ADC throughput, the more conversion bit trials (capacitor charging events) occur, and therefore the more current is consumed in the capacitive DAC array.
Similarly, a larger capacitor DAC means more capacitance to charge, which results in a higher current draw. If the cap DAC is large, it may pose a problem to the reference drive circuit, and a higher-power reference circuit may be required. The same is true for the analog input, where a more powerful driving amplifier may be required to drive the higher capacitive DAC load during acquisition. Sometimes additional circuitry related to the analog input can be powered from the reference, which can further add to power consumption. Some ADCs have an internal reference buffer that gives the reference input high impedance. In this scenario, the buffer supplies the necessary reference current through another supply pin.
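A quick back-of-envelope estimate shows how both throughput and DAC capacitance drive the reference current. This assumes, for simplicity, that the full DAC capacitance is recharged to VREF once per conversion; real parts differ in how much charge each bit trial actually draws, so treat the numbers as illustrative:

```python
# Rough estimate of average reference input current for a SAR with a
# switched-capacitor DAC. Simplifying assumption: the entire DAC
# capacitance is recharged to VREF once per conversion.

def ref_current_ua(c_dac_pf, vref_v, throughput_sps):
    q_per_conv = c_dac_pf * 1e-12 * vref_v    # charge per conversion (coulombs)
    return q_per_conv * throughput_sps * 1e6  # average current (microamps)

# Example: 50 pF DAC, 5 V reference, 1 MSPS:
print(round(ref_current_ua(50, 5.0, 1e6), 1))  # 250.0 (uA)
```

Doubling either the DAC capacitance or the throughput doubles the average reference current, which is exactly why a large cap DAC at high speed can demand a stronger reference drive circuit.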
The digital IO supply consumes power depending on the throughput/output data rate, as well as load conditions on the data output lines. Again, higher ADC throughput means more power consumed by the digital IO, due to the higher clock rates required to transfer the converted data. Any capacitive loading on the data output lines will increase the digital IO current because of charging and discharging. In high-throughput ADCs with high clock rates, the power consumed by the digital interface can become quite significant.
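The IO contribution follows the familiar dynamic-power relation P = C·V²·f for each toggling line. The load capacitance, IO voltage, and clock rate below are hypothetical, chosen only to show how quickly this term grows at high clock rates:

```python
# Digital IO power from charging/discharging the capacitive load on a
# data line: P = C * V^2 * f per line (worst case: every bit toggles).
# All values are illustrative.

def io_power_mw(c_load_pf, vio_v, sclk_hz, n_lines=1):
    # Worst case, each line toggles once per SCLK cycle.
    return n_lines * c_load_pf * 1e-12 * vio_v**2 * sclk_hz * 1e3

# Example: 20 pF load, 3.3 V IO, 50 MHz SCLK, one data line:
print(round(io_power_mw(20, 3.3, 50e6), 2))  # 10.89 (mW)
```

Note the square-law dependence on the IO voltage: running the interface at 1.8 V instead of 3.3 V cuts this term by more than a factor of three, which is one reason many modern SARs offer a separate low-voltage IO rail.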
Many data sheets quote power for the VDD supply only. You have to dig into the specification table to determine the reference and digital supply power requirements. To get an accurate measure of the power consumption from a system-level perspective, all three supplies need to be considered.
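Pulling it together, the system-level figure is just the sum of voltage times current over all three rails. The currents below are hypothetical stand-ins for the values you would dig out of a specification table:

```python
# System-level ADC power: sum the contributions of all three rails.
# The example currents are placeholders for data-sheet numbers.

def total_power_mw(vdd_v, ivdd_ma, vref_v, iref_ma, vio_v, iio_ma):
    return vdd_v * ivdd_ma + vref_v * iref_ma + vio_v * iio_ma

# e.g. VDD 2.5 V @ 2 mA, REF 5 V @ 0.25 mA, VIO 3.3 V @ 1 mA:
print(round(total_power_mw(2.5, 2.0, 5.0, 0.25, 3.3, 1.0), 2))  # 9.55 (mW)
```

In this made-up example the reference and IO rails together account for nearly half the total, which is exactly the kind of contribution a VDD-only headline number hides.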
It is a mystery to me why some data sheets don't take all the specs into consideration when stating a requirement as important as power consumption. Has anyone else experienced similar issues?