The Elusive 12-Bit D/A Converter

Look on any of the major analog manufacturers' websites and you'll find lots of 12-bit D/A converters. Do any of them really qualify as 12-bit devices? I mean really.

A little history is in order here.

The earliest commercially available D/A converters were plastic-packaged modules… typically 2 inches by 3 to 4 inches by ½ to 1 inch thick, containing a circuit board populated by discrete devices, sometimes hand-selected, or at best a few low-level ICs. They were big and bulky, but they were the only game in town. And many of them were 12-bit accurate over temperature, including gain, offset, linearity, and all other errors. Just to clarify, 1 least-significant-bit (LSB) at 12 bits is 1 part in 4096 or 0.024 percent.
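
For concreteness, here is that arithmetic as a small Python sketch (the 10 V full scale is purely an illustrative assumption):

    # 1 LSB for an N-bit converter is full_scale / 2**N.
    # The 10 V full scale here is only an illustrative assumption.
    def lsb_volts(full_scale_volts, bits):
        return full_scale_volts / (2 ** bits)

    for bits in (8, 12, 16):
        print(f"{bits}-bit: 1 LSB = {lsb_volts(10.0, bits) * 1e3:.3f} mV "
              f"({100.0 / 2 ** bits:.4f}% of full scale)")
    # 12-bit: 1 LSB = 2.441 mV (0.0244% of full scale)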

One company producing such devices was Analogic, headed by a brilliant engineer and entrepreneur named Bernie Gordon. Analogic was making a lot of money selling these converter modules, and wanted to protect their market. Bernie waged a marketing campaign in the trade press against the inevitably lower-priced monolithic D/A converters, telling anyone who would listen that it was not possible to produce a true 12-bit current-switching D/A converter in monolithic technology, and it would never be possible.

In the 1970s, various companies tried to produce D/A converters with 12-bit resolution and accuracy. In 1977, Donald Comer of Precision Monolithics, Incorporated (PMI, later acquired by Analog Devices) presented a paper at the ISSCC on a monolithic 12-bit D/A converter. The paper is available on the IEEE site if you are a member. In the corner of the chip, the metal mask included a sketch of a heart with a bite taken out of it, and the initials “B.G.” in the bite. The message was: “Eat your heart out, Bernie Gordon. We have done what you said was impossible.”

It did not take Bernie long to respond. In a letter to George Rostky, editor of Electronic Design magazine, the leading trade magazine of the day, Bernie made several points.

Bernie Gordon's paper describing his converter and a letter to George Rostky.
Reprinted with the kind permission of Analogic.

One was that the PMI D/A converter used an additional layer of thin-film resistors deposited on the chip, and that disqualified it from being a monolithic device. That was kind of a weak argument, since all ICs use a deposited layer of interconnect metal.

The next point was that the PMI D/A converter was not quite complete, needing a reference, an output amplifier for current-to-voltage conversion, and bypass capacitors. This was definitely true, and adding a 12-bit-accurate reference would require some discrete components and some kind of manufacturing adjustment.

However, Bernie made some valid points regarding temperature drift and long-term drift. Whether they were actually needed is arguable. Bernie's position was that if you needed a 12-bit-accurate device at the time a system was sold, you probably also needed a 12-bit-accurate system a year or two later. Of course, most instruments are re-calibrated and performance-certified periodically to take out the biggest errors, but re-calibrating the linearity of a laser-trimmed or Zener-zap-trimmed D/A converter isn't really feasible.

As time went on, the price-performance ratio of 12-bit IC D/A converters was too attractive to resist, the market decided they were “good enough,” and shifted away from the more-expensive modules. Analogic turned its attention to developing and manufacturing systems (ironically, many of which included the much-maligned IC converters).

I looked through a few websites to see if anyone has yet made a “true” 12-bit D/A converter… guaranteed over temperature, time, and including all error sources. I didn't find any. Some are pretty close, at least hitting the linearity specs, and even holding them over temperature. Even 16-bit linearity is being met by some devices. Getting the full-scale accurate to 0.024 percent with an on-chip reference is a lot harder.
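
To see why, a rough error budget helps. A quick sketch, assuming a 0-70°C commercial range and handing the on-chip reference the entire 1-LSB budget (an oversimplification, since the other error sources need their share too):

    # Rough budget: hold full scale within 1 LSB at 12 bits (0.024%)
    # from 0 to 70 C, giving the on-chip reference the whole budget.
    one_lsb_ppm = 1e6 / 4096            # ~244 ppm
    worst_swing_c = 70 - 25             # worst-case excursion from a 25 C cal point
    print(one_lsb_ppm / worst_swing_c)  # ~5.4 ppm/C allowed reference drift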

Will we ever get there? Was Bernie Gordon right in 1977?

Acknowledgments:
Dan Sheingold (ex-Analog Devices) and Walt Kester of Analog Devices provided some fact-checking and filled in a few details for this post.

21 comments on “The Elusive 12-Bit D/A Converter”

  1. eafpres
    August 15, 2013

    @Doug–interesting history review. Seemingly a lifetime ago, when I was a young engineer assigned to revamp data acquisition in a flow measurement laboratory at the National Bureau of Standards in Boulder, CO (now NIST), I installed an HP1000 real-time computer with loads of I/O and wrote all the software to not only acquire data from lots of sensors but also do all the calibrations. In our process, the majority of the sensors were 4-20 mA, and we put precision resistors inside the temperature-controlled control room (which also housed the computer) to convert to voltage, then sampled those with the A/D cards on the HP. We had to calibrate all the critical measurements every day before running any experiments. This included calibrating pressure sensors with actual loads on a piston, and some of the temperature sensors. I don't recall all the details now, but basically, absent a sensor malfunction, we calibrated out any slow drift and non-linearity every day. By keeping the current-to-voltage resistors in a controlled temperature environment, we minimized short-term drift there.

    It was a huge effort any day we wanted to run.  I recall going into the lab around 4:30 a.m. or so to get everything done so we could get the planned test runs in that day.
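
    A minimal sketch of that kind of daily two-point correction (the 250-ohm current-to-voltage resistor and the reference readings below are purely illustrative assumptions, not the NBS setup):

        # 4-20 mA loop through a precision sense resistor gives 1-5 V,
        # then a gain/offset correction from that morning's calibration run.
        R_SENSE = 250.0  # ohms (illustrative value)

        def volts_from_loop(current_ma):
            return current_ma * 1e-3 * R_SENSE

        # Fit a line mapping raw readings to true values from two known points.
        def two_point_cal(raw_lo, true_lo, raw_hi, true_hi):
            gain = (true_hi - true_lo) / (raw_hi - raw_lo)
            offset = true_lo - gain * raw_lo
            return lambda raw: gain * raw + offset

        correct = two_point_cal(raw_lo=1.003, true_lo=1.000,
                                raw_hi=4.990, true_hi=5.000)
        print(correct(volts_from_loop(12.0)))  # mid-scale point, corrected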

  2. Scott Elder
    August 16, 2013

    I think the last time I saw an ADC specification that reflected the quoted resolution was the AD574 at 0.5 LSB, a 30-year-old part.

    At one time, quoting resolution was quoting linearity. Today, one can buy a 16-bit ADC and invariably the only thing that is 16 bits is the size of the readout register.

    Earlier this year I compiled a list of the 50 most popular 16-bit SAR ADCs, and only ONE had a specification where INL = 0.5 LSB. The worst part had INL = 12 LSBs. Interestingly, the worst one was very close to the top of the popularity list, whereas the 0.5 LSB one was near the bottom.

    The prices for those same 50 parts looked like a random scatter plot, going from under $5 to over $20. The only correlating factor was that better INL meant higher price.

    I wonder how many engineers who specify 16-bit ADCs (or DACs) really understand what they are buying or what they need?  It seems if 16-bits meant 16-bits, there wouldn't be such a random spread on price, INL, etc.  The fact that there is, at least to me, means the decision process is filled with noise.
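
    One way to make this concrete: fold the INL spec back into a “linearity-limited” bit count, i.e., the resolution at which that same absolute error would shrink to the classic 1/2 LSB. A rough Python sketch of the rule of thumb (not any vendor's published spec):

        import math

        # Resolution at which an error of inl_lsb (in LSBs at `bits`)
        # would equal 1/2 LSB: bits - log2(2 * inl_lsb).
        def linearity_limited_bits(bits, inl_lsb):
            return bits - math.log2(2 * inl_lsb)

        print(linearity_limited_bits(16, 0.5))  # 16.0 -> honestly 16-bit linear
        print(linearity_limited_bits(16, 12))   # ~11.4 -> "16 bits" in name only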


  3. TheMeasurementBlues
    August 16, 2013

    Doug, Thanks for your article. Here in the Boston Area, Bernie is a legend. You could write a book of Bernie stories, starting with the boxing gloves in his office.

    My first job out of college was in the module days with Analog Devices. For my first assignment, I was asked to compare the schematics of two modules, one from ADI and one from Bernie (nobody ever used the name “Analogic”; it was always “Bernie”). My manager wanted to know if the two modules were schematically identical.

    After a time, I declared that the two had the same schematic. I then asked, “Who copied who?”

    “We copied them” was my manager's answer.

  4. goafrit2
    August 18, 2013

    >> Do any of them really qualify as 12-bit devices? I mean really.

    No, none of them will qualify. However, they are not wrong. In the specification of bits, everyone knows the key metrics that matter: INL, DNL, etc. That is what people look at in ascertaining the performance of any D/A. That said, I think some D/As do deliver 12-bit performance, though they may have more bits than 12.

  5. fasmicro
    August 18, 2013

    @Scott, I do not want to sound like a promoter of Analog Devices, but I have a lot of respect for their datasheets. They put a lot of effort into getting that document right.

    >>I think the last time I saw an ADC specification that reflected the quoted resolution was the AD574 at 0.5 LSB, a 30-year-old part.>>

    The key in these measurements is how the firm functions. If the engineers, and not the marketing team, develop the datasheet and have the final responsibility, you are sure to get factual numbers. Marketing can quote anything and justify it. Some will average the last few bits in an ADC and give you a datasheet that looks high-performing until you put the part into operation.

  6. fasmicro
    August 18, 2013

    Good point: >>It seems if 16-bits meant 16-bits, there wouldn't be such a random spread on price, INL, etc.>> What is quoted most times is the readout register, not the actual performance of the system. Yet any good engineer with experience can see within the datasheet what matters most. If they report INL and DNL of more than +/- 0.5 LSB, you cannot be talking about the quoted bits “as is”.

  7. Vishal Prajapati
    August 19, 2013

    I have worked on the ADE7758 IC from Analog Devices. I saw that the internal result of the ADC was some 52 bits long, but the final result register is just 24 bits. So there must be some averaging or linearizing circuit present which ends up giving just a 24-bit result from 52 bits.
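
    A generic sketch of how such a reduction can work, assuming plain boxcar averaging (this is the textbook decimation idea, not the ADE7758's actual filter):

        import random

        # Average m oversampled readings; with uncorrelated noise, each
        # doubling of m buys roughly half a bit of extra resolution.
        def averaged_reading(sample_fn, m):
            return sum(sample_fn() for _ in range(m)) / m

        def noisy_adc():  # hypothetical 12-bit ADC with ~1 LSB rms noise
            return 2048 + random.gauss(0.0, 1.0)

        # 256 samples: noise drops by sqrt(256) = 16x, about 4 extra bits.
        print(averaged_reading(noisy_adc, 256))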

  8. SunitaT
    August 20, 2013

    There are a multitude of effective ways to lay out systems with 12-bit Analog-to-Digital (A/D) converters, and each layout is highly reliant on the number of devices in the circuit, the types of devices (digital or analog), and the environment the final product will reside in. Given all of these variables, it could easily be established that one successful layout that provides twelve noise-free bits from an analog signal may easily fail in another setting.

  9. jkvasan
    August 26, 2013

    Sunita,

    You are absolutely right. PCB layout and component positioning near the ADC or DAC can affect the performance of the chip in question. A PCB layout with properly defined analog and digital grounds can get better results from the same chip that would otherwise perform badly.

  10. Brad_Albing
    August 26, 2013

    @JK – and that's another reason why those of us who are analog design engineers will always be employable — knowing the special tricks (like grounding) to maximize performance.

  11. jkvasan
    August 26, 2013

    @BA,

    Even for embedded engineers, a clear understanding of grounds is necessary, as MCUs are increasingly becoming analog intensive. We now see separate ground pins for analog, digital, PLL, Vref, etc.

    My favorite pastime is to look at how PCB tracks are routed in evaluation boards. Many chip companies provide clear guidelines on this.

  12. Brad_Albing
    August 27, 2013

    @JK – I agree – I used to advise customers (at a previous job) that if they studied the eval boards, they would get a clear idea of how to do their own PC board layout.

  13. jkvasan
    August 27, 2013

    @BA,

    That may be true for analog IC evaluation boards. However, my experience with mixed-signal MCU evaluation boards shows the PCB design is not always the best of what is expected. The eval board designs are just that: they allow evaluation of each of the features. If one would like to do some signal processing with an eval board, though, the experience could be far from pleasant.

  14. fasmicro
    August 30, 2013

    >> So, there must be some averaging or linearizing circuit present which ends up giving just a 24-bit result from 52 bits.

    But that is how most commercial ADCs are designed. You average the lower bits by having more bits than required. But even the 24 bits you noted may actually be marketing bits. You get about 18-20 bits in ENOB and INL.
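
    For reference, ENOB is normally backed out of a measured SINAD with the standard formula; a one-line Python sketch (the SINAD figures below are made up for illustration):

        # ENOB = (SINAD - 1.76 dB) / 6.02 dB per bit
        def enob(sinad_db):
            return (sinad_db - 1.76) / 6.02

        print(enob(74.0))   # ~12 effective bits
        print(enob(110.0))  # ~18 effective bits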

  15. fasmicro
    August 30, 2013

    >> it could easily be established that one successful layout that provides twelve noise-free bits from an analog signal may easily fail in another setting

    A 12-bit ADC may not need a lot of heroic design to realize. I have done that easily with a pipelined ADC. Even a SAR can give you a 12-bit system. The challenge comes when you move into the domain of 16 bits, where the resolution becomes limited by op-amp performance.

  16. fasmicro
    August 30, 2013

    @Sunita, PCB design is one of the most overlooked areas where inexperienced project managers fail. We spend all our time in college teaching ASIC, FPGA, and system development, with limited time on PCBs. Unfortunately, it is unlikely you can get any chip into operation without a board. So board design matters in the business.

  17. fasmicro
    August 30, 2013

    @Brad, yes, I agree that analog designers have more job security than digital ones. Generally, the disruption in digital is not always possible in analog, as you cannot automate much of analog design.

  18. fasmicro
    August 30, 2013

    >> My favorite pastime is to look at how PCB tracks are routed in evaluation boards. Many chip companies provide clear guidelines on this.

    There are some nice application notes from the likes of Analog Devices, Maxim, and TI that can also help.

  19. Vishal Prajapati
    August 30, 2013

    If 50% of the ADC resolution were lost in the averaging, then what about the 10- and 12-bit ADCs built inside microcontrollers? Do they give only 6-bit resolution? That is not the case.

  20. fasmicro
    September 16, 2013

    >> So, there must be some averaging or linearizing circuit present which ends up giving just a 24-bit result from 52 bits

    That must be truly excessive averaging. Getting 24 bits from 52 seems over the top. A good idea would have been to keep the MSBs (the first ten or so) and then use about another 32 to get the remaining bits. That way you save power and real estate on silicon.

  21. fasmicro
    September 16, 2013

    >> If 50% of the ADC resolution is lost in the averaging, then what about the 10- and 12-bit ADCs built inside microcontrollers?

    The commenter might have pointed out that when you average, you lose some effective bits, of course. Good designers average at the LSBs, which do not have as much impact on system performance as the MSBs.
