Look on any of the major analog manufacturers' websites and you'll find lots of 12-bit D/A converters. Do any of them really qualify as 12-bit devices? I mean really.
A little history is in order here.
The earliest commercially available D/A converters were plastic-packaged modules… typically 2 inches by 3 to 4 inches by ½ to 1 inch thick, containing a circuit board populated by discrete devices, sometimes hand-selected, or at best a few low-level ICs. They were big and bulky, but they were the only game in town. And many of them were 12-bit accurate over temperature, including gain, offset, linearity, and all other errors. Just to clarify, 1 least-significant-bit (LSB) at 12 bits is 1 part in 4096 or 0.024 percent.
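If you want that arithmetic spelled out, here is a minimal sketch in Python (the 10 V full-scale range is just an assumption for illustration):

```python
# One LSB at N bits is full scale divided by 2**N.
N_BITS = 12
FULL_SCALE_V = 10.0  # assumed full-scale range, for illustration only

lsb_fraction = 1.0 / (2 ** N_BITS)        # 1/4096
lsb_percent = 100.0 * lsb_fraction        # ~0.024 %
lsb_ppm = 1e6 * lsb_fraction              # ~244 ppm
lsb_volts = FULL_SCALE_V * lsb_fraction   # ~2.44 mV on a 10 V range

print(f"1 LSB = 1/{2 ** N_BITS} of full scale = {lsb_percent:.4f} % "
      f"= {lsb_ppm:.0f} ppm = {lsb_volts * 1e3:.2f} mV")
```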
One company producing such devices was Analogic, headed by a brilliant engineer and entrepreneur named Bernie Gordon. Analogic was making a lot of money selling these converter modules and wanted to protect its market. Bernie waged a marketing campaign in the trade press against the inevitable, lower-priced monolithic D/A converters, telling anyone who would listen that it was not possible to produce a true 12-bit current-switching D/A converter in monolithic technology, and that it never would be possible.
In the 1970s, various companies tried to produce monolithic D/A converters with 12-bit resolution and accuracy. In 1977, Donald Comer of Precision Monolithics, Incorporated (PMI, later acquired by Analog Devices) presented a paper at the ISSCC on a monolithic 12-bit D/A converter. The paper is available on the IEEE site if you are a member. In the corner of the chip, the metal mask included a sketch of a heart with a bite taken out of it, and the initials “B.G.” in the bite. The message was: “Eat your heart out, Bernie Gordon. We have done what you said was impossible.”
It did not take Bernie long to respond. In a letter to George Rostky, editor of Electronic Design, the leading trade magazine of the day, Bernie made several points.
[Bernie Gordon's letter to Electronic Design; reprinted with the kind permission of Analogic.]
One was that the PMI D/A converter used an additional layer of thin-film resistors deposited on the chip, and that disqualified it from being a monolithic device. That was kind of a weak argument, since all ICs use a deposited layer of interconnect metal.
The next point was that the PMI D/A converter was not quite complete, needing a reference, an output amplifier for current-to-voltage conversion, and bypass capacitors. This was definitely true, and adding a 12-bit-accurate reference would require some discrete components and some kind of manufacturing adjustment.
However, Bernie made some valid points regarding temperature drift and long-term drift. Whether such tight drift specifications were actually needed is arguable. Bernie's position was that if you needed a 12-bit-accurate device at the time a system was sold, you probably also needed a 12-bit-accurate system a year or two later. Of course, most instruments are re-calibrated and performance-certified periodically to take out the biggest errors, but re-calibrating the linearity of a laser-trimmed or Zener-zap-trimmed D/A converter isn't really feasible.
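To illustrate why field re-calibration doesn't rescue linearity, here's a small Python sketch of my own (the error numbers are made up): a two-point zero/full-scale calibration removes gain and offset errors in software, but the mid-scale bow of integral nonlinearity set at trim time stays put.

```python
# Illustrative only: a two-point (zero / full-scale) field calibration
# removes gain and offset errors, but cannot touch integral nonlinearity
# (INL), which is fixed when the part is laser- or Zener-zap-trimmed.

N_BITS, V_FS = 12, 10.0                               # assumed 12-bit, 10 V full-scale DAC
GAIN_ERR, OFFSET_V, INL_BOW_V = 1.001, 0.005, 0.001   # made-up error terms

def dac_output(code):
    """Model of an imperfect DAC: gain error, offset, and a mid-scale INL bow."""
    x = code / (2 ** N_BITS)
    ideal = V_FS * x
    return GAIN_ERR * ideal + OFFSET_V + INL_BOW_V * 4 * x * (1 - x)

def two_point_cal(meas_zero, meas_full, ideal_zero, ideal_full):
    """Derive software gain/offset corrections from two measured points."""
    gain = (ideal_full - ideal_zero) / (meas_full - meas_zero)
    return gain, ideal_zero - gain * meas_zero

gain, offset = two_point_cal(dac_output(0), dac_output(4095),
                             0.0, V_FS * 4095 / 4096)
mid = gain * dac_output(2048) + offset   # corrected mid-scale reading
print(f"Mid-scale error after calibration: {(mid - V_FS / 2) * 1e3:+.2f} mV")
```

The gain and offset terms drop out almost completely; the roughly 1 mV bow at mid-scale survives the calibration untouched.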
As time went on, the price-performance ratio of 12-bit IC D/A converters became too attractive to resist; the market decided they were “good enough” and shifted away from the more expensive modules. Analogic turned its attention to developing and manufacturing systems (ironically, many of which included the much-maligned IC converters).
I looked through a few websites to see if anyone has yet made a “true” 12-bit D/A converter… guaranteed over temperature and time, with all error sources included. I didn't find any. Some are pretty close, at least hitting the linearity specs, and even holding them over temperature. Even 16-bit linearity is being met by some devices. Getting the full-scale output accurate to 0.024 percent with an on-chip reference is a lot harder.
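As a rough back-of-the-envelope illustration (my numbers and assumed conditions, not from any datasheet), even if the entire 1-LSB budget went to reference drift alone, the on-chip reference would need a tempco in the single-digit ppm/°C range:

```python
# Rough error-budget sketch with assumed conditions: the whole 12-bit LSB
# budget is handed to reference temperature drift, calibrated at 25 degC
# and used up to 70 degC, with every other error source ignored.

budget_ppm = 1e6 / 4096            # ~244 ppm, one 12-bit LSB of full scale
temp_excursion_c = 70.0 - 25.0     # assumed worst-case swing from the cal point
max_tempco = budget_ppm / temp_excursion_c

print(f"{budget_ppm:.0f} ppm budget over {temp_excursion_c:.0f} degC "
      f"-> reference drift must stay under {max_tempco:.1f} ppm/degC")
```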
Will we ever get there? Was Bernie Gordon right in 1977?
Dan Sheingold (ex-Analog Devices) and Walt Kester of Analog Devices provided some fact-checking and filled in a few details for this post.