I was at the doctor's office a few weeks ago, and as you'd expect, there was a row of those standard light boxes (see Figure 1) hanging along the office walls. But then I realized that all of them, except one, were covered with announcements, bulletins, and personal notes. In other words, these once-critical units were no longer needed; they had served their purpose, and that purpose had come to an end (although one light box was left clean and clear; I suppose it was kept for “just in case” situations).
We've all seen the ubiquitous visual cliché in movies and TV with a doctor putting the X-ray film onto a light box, to clearly explain to others the ailment (or cause of death), and so advance the plot line. But just as conventional cameras using film have been demolished by digital photography, the film-based X-ray is going the way of corded phones and the dial tone.
Why go digital? Assuming the costs are comparable to film-based X-rays, and the medically mandated imaging performance is met, digital X-ray imaging has multiple tangible advantages: no consumables (film and processing kits); no environmental (chemical) issues; instantaneous availability; lower X-ray exposure; the potential for further processing to enhance images and reduce artifacts; images that are easy to store, retrieve, and share; no misfiled, misplaced, or lost films; and reduced need for physical storage space.
All this is possible largely due to the components and ICs, especially the high-performance, highly integrated analog front ends (AFEs), that have become available in the past few years. In addition to the X-ray detector array at the very front end, and the video display at the user end, there's the signal-processing chain of the analog/digital (A/D) converter, processor (usually dedicated FPGAs), and memory, all supported by a sophisticated and complex power-management subsystem, of course. With highly integrated technology, all this can be packed into a modest, portable, battery-powered box — a real plus for mobility (bring it to the patient, not vice versa) and for use in remote or hostile situations.
The A/D converters take the weak signal from the detector array (composed of photodiodes and transistors, somewhat similar to a CCD sensor), which can be configured as a single line array or a two-dimensional flat panel, depending on the machine and application. While the converters do not need to operate at blazing RF speeds and bandwidths, they must provide samples at a fairly high rate and resolution, and with very low nonlinearity (which affects gray-scale performance).
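To see why nonlinearity matters for gray-scale accuracy, here's a minimal sketch of how integral nonlinearity (INL) shifts the output code of an otherwise ideal converter. The parabolic "bow" error model and the 4-LSB peak INL below are hypothetical illustrations, not the transfer curve of any particular device:

```python
# Sketch: how ADC integral nonlinearity (INL) distorts gray-scale values.
# The parabolic "bow" and the assumed 4-LSB peak INL are hypothetical
# illustrations, not the behavior of any specific converter.
BITS = 12
CODES = 2 ** BITS
PEAK_INL_LSB = 4.0  # assumed worst-case INL, in LSBs

def ideal_code(v, full_scale=1.0):
    """Output code of an ideal ADC for input voltage v."""
    return min(CODES - 1, int(v / full_scale * CODES))

def bowed_code(v, full_scale=1.0):
    """Output code with a parabolic INL error that peaks at mid-scale."""
    x = v / full_scale                      # normalized input, 0..1
    error = PEAK_INL_LSB * 4 * x * (1 - x)  # bow, maximum at x = 0.5
    return min(CODES - 1, int(x * CODES + error))

# A mid-scale input lands several codes away from the ideal value --
# in an image, that is a visible shade error in a smooth gradient.
v = 0.5
print(ideal_code(v), bowed_code(v))  # prints 2048 2052
```

The point is that even a few LSBs of INL translate directly into gray-level errors, which is why digital X-ray converters specify nonlinearity so tightly.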
[Note: These systems also require D/A converters for various control functions, but their technical requirements are much less stringent.]
Although the specific numbers depend on the system front-end architecture and how the images are processed, a starting point for the converter is resolution of 12 to 18 bits, sampling times of several hundred nsec or less (>2 Msps), dynamic range of >100 dB, high-speed data transfer, and multichannel operation to save space and power. The required sampling rate may be somewhat higher or lower depending on whether the X-ray system is intended for static imaging (broken bones) or dynamic imaging (cardiovascular studies).
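These numbers hang together: the textbook ideal SNR of an N-bit converter is 6.02N + 1.76 dB, so reaching >100 dB of dynamic range already points toward the upper end of that 12-to-18-bit range. A quick sketch makes the trade-off concrete (the 2 V full-scale range used for the LSB calculation is an assumption for illustration, not a spec from any particular part):

```python
def ideal_snr_db(bits: int) -> float:
    """Ideal SNR of an N-bit ADC, quantization noise only: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

def lsb_volts(full_scale_v: float, bits: int) -> float:
    """Size of one LSB for a given full-scale input range."""
    return full_scale_v / (2 ** bits)

# Assumed 2 V full-scale range, for illustration only.
for bits in (12, 14, 16, 18):
    print(f"{bits}-bit: ideal SNR ~{ideal_snr_db(bits):.1f} dB, "
          f"LSB = {lsb_volts(2.0, bits) * 1e6:.2f} uV")
```

Even in this idealized (quantization-noise-only) view, 16 bits tops out near 98 dB, so hitting >100 dB of dynamic range requires roughly 17 bits or more, before real-world noise and nonlinearity eat into the budget.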
Leading A/D converter vendors such as Analog Devices, Texas Instruments, and Maxim (to cite just a few) have targeted the digital X-ray market with devices that offer the appropriate combination of specifications in the key parameters unique to this application. This is achieved by packing multichannel front ends and converters into small, low-power devices whose performance meets the complex and stringent application requirements.
I don't envy the situation of vendors of those traditional light boxes, but there's no holding back the technology in these cases — and digital imaging will only get better, as the components and algorithms (yes, it takes those, as well) get better. The integrated, application-specific AFE has done them in, just as the image sensor and ASIC have killed the venerable film-based camera for snapshots and professional picture-taking.
Are there any other historically long-lasting and significant technologies which integrated AFEs have “destroyed”? Even more intriguing: are there any applications where you don’t see that happening, at least for a long while?