When discussing the integration of analog and RF into digital systems-on-chip, it is usually the analog and RF portions that emerge as the most important factors. However, changes in digital processing are shifting that focus. During the past several years, designers have been working to simplify analog and RF systems by transferring resources and operations to the digital domain, particularly in wireless basestation design.
In wireless, the transition from narrowband 1G systems (for example, AMPS dedicated 30 kHz per user) to 2G (Global System for Mobile Communications shares 200 kHz among eight time-division multiple-access slots) to 3G (5 MHz among many, many sessions) steadily reduces the need for analog filters by shifting filtering, channel selection and processing to the digital signal processor (DSP). In the past such a change was not viable, but the steady decline in $/MIPS has made it economical.
While the increased performance of DSPs assists the analog portions, the reverse is also true: higher-order modulation makes the design more sensitive to analog impairments, for example, so better analog performance in turn assists the digital processing.
Wideband code-division multiple access (W-CDMA) requires approximately 100 times the DSP performance of GSM. So a GSM design supporting 64 users would require eight transceiver units; that is, eight separate power amplifiers, expensive sets of analog filters for adjacent-channel rejection, separate intermediate-frequency (IF) and RF strips, and separate data converters. In contrast, W-CDMA can do the same with only one RF system, with a consequent reduction in cost. All the separation and channelization is done digitally.
The cost of a fully loaded basestation channel is shown (see Figure). In the early years, GSM was far cheaper, partly because the analog parts cost less than the digital equivalent. In time, however, the savings from Moore's Law for DSP drive down the 3G costs in a way that is not available to 2G, where the costs for analog components decline more slowly. As a result, the “cost per Erlang” of 3G is much lower than 2G. Incidentally, this is a major reason why the technology will succeed; new services are a bonus, but increased efficiency and reduced cost for voice are the “killer app.”
While developers are shaping new standards specifically to capitalize on the benefits of 3G, they can apply it to older standards, too. For example, while GSM is inherently a narrowband protocol, it is possible to deploy a multichannel system using a wideband approach, consolidating a number of channels into one using a wideband analog-to-digital converter (ADC) and digital-to-analog converter (DAC), a single RF stage and a multichannel power-amplifier (MCPA). Then, between the multichannel converter and the separate per-channel signal processing, a digital filtering stage will separate the distinct channels.
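The digital channelization step described above can be sketched in a few lines. The following is a minimal illustration, not a GSM-accurate design: the sample rate, channel frequencies and filter parameters are all invented for the example. A wideband capture containing two channels is split by mixing each channel to baseband with a complex exponential and low-pass filtering, which is the core of a digital downconverter.

```python
import numpy as np

fs = 1_000_000                       # wideband sample rate (illustrative only)
t = np.arange(4096) / fs

# Wideband capture containing two "channels": tones at 100 kHz and 300 kHz
wideband = np.cos(2 * np.pi * 100e3 * t) + np.cos(2 * np.pi * 300e3 * t)

def extract_channel(x, f_center, fs, num_taps=129, cutoff=50e3):
    """Digital downconversion: mix the chosen channel to baseband,
    then low-pass filter to reject the neighbors."""
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_center * n / fs)   # complex mix
    # Windowed-sinc low-pass FIR filter with unity DC gain
    taps = np.sinc(2 * cutoff / fs * (np.arange(num_taps) - (num_taps - 1) / 2))
    taps *= np.hamming(num_taps)
    taps /= taps.sum()
    return np.convolve(baseband, taps, mode='same')

ch1 = extract_channel(wideband, 100e3, fs)   # keeps the 100-kHz channel
ch2 = extract_channel(wideband, 300e3, fs)   # keeps the 300-kHz channel
```

Each output retains only its own channel's energy (a near-constant complex envelope of magnitude 0.5 here), with the other channel rejected by the filter; in a real basestation this stage would run per channel between the wideband converters and the per-channel signal processing.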
This implies that for a limited number of protocols, using dedicated, optimized logic is most efficient. As the number of standards rises, though, a flexible architecture will prove to be more economical. The breakeven is probably around three: In other words, for one or perhaps two different protocols, the cost of the all-digital approach may exceed that of a “more efficient” dedicated and part-analog system, but if you wish to support more (say, W-CDMA, GSM, Bluetooth and WiFi), then having a degree of programmability will prove more efficient. Ultimately, there may be a requirement to support approximately 14 air interfaces, comprising different flavors of 2G and 3G, different wireless local-area network variants, Bluetooth, GPS, WiMax and the like: the endgame of “Radio Free Intel.”
In addition to ensuring multimode capability, a second argument exists for using flexible digital sections: The new standards are not static. Indeed, the greater complexity of the designs inevitably means the standards are changing rapidly. In the W-CDMA world, in just four years we will have seen four versions of the standard. We've already seen prerelease 99 (Freedom of Mobile Multimedia Access, for example), formal Release 99, Release 4 and, being deployed next year, Release 5 with its hugely important high-speed downlink packet-access mode.
Similarly, in WiFi we have seen 802.11b, .11g and, soon to come, .11n (all with associated media-access-control-layer changes). Developing a more cost-efficient architecture is desirable, but only if it offers a usable life span, which means flexibility and the ability to be updated are mandatory.
Newer technologies like W-CDMA and orthogonal frequency-division multiplexing share a couple of interesting properties. First, as discussed, they shift functionality from analog to digital. Second, the desire for improved performance and bandwidth efficiency tends to increase the complexity of the modulation; as a result, these signals are far closer to Gaussian than the previous generation of constant-envelope waveforms. This stems from the increased depth of modulation (from QPSK to 16-QAM or beyond) and the use of multitone modulation or CDMA. However, there is a price to pay: While bandwidth efficiency increases, power efficiency drops.
For simple, constant-envelope technologies like GSM and Bluetooth, you can obtain 40 percent efficiency in a power amplifier. Complex modulation, though, demands a very linear amplifier to keep distortion low, and to achieve linearity you must run the amplifier inefficiently. The high peak-to-average ratio makes this worse, requiring that you “back off” the PA so it is always in its linear range, sacrificing still more efficiency. In effect, you have traded efficiency in transmission (more bits per second per hertz per cell, enabling fewer basestations and higher rates) against circuit efficiency (burning far more heat and power).
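The peak-to-average difference between the two signal classes is easy to demonstrate numerically. The sketch below is illustrative, not tied to any particular air interface: a constant-envelope phase-modulated signal has a peak-to-average power ratio (PAPR) of exactly 0 dB, while a multitone signal of the kind described above sums toward a Gaussian envelope (central limit theorem) and shows a PAPR several dB higher, which is what forces the PA backoff.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

n = np.arange(8192)

# Constant-envelope signal (GSM/Bluetooth-like): all information in the phase,
# so |x| = 1 everywhere and the PA can run near saturation.
constant_env = np.exp(1j * np.cumsum(rng.choice([-0.3, 0.3], size=len(n))))

# Multitone signal (OFDM-like): 64 unit subcarriers with random phases,
# normalized to unit average power. Occasional constructive addition
# produces large peaks.
phases = rng.uniform(0, 2 * np.pi, 64)
multitone = sum(np.exp(1j * (2 * np.pi * k * n / 8192 + phases[k]))
                for k in range(64)) / np.sqrt(64)

print(papr_db(constant_env))   # 0 dB: peak power equals average power
print(papr_db(multitone))      # several dB higher: demands PA backoff
```

Every dB of PAPR translates roughly into a dB of required backoff, which is why the multitone waveform, not the amplifier itself, sets the efficiency ceiling.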
For 3G this efficiency could be as low as 3 percent: a 20-watt power amplifier burning 700 W of heat. Heat causes failures; it needs air conditioning, the lack of which is a prime failure point; and you are paying for electricity that you don't actually want to use.
Traditionally, analog techniques have been used to address this linearity requirement: ultraprecise transistors with very careful matching and incredibly clever process techniques improve device linearity.
More recently, digital predistortion (DPD) has been used to replace these inefficient techniques, boosting efficiency by 20 percent, possibly more. It works by allowing the PA to be nonlinear, so it can run more efficiently and closer to saturation, while adding DSP resources to model that nonlinearity, predicting it and “reversing” it. In effect, you feed in a deliberately “wonky” signal, knowing that the nonlinearity will transform it into the output desired all along.
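The principle can be sketched with a toy model. Real DPD uses adaptive models (often polynomials with memory) identified from PA measurements; here the PA is assumed to be a simple memoryless cubic compression with an invented coefficient, and the predistorter inverts it by fixed-point iteration, so the “wonky” drive signal comes out of the amplifier looking like the signal you wanted.

```python
import numpy as np

def pa(x, a=0.1):
    """Toy memoryless PA model: gain compression that grows with
    signal amplitude (the coefficient is illustrative, not measured)."""
    return x - a * x * np.abs(x) ** 2

def predistort(x, a=0.1, iterations=3):
    """Invert the PA model: find the drive u such that pa(u) = x,
    by iterating the rearranged equation u = x + a*u*|u|^2."""
    u = x.copy()
    for _ in range(iterations):
        u = x + a * u * np.abs(u) ** 2
    return u

# Desired output samples across the amplitude range (complex baseband)
x = np.linspace(0.1, 0.9, 50) * np.exp(1j * 0.3)

y_raw = pa(x)               # compressed: deviates from x at high amplitude
y_dpd = pa(predistort(x))   # predistorted drive: output tracks x closely
```

The raw output falls short of the desired signal at high amplitude (gain compression), while the predistorted drive recovers it almost exactly; in a deployed system the model coefficients would be estimated and updated continuously from a feedback receiver rather than assumed.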
Once more, you are using digital technology and DSP to replace careful analog design and reduce costs, benefiting from cheaper transistors as geometries shrink and from Moore's Law's improving price/performance.
About the author
Rupert Baines (email@example.com) is vice president of marketing at picoChip Designs Ltd. (San Francisco).