(Editor's note: Signal Chain Basics is an ongoing (and popular) series; you can click here for a complete, linked list of installments #1 through #55 of the series, and here for all installments.)
In previous articles we discussed clock jitter basics, including the components comprising total jitter and how jitter impacts the performance of high-speed links. In this installment we discuss the relationship between jitter and phase noise. This lays the foundation for future discussions on clocking data converters.
Time domain and frequency domain
Figure 1 depicts the nature of the information provided by measuring a signal in the time domain vs. the frequency domain. Both provide insight into the content of the signal and possible approaches to optimize the signal-to-noise ratio (SNR). It is important to understand where in the frequency spectrum the noise content of a signal resides because a system is susceptible to performance degradation due to noise within a specific bandwidth. Frequency domain measurements provide this insight.
Figure 1 : Time domain vs. frequency domain measurements.
What is phase noise?
Phase noise is a frequency domain measurement that is the power spectral density of a signal’s phase. To better understand the definition of phase noise, let’s consider how it is measured. Figure 2 shows a typical phase noise measurement setup in which a clock oscillator is connected to a spectrum analyzer.
Figure 2: Phase noise measurement.
A phase noise measurement has the following characteristics:
- The spectrum is considered symmetrical about the carrier frequency (fC); therefore, only half (one side) of the spectrum is evaluated. This is called 'single-sideband' phase noise.
- It is measured within a 1 Hz bandwidth. It is assumed that the power level is constant within this bandwidth. Therefore, phase noise is a power spectral density.
- It is measured relative to the signal’s power at frequency fC and is expressed in dBc/Hz.
- It is measured at various frequency offsets relative to the clock frequency. Some datasheets list values at only a few offsets, while others provide a full phase noise plot as shown in Figure 2.
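To make the dBc/Hz unit concrete, the following short Python sketch converts a phase noise reading into a linear power ratio. The function name and the -120 dBc/Hz example value are illustrative, not taken from a specific datasheet.

```python
def dbc_hz_to_linear(l_dbc_hz):
    """Convert a phase noise value in dBc/Hz to a linear power ratio:
    the noise power in a 1 Hz bandwidth relative to the carrier power."""
    return 10 ** (l_dbc_hz / 10.0)

# Example (hypothetical value): -120 dBc/Hz means the noise power in a
# 1 Hz bandwidth at that offset is 10^-12 of the carrier power.
ratio = dbc_hz_to_linear(-120.0)
```

In other words, a 10 dB improvement in phase noise corresponds to a 10x reduction in noise power at that offset.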
Phase noise and jitter
Figure 3 shows the formulas needed to convert phase noise to RMS jitter. RMS phase error is determined by calculating the area under the single-sideband phase noise plot L(f), integrated between two frequency limits, f1 and f2. These limits are not arbitrary; they are determined by the characteristics of the system being designed.
Figure 3: Phase noise to jitter calculation.
Once RMS phase error is determined, the value is scaled according to the second equation in Figure 3. It is important to note that whenever an RMS jitter value is referenced, the carrier frequency, the value in dBc/Hz, and the noise integration bandwidth all must be specified for the parameter to be meaningful.
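The two-step calculation described above can be sketched in Python. This is a simplified illustration, assuming the standard relationships RMS phase error = sqrt(2 * integral of 10^(L(f)/10) df) and RMS jitter = phase error / (2*pi*fC); the function name, the trapezoidal integration, and the example numbers are assumptions for demonstration, not the exact method of Figure 3.

```python
import math

def rms_jitter_seconds(f_carrier_hz, offsets_hz, lf_dbc_hz):
    """Estimate RMS jitter from single-sideband phase noise L(f).

    offsets_hz  -- frequency offsets f1..f2 in ascending order (Hz)
    lf_dbc_hz   -- L(f) at each offset, in dBc/Hz
    """
    # Convert each dBc/Hz value to a linear power ratio per Hz.
    linear = [10 ** (l / 10.0) for l in lf_dbc_hz]

    # Piecewise trapezoidal integration of L(f) between f1 and f2.
    area = 0.0
    for i in range(len(offsets_hz) - 1):
        df = offsets_hz[i + 1] - offsets_hz[i]
        area += 0.5 * (linear[i] + linear[i + 1]) * df

    # Factor of 2 accounts for both sidebands; result is RMS phase
    # error in radians.
    phase_error_rad = math.sqrt(2.0 * area)

    # Scale phase error to time jitter using the carrier frequency.
    return phase_error_rad / (2.0 * math.pi * f_carrier_hz)

# Example (hypothetical plot): a flat -150 dBc/Hz from 12 kHz to
# 20 MHz on a 100 MHz clock yields roughly 0.32 ps RMS jitter.
jitter = rms_jitter_seconds(100e6, [12e3, 20e6], [-150.0, -150.0])
```

Note that this also illustrates why the integration bandwidth must accompany any quoted RMS jitter number: widening or narrowing the f1-f2 limits changes the integrated area, and therefore the jitter value.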
By understanding random jitter and phase noise, you will be better prepared to explore how clock jitter impacts the performance of analog-to-digital (ADCs) and digital-to-analog converters (DACs), which we will cover at a future date.
Meanwhile, please join us next time when we will discuss RS232-to-RS485 converters used for industrial long-haul communication.
About the Author
is the Manager of Market Development and Systems Engineering for the Clocks and Timing Group of Texas Instruments. John has 30 years of experience in the electronics industry and has worked in the fields of product development, marketing, systems engineering, and business management. He holds an MSEE from Purdue University.