Phase Noise, Amplitude and TOI Measurement Errors


Bob Stern

Keysight Technologies Inc.

formerly Agilent Technologies electronic measurement business

Santa Rosa, Calif.

Do you rely on accurate measurement of phase noise, amplitude or third order intercept (TOI) as part of your work? Would it surprise you to find out that the accuracy of these and other critical measurements may not be what you expected, depending upon how your instruments are calibrated? If you submit your instruments for calibration and just assume good things will happen, your carefully constructed system accuracy budget could be ruined by instruments operating out of specification! Sounds bad – and it is! Unfortunately, what constitutes a proper calibration, and its importance to everyday measurement accuracy, is seldom taught in electrical engineering classes. This article illustrates how the accuracy of key measurements is directly related to specific calibration deliverables.

State-of-the-art phase noise measurements generally have to be performed with a dedicated phase noise test set. While still the only viable method for truly low-noise sources, this technique is time-consuming and requires significant technician skill. Fortunately, the local oscillators employed in modern spectrum analyzers frequently have sufficiently low phase noise to allow direct measurement of source phase noise. Figure 1 illustrates the progression of improved phase noise performance in recently introduced spectrum analyzers.
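The arithmetic behind such a direct measurement is straightforward. As a rough sketch (not drawn from the article), the following Python snippet converts a carrier-peak reading and a noise reading at some offset into single-sideband phase noise in dBc/Hz. The noise-bandwidth ratio and the log-averaging correction shown are typical assumed values and vary by analyzer; a built-in noise marker applies equivalent corrections automatically.

    import math

    def ssb_phase_noise_dbc_hz(p_carrier_dbm, p_noise_dbm, rbw_hz,
                               enbw_ratio=1.06, log_avg_correction_db=2.5):
        """Estimate L(f) in dBc/Hz from a carrier peak and a noise reading at an offset."""
        noise_bw_db = 10 * math.log10(enbw_ratio * rbw_hz)   # normalize to a 1 Hz bandwidth
        return (p_noise_dbm - p_carrier_dbm                  # level relative to carrier (dBc)
                - noise_bw_db                                 # remove the RBW noise bandwidth
                + log_avg_correction_db)                      # undo log/average under-response

    # Example: 0 dBm carrier, noise reads -85 dBm in a 1 kHz RBW at the offset of interest
    print(ssb_phase_noise_dbc_hz(0.0, -85.0, 1e3))            # roughly -112.8 dBc/Hz

Of course, this estimate is only as good as the analyzer's own phase noise at that offset, which is exactly why the instrument's performance must be verified at calibration.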

The phase noise of each of these instruments is tested thoroughly when manufactured. Sometimes you may hear that “phase noise performance is an intrinsic design characteristic and doesn’t need to be checked during periodic calibration.” Certainly, modern instruments do have stable block diagrams. And yes, instruments with synthesized local oscillators connected to 10 MHz external references do not need to be checked for frequency accuracy. However, many other performance characteristics, such as phase noise, may be stable for several years, then degrade without warning and without obvious clues in instrument operation (think instrument heart disease).

Table 1 shows the phase noise specifications for the Keysight E4440A. Figure 2 shows the actual phase noise of a series of E4440A PSA units, each measured at the 100 Hz offset point. The test system in this case is one of the signal analyzer calibration stations at Keysight’s Roseville service center. Each symbol represents a different customer unit. Of course, the measurement is a combination of the reference source and the receiver performing the measurement.

Note the sudden shift of approximately 11 dB beginning in January 2015. Either many units were suddenly not performing within the historical statistical process limits, or something had happened to the reference source employed in this system. This service center has several such calibration stations, so experiments were performed with different reference signal sources and multiple measurements of the same test unit on multiple test stations. It was determined that the reference on this station was the source of the 11 dB degradation. As soon as the reference signal generator was replaced, subsequent test units were again measured within the historical process limits.1 The signal generator in question had been sitting in a rack, not moved or otherwise disturbed, prior to this 11 dB shift in performance. The root cause of the failure mechanism had not yet been determined when this article was written.
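To make the idea of historical process limits concrete, here is a minimal, hypothetical Python sketch (not Keysight’s actual process, and with made-up data) of the kind of check that flags a result falling well outside limits built from earlier measurements:

    import statistics

    def process_limits(history_dbc_hz, n_sigma=3.0):
        """Return (lower, upper) control limits from historical phase noise results."""
        mean = statistics.mean(history_dbc_hz)
        sigma = statistics.stdev(history_dbc_hz)
        return mean - n_sigma * sigma, mean + n_sigma * sigma

    history = [-92.1, -91.8, -92.4, -92.0, -91.9]    # hypothetical results in dBc/Hz
    low, high = process_limits(history)
    reading = -81.0                                  # roughly 11 dB worse than typical
    if not (low <= reading <= high):
        print("Outside historical process limits - check the reference source or the unit")

A shift like the 11 dB example above stands far outside any reasonable control limits, which is what prompted the source-substitution experiments described here.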