Deviation from Linear Phase

Deviation from linear phase is a measure of phase distortion. The electrical delay feature of the analyzer is used to remove the linear portion of the phase shift from the measurement. This results in a high-resolution display of the non-linear portion of the phase shift (deviation from linear phase).

See also Comparing the Analyzer Delay Functions

See other Tutorials

What Is Linear Phase Shift?

Phase shift occurs because the wavelengths that occupy the electrical length of the device get shorter as the frequency of the incident signal increases. Linear phase shift occurs when the phase response of a device is directly proportional to frequency. Displayed on the analyzer, the phase-versus-frequency trace of this ideal linear phase shift is a straight line whose slope is proportional to the electrical length of the device. Linear phase shift is necessary (along with a flat magnitude response) for distortionless transmission of signals.
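
As a quick numerical illustration of that slope: the phase of an ideal lossless line is phi = -360 * f * tau degrees, where tau is the propagation delay set by the electrical length. A minimal sketch in Python (the 0.3 m electrical length is an arbitrary example value, not from the original text):

```python
import numpy as np

C = 299_792_458.0  # speed of light in vacuum (m/s)

def ideal_linear_phase(freq_hz, electrical_length_m):
    """Phase (degrees) of an ideal lossless line: a straight line
    whose slope is proportional to the electrical length."""
    delay_s = electrical_length_m / C       # propagation delay
    return -360.0 * freq_hz * delay_s       # linear in frequency

freq = np.linspace(1e9, 3e9, 5)             # 1 GHz to 3 GHz
phase = ideal_linear_phase(freq, 0.3)       # 0.3 m example length
print(np.diff(phase) / np.diff(freq))       # constant slope: -360 * delay
```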

What Is Deviation from Linear Phase?

In practice, many electrical or electronic devices delay some frequencies more than others, creating non-linear phase shift (distortion in signals consisting of multiple frequency components). Measuring deviation from linear phase is a way to quantify this non-linear phase shift.

Because only the deviation from linear phase causes phase distortion, it is desirable to remove the linear portion of the phase response from the measurement. This can be done with the electrical delay feature of the analyzer, which mathematically cancels the electrical length of the device under test. What remains is the deviation from linear phase, or phase distortion.
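
A minimal offline sketch of the same idea, assuming you have an unwrapped phase trace (for example, exported from the analyzer): fitting and subtracting the best straight line is mathematically equivalent to dialing in the electrical delay, and what is left is the deviation from linear phase.

```python
import numpy as np

def deviation_from_linear_phase(freq_hz, phase_deg):
    """Subtract the best-fit linear phase (the 'electrical delay'
    term plus a constant offset) from an unwrapped phase trace."""
    slope, offset = np.polyfit(freq_hz, phase_deg, 1)
    linear_part = slope * freq_hz + offset
    delay_s = -slope / 360.0            # equivalent electrical delay
    return phase_deg - linear_part, delay_s

# Example: a 1 ns line with a small non-linear ripple added
freq = np.linspace(1e9, 2e9, 201)
phase = -360.0 * freq * 1e-9 + 2.0 * np.sin(2 * np.pi * freq / 0.5e9)
deviation, delay = deviation_from_linear_phase(freq, phase)
print(f"removed delay: {delay*1e9:.2f} ns, "
      f"peak-to-peak deviation: {np.ptp(deviation):.2f} deg")
```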

Why Measure Deviation from Linear Phase?

The deviation from linear phase measurement isolates the phase distortion of a device: with the linear portion of the phase shift removed, only the distortion-causing non-linearity remains, and it can be viewed at high resolution and quantified directly with markers.

Using Electrical Delay

The electrical delay feature is the electronic version of the mechanical "line stretcher" of earlier analyzers. It adds or removes phase shift in direct proportion to frequency, mathematically canceling the electrical length of the device under test so that only the deviation from linear phase remains on the display.

Learn how to set Electrical Delay.
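
If you control the analyzer remotely, electrical delay can also be set programmatically. A hedged sketch using PyVISA: the VISA address is a placeholder, and the SCPI command follows Keysight PNA-style syntax, which is an assumption here, so confirm it (and any required measurement selection) against your analyzer's programming guide.

```python
import pyvisa

rm = pyvisa.ResourceManager()
# Placeholder VISA address -- substitute your analyzer's address.
vna = rm.open_resource("TCPIP0::192.168.0.1::inst0::INSTR")

# PNA-style SCPI (an assumption -- verify the exact command and
# measurement selection in your model's programming guide):
# apply 1.2 ns of electrical delay on channel 1.
vna.write("CALC1:CORR:EDEL:TIME 1.2E-9")
print(vna.query("CALC1:CORR:EDEL:TIME?"))   # read the delay back
vna.close()
```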

Accuracy Considerations

The frequency response of the test setup is the dominant source of error in a deviation from linear phase measurement. To reduce this error, perform a 2-port measurement calibration.

How to Measure Deviation from Linear Phase

  1. Preset the analyzer.

  2. If your device under test is an amplifier, it may be necessary to adjust the analyzer's source power so that the amplifier operates in its linear (uncompressed) region.

  3. Connect the device under test as shown in the following graphic.

  4. Select an S21 measurement.

  5. Select the settings for your device under test, such as frequency range, number of points, and IF bandwidth.

  6. Remove the device and perform a calibration.

  7. Reconnect the device.

  8. Scale the displayed measurement for optimum viewing.

  9. Create a marker in the middle of the trace.

  10. Press Marker > Marker Functions > Marker → Delay to invoke the Marker to Electrical Delay function. This flattens the phase trace.

  11. If desired, on the Scale menu, click Electrical Delay to fine-tune the flatness of the phase trace.

  12. Use the markers to measure the maximum peak-to-peak deviation from linear phase (for a post-processing equivalent, see the sketch after this procedure).

  13. Print the data or save it to disk.
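
If you would rather post-process exported trace data than read markers on screen, the peak-to-peak deviation is simply the span of the flattened phase trace. A sketch under the assumption that the trace was saved as a two-column CSV; the file name is a placeholder.

```python
import numpy as np

# Placeholder file name: two columns (frequency in Hz, phase in degrees)
# exported from the analyzer after the measurement above.
freq, phase = np.loadtxt("s21_phase.csv", delimiter=",", unpack=True)

# Unwrap the phase, then remove the best-fit linear (electrical delay) part.
phase = np.rad2deg(np.unwrap(np.deg2rad(phase)))
slope, offset = np.polyfit(freq, phase, 1)
deviation = phase - (slope * freq + offset)

print(f"peak-to-peak deviation from linear phase: {np.ptp(deviation):.3f} deg")
```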