Last updated: 2010-03-31
Various calibration routines must be run periodically to ensure measurement accuracy and source accuracy, and to allow the most robust demodulation of the device under test's signal. This document describes the calibrations available in the test set and when each should be performed.
The First IQ Modulator calibration sends a signal from the Baseband Generator 1 module to the RF Source 1 module to optimize the phase accuracy of RF Source 1 as a function of frequency. The optimization data is then stored in the test set and accessed whenever the RF Source frequency is set. The Second IQ Modulator calibration performs a similar procedure for RF Source 2 (RF Source 2 is only present if the test set includes option 002: RF Source 2).
The IQ calibrations ensure that the test set accurately generates IQ-modulated signals. If the test set's IQ modulators have not been properly calibrated, the test set's output power (when producing an IQ-modulated signal) will be inaccurate, which may result in degraded receiver test results.
The IQ calibrations do not need to be run frequently. It is recommended that the calibration interval be 1 year.
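The recommended interval can be tracked programmatically. The sketch below is a hypothetical helper (the function name and the "track it yourself" approach are assumptions, not a test set feature) that flags when the 1-year interval has elapsed:

```python
from datetime import date, timedelta

def calibration_due(last_cal: date, today: date, interval_days: int = 365) -> bool:
    """Return True when the recommended calibration interval has elapsed.

    interval_days defaults to the 1-year interval recommended for the
    IQ, Burst Mod Offset 1, and Thermal Power Null calibrations.
    """
    return today - last_cal >= timedelta(days=interval_days)

# An IQ calibration last run on 2009-03-15 is overdue by 2010-03-31:
print(calibration_due(date(2009, 3, 15), date(2010, 3, 31)))  # True
```

The same helper covers the monthly routines described later by passing `interval_days=30`.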
To initiate the IQ calibrations from the front panel, press SYSTEM CONFIG, press the right More (1 of 2) key, select Service (F7), then select Cal. first IQ Modulator or Cal. second IQ Modulator.
Each IQ calibration takes 5 to 6 minutes to execute.
This calibration is only required when using the test set to generate an AM or DSB-SC test signal. These test signals are used for calibrating Zero Intermediate Frequency (Zero IF) devices which utilize Qualcomm's radioOne™ architecture.
You do not have to perform the Burst Mod Offset 1 calibration for each application/format in your test set; performing it from one application/format provides calibration data to all relevant applications and formats.
The Burst Mod Offset 1 calibration does not need to be run frequently. It is recommended that the calibration interval be 1 year.
The Burst Mod Offset 1 calibration calibrates the noise and signal level of RF Source 1 (as a function of frequency) to most accurately produce the AM and DSB-SC test signals required for Zero IF device calibration. The calibration is performed by generating the modulated test signals and adjusting RF Source 1 to optimize gain and minimize the noise floor. The optimization data is then stored in the test set and accessed whenever the RF Source frequency is set.
To initiate the Burst Mod Offset 1 calibration from the front panel, press SYSTEM CONFIG, press the right More (1 of 2) key, select Service (F7), then select Cal. Burst Mod Offset 1.
The Burst Mod Offset 1 calibration takes less than 1 minute to execute.
The Thermal Power Detector is calibrated at the factory and temperature controlled for stability. However, it is still possible for the auto-null circuit within the detector to drift over time. The Thermal Power Null Adjustment re-centers the auto-null circuit to provide optimum accuracy for the affected measurements.
You only need to perform the Thermal Power Null Adjustment once to re-center the auto-null circuit (you do not have to perform it from each application/format in your test set).
The Thermal Power Null Adjustment does not need to be run frequently. It is recommended that the calibration interval be 1 year.
To initiate the Thermal Power Null Adjustment from the front panel, press SYSTEM CONFIG, press the right More (2 of 2) key, then select Thermal Power Null Adjust (F9).
The Thermal Power Null Adjustment takes between 30 seconds and 3 minutes.
The Spectrum Monitor Calibration should be run monthly, or whenever the test set's operating environment changes.
The Spectrum Monitor Calibration executes the same routine as the Calibrate Measurements routine.
To initiate the Spectrum Monitor Calibration from the front panel, press the Instrument selection key, select Spectrum Monitor, select Trigger Setup (F4), then select Calibrate Measurement (F11).
The Spectrum Monitor Calibration takes between 45 seconds and 3 minutes.
The Calibrate Measurements routine should be run monthly, or whenever the test set's operating environment changes. See Recommended Calibration Intervals.
The Calibrate Measurements routine calibrates the signal paths through the Measurement Downconverter (MDC) and Demod Downconverter (DDC) using both a CW signal and a modulated signal. The calibration is performed by comparing signals measured through the MDC and DDC against measurements made with the Fast Power Detector (for CW signals) and the Thermal Power Detector (for modulated signals). Both detectors are calibrated at the factory and temperature controlled for stability. During the Calibrate Measurements routine, the internal temperature of the test set is measured and stored with the calibration data. If the internal temperature of the test set then drifts by more than ±10 °C from that stored value, measurements utilizing this calibration will return integrity indicator 19 (Uncalibrated Due to Temperature).
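The temperature-drift rule above can be expressed as a small check. This is an illustrative sketch, not test set firmware: the ±10 °C threshold and indicator value 19 come from the text, while the function name and the use of 0 to mean "no integrity problem" are assumptions.

```python
UNCALIBRATED_DUE_TO_TEMPERATURE = 19  # integrity indicator named in the text

def integrity_indicator(temp_at_cal_c: float, temp_now_c: float,
                        max_drift_c: float = 10.0) -> int:
    """Return indicator 19 when drift exceeds +/-10 C since calibration.

    Returns 0 (assumed here to mean "normal") when the current internal
    temperature is within the allowed band around the stored value.
    """
    if abs(temp_now_c - temp_at_cal_c) > max_drift_c:
        return UNCALIBRATED_DUE_TO_TEMPERATURE
    return 0

# Calibrated at 35 C; an 11 C rise trips the indicator, a 9 C rise does not.
print(integrity_indicator(35.0, 46.0))  # 19
print(integrity_indicator(35.0, 44.0))  # 0
```

Note that drift of exactly 10 °C is not "more than ±10 °C", so the indicator is not set at the boundary.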
The Calibrate Measurements routine provides calibration data for the following measurements:
To initiate the Calibrate Measurements routine from the front panel, initiate any of the measurements listed above and select Calibrate Measurements (F4).
The Calibrate Measurements routine takes approximately 3 minutes to execute.
You can use the process shown in this chart to perform the complete set of user calibrations for a test set with Option 003.
GPIB Commands: CALibration
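The calibrations can also be driven over GPIB via the CALibration subsystem. The sketch below only illustrates the pattern; the command mnemonic `CALibration:IQ?` and the "0 means pass" response convention are assumptions, so consult the test set's GPIB command reference for the actual syntax. The I/O callable is injected so the routine can be exercised without hardware; with a real instrument you would pass PyVISA's `instrument.query` (after raising the session timeout, since calibrations take minutes).

```python
def run_calibration(query, command="CALibration:IQ?"):
    """Send a calibration query and report pass/fail.

    query   -- callable taking a command string and returning the
               instrument's response (e.g. a PyVISA instrument.query).
    command -- hypothetical CALibration-subsystem mnemonic (assumption).
    """
    response = query(command)
    # Assumed convention for illustration: "0" indicates a passing result.
    return response.strip() == "0"

# Exercise without hardware using a stand-in for the instrument:
fake_query = lambda cmd: "0"
print(run_calibration(fake_query))  # True
```

Injecting the `query` callable keeps the pass/fail logic testable on a bench machine with no instrument attached.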