This procedure adjusts the 10 MHz internal frequency reference to minimize frequency error. The adjustment is generally performed after the results of the Internal Frequency Reference Performance Test indicate that it is needed. In this test, the signal from a 10 MHz frequency standard is applied to the RF input of the PSA. The instrument's internal timebase is then adjusted by programming its DAC and performing a marker count on the displayed signal.
The specification for the 10 MHz reference accuracy is ±[(Time Since Last Adjustment × 1×10⁻⁷) + (Temperature Stability) + (Achievable Initial Calibration Accuracy)]. Immediately after this adjustment, the first term is zero. At 20 °C to 30 °C, the remaining terms total ±0.8 Hz. The objective of this procedure is to bring the reference to within 0.1 Hz of 10 MHz.
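The specification above can be evaluated as a simple sum. As a sketch, the snippet below computes the worst-case reference error from the time-since-adjustment term plus the combined ±0.8 Hz of the remaining terms at 20 °C to 30 °C. The assumption that the 1×10⁻⁷ aging term accrues per year elapsed is not stated in this procedure and is labeled as such.

```python
REF_HZ = 10e6           # nominal reference frequency, 10 MHz
AGING_PER_YEAR = 1e-7   # assumption: fractional aging rate accrues per year
OTHER_TERMS_HZ = 0.8    # temperature stability + achievable initial
                        # calibration accuracy, 20-30 degC (from the spec)

def worst_case_error_hz(years_since_adjustment: float) -> float:
    """Worst-case 10 MHz reference error, in Hz, per the specification."""
    aging_hz = years_since_adjustment * AGING_PER_YEAR * REF_HZ
    return aging_hz + OTHER_TERMS_HZ

print(worst_case_error_hz(0.0))  # just adjusted: aging term is zero
print(worst_case_error_hz(1.0))  # one year later, under the assumed rate
```

Immediately after adjustment the budget is the ±0.8 Hz floor; under the assumed per-year aging rate, each elapsed year adds 1 Hz to the worst case.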
A minimum warm-up time of 24 hours is required for minimal frequency reference drift.
This adjustment saves an output file which lists the corrections that were stored in the instrument. The output file is saved at this location:
| Test Equipment | Model Number |
|---|---|
| Frequency Standard | Microsemi 5071A-C002 |
| 20 dB Fixed Attenuator | 8491A Option 020 |
| Type-N Cable | 11500C |
| Type-N (f) to BNC (m) adapter | 1250-1477 |
| Type-N (f) to 3.5 mm (f) adapter | 1250-1745 |
| 2.4 mm (f) to Type-N (f) adapter | 11903B |