How do you calibrate your Test Equipment Calibrator?

We use a calibrated HP 34401A multimeter to measure the AC and DC voltages and currents. The frequency outputs are measured with an HP 53131A frequency counter using an external 10 MHz time base provided by a Leo Bodnar GPS receiver locked to six or more satellites. The resistors and capacitors are measured with a calibrated Stanford Research Systems SR720 LCR meter.

Why do I measure about two volts of DC on the 1.000 Volt AC output?

Since the unit has only one power supply, the AC is created by operational amplifiers that run between ground and the positive supply rail. We bias the op-amps two volts above ground to ensure that the output is a clean AC signal. We could have AC-coupled the output to ground with a capacitor and a resistor, but that would raise the output impedance, and your voltmeter might load down the AC voltage output. Switch your voltmeter to AC and it will read the 1.000 volt AC output correctly.
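To see why the AC reading is unaffected by the bias, here is a small illustrative Python sketch (the sampling setup and names are ours, not anything from the instrument): a 1.000 V RMS, 1000 Hz sine riding on a 2 V DC offset reads about 2 V on a DC-coupled meter, but exactly 1.000 V on an AC-coupled RMS meter, which removes the DC component first.

```python
import math

# Hypothetical model of the output: 1.000 V RMS sine at 1000 Hz
# sitting on the 2 V DC bias described above.
def sample(t):
    dc_offset = 2.0  # volts of DC bias from the single-supply op-amp
    return dc_offset + math.sqrt(2) * 1.0 * math.sin(2 * math.pi * 1000 * t)

n = 100_000
period = 1.0 / 1000
samples = [sample(i * period / n) for i in range(n)]

# A DC-coupled meter averages the waveform and sees the 2 V bias:
dc_reading = sum(samples) / n

# An AC-coupled (true-RMS) meter strips the DC, then takes the RMS:
ac_rms = math.sqrt(sum((s - dc_reading) ** 2 for s in samples) / n)

print(round(dc_reading, 3))  # 2.0
print(round(ac_rms, 3))      # 1.0
```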

Are your AC voltage and current outputs true sine waves?

Yes. We take the 1000 Hz square wave output and convert it to a pure sine wave using a special filter. The Total Harmonic Distortion plus Noise (THD+N) is checked on each unit using an HP 4333A distortion analyzer. The distortion needs to be better than -51 dB (or about 0.3%) on each unit.
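The dB and percent figures are two ways of writing the same amplitude ratio; a quick Python check (helper names are ours) shows that -51 dB is roughly 0.28%, which rounds to the 0.3% quoted above:

```python
import math

# Convert a THD+N figure between dB and percent (20*log10 amplitude ratio).
def db_to_percent(db):
    return 10 ** (db / 20) * 100

def percent_to_db(pct):
    return 20 * math.log10(pct / 100)

print(round(db_to_percent(-51), 2))   # 0.28 (percent), i.e. roughly 0.3%
print(round(percent_to_db(0.3), 1))   # -50.5 (dB)
```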

Will the resistor or capacitor values change with temperature?

We are using NP0 (C0G) capacitors, and they should drift less than 30 ppm per degree Celsius. The resistors should drift less than 10 ppm per degree Celsius. The room temperature at which they were tested is listed on the sticker on the bottom of the unit.
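To put those ppm figures in perspective, here is a worked example in Python (the 10 kΩ value and 5 °C swing are illustrative, not from the spec sheet): a 10 ppm/°C resistor used 5 °C away from its test temperature shifts by at most 0.5 Ω out of 10 kΩ.

```python
# Worst-case value change for a component with a given temperature coefficient.
def drift(value, ppm_per_c, delta_t_c):
    return value * ppm_per_c * 1e-6 * delta_t_c

nominal = 10_000.0                 # ohms (illustrative value)
change = drift(nominal, 10, 5)     # 10 ppm/C resistor, 5 C above test temperature
print(change)                      # 0.5 ohm worst-case shift
```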