A single-ended DC voltmeter features a sample-and-difference front-end circuit.
Question
A single-ended DC voltmeter features a sample-and-difference front-end circuit. We wish to use this meter to measure the differential offset voltage of a DUT's output buffer. Each of the two outputs is specified to be within a range of 1.35 V ± 25 mV, and the differential offset is specified in the device data sheet to be ±5 mV. The meter input can be set to any of the following ranges: ±10 V, ±1 V, ±100 mV, and ±10 mV, and its readings are digitized by a 12-bit ADC, so the quantization error scales with the programmed range. Compare the accuracy achieved using two simple DC measurements with the accuracy achieved using the sample-and-difference circuit. Assume no errors due to nonrepeatability.

Explanation / Answer
The simplest way to measure the offset using a single-ended DC voltmeter is to connect the meter to the OUTP output and measure its voltage, then connect the meter to the OUTN output, measure its voltage, and subtract the second reading from the first. Using this approach, we have to set the meter's input range to ±10 V to accommodate the 1.35-V DUT output signals. Each measurement may therefore have a quantization error of as much as ±½ × (20 V / (2¹² − 1)) = ±2.44 mV. The total error might be as high as ±4.88 mV, assuming that the quantization error from the first measurement is positive while the quantization error from the second measurement is negative. Since the specification limit is ±5 mV, the measurement error consumes nearly the entire specification, so this is an unacceptable test method.

Using the sample-and-difference circuitry, we could instead range the meter input to the worst-case difference between the two outputs, which is 5 mV for a good device. The lowest meter range that will accommodate a 5-mV signal is ±10 mV. However, we also need to be able to collect readings from bad devices for purposes of characterization. Therefore, we will choose a range of ±100 mV, giving us a compromise between accuracy and characterization flexibility.

During the first phase of the sample-and-difference measurement, the voltage at the OUTN pin is sampled onto a holding capacitor internal to the meter. The meter is then connected to the OUTP pin, and the second phase of the measurement amplifies the difference between the OUTP voltage and the sampled OUTN voltage. Since the meter is set to a range of ±100 mV, a 100-mV difference between OUTP and OUTN produces a full-scale 10-V input to the meter's ADC. This gain reduces the effect of the meter's quantization error as referred back to the DUT outputs: the maximum error is ±½ × (200 mV / (2¹² − 1)) = ±24.4 μV. Again, the worst-case error is twice this amount, or ±48.8 μV, which is well within the requirements of our measurement.
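As a quick check on the arithmetic above, here is a minimal Python sketch (not part of the original solution) that reproduces both worst-case quantization-error figures. The 12-bit resolution and the range values are taken from the example; the function name half_lsb_error and the assumption of 2**bits − 1 quantization steps (matching the formulas above) are ours.

    def half_lsb_error(range_v, bits=12):
        # Worst-case quantization error (± half an LSB) for a bipolar
        # range of ±range_v: the span 2*range_v is divided into
        # 2**bits - 1 steps, as in the worked formulas above.
        return 0.5 * (2 * range_v) / (2 ** bits - 1)

    # Method 1: two single-ended readings on the ±10 V range.
    e_single = half_lsb_error(10.0)   # ~2.44e-3 V per reading
    print(f"two single-ended readings: +/-{2 * e_single * 1e3:.2f} mV worst case")

    # Method 2: one sample-and-difference reading on the ±100 mV range.
    e_diff = half_lsb_error(0.100)    # ~24.4e-6 V per reading
    print(f"sample-and-difference:     +/-{2 * e_diff * 1e6:.1f} uV worst case")

Running this prints ±4.88 mV and ±48.8 μV, confirming that the sample-and-difference approach improves the worst-case quantization error by a factor of 100, the ratio of the two programmed ranges (±10 V versus ±100 mV).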