Michael C. W. Coln

Self-calibration in approximately 10,000 conversions is demonstrated in a 16-bit, 1-MS/s algorithmic analog-to-digital converter (ADC). Continuous digital background calibration is enabled by the introduction of a "split ADC" architecture, in which the die area of a single ADC design is split into two independent converters, each converting the same input signal …
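As a rough illustration of the algorithmic (cyclic) conversion style this abstract refers to, the sketch below models an ideal cyclic ADC: each cycle takes one bit decision and amplifies the residue by two. The gain-of-exactly-2 model, the input range, and all names are assumptions for illustration only; the snippet does not describe the converter's actual stage design.

```python
# Minimal sketch of an ideal algorithmic (cyclic) ADC conversion.
# Assumptions: ideal residue gain of exactly 2, full-scale input range
# [-1, 1), simple binary decisions per cycle. Names are illustrative.

def cyclic_adc_convert(vin: float, n_bits: int = 16) -> int:
    """Convert vin in [-1, 1) to an offset-binary n_bits code."""
    residue = vin
    code = 0
    for _ in range(n_bits):
        bit = 1 if residue >= 0.0 else 0
        code = (code << 1) | bit
        # Subtract the decided level, then amplify by 2 for the next cycle.
        residue = 2.0 * (residue - (0.5 if bit else -0.5))
    return code

print(cyclic_adc_convert(0.25))  # 40960 = (0.25 + 1)/2 * 2**16
```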
The trend in submicron CMOS ADC design is toward all-digital self-calibrating architectures. Figure 15.1.1, a plot of conversions required for calibration versus resolution for previously reported background self-calibrating ADCs [1-5], shows that calibration of an N-bit converter by statistical means requires ≈ 2^(2N) conversions. Although this is adequate …
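To put the ≈ 2^(2N) figure in perspective, the quick calculation below uses only the numbers quoted in these abstracts (N = 16 bits, a 1-MS/s rate, and roughly 10,000 conversions for the split-ADC approach) to contrast the two calibration times.

```python
# Illustrative comparison: conversions needed for statistical background
# calibration (~2^(2N), per the abstract) versus the ~10^4 conversions
# reported for the split-ADC approach.

for n in (12, 14, 16):
    statistical = 2 ** (2 * n)
    print(f"N = {n:2d} bits: ~2^{2 * n} = {statistical:.2e} conversions")

# At 1 MS/s, 2^32 conversions take ~4295 s (over an hour),
# while 10^4 conversions take only 10 ms.
print(f"2^32 conversions at 1 MS/s: {2**32 / 1e6:.0f} s")
print(f"1e4 conversions at 1 MS/s:  {1e4 / 1e6 * 1e3:.0f} ms")
```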
— The " Split ADC " architecture enables continuous digital background calibration by splitting the die area of a single ADC design into two independent halves, each converting the same input signal. The two independent outputs are averaged to produce the ADC output code. The difference of the two outputs provides information for a background calibration(More)
—The " Split ADC " architecture enables fully digital calibration and correction of nonlinearity errors due to capacitor mismatch in a Successive Approximation (SAR) ADC. The die area of a single ADC design is split into two independent halves, each converting the same input signal. Total area and power is unchanged, resulting in minimal increase in analog(More)
In the above paper [1], the authors present a "split ADC" architecture that is suitable for fast, efficient background digital calibration. The use of two parallel ADCs (hence the name "split ADC") allows the input signal to be canceled by subtracting the two nominally equal output codes, thereby providing fast extraction of the calibration information.
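The input-cancellation idea can be seen in a toy model, assuming each half-converter has only a small gain and offset error: subtracting the two nominally equal codes removes the large common signal term, leaving a small residue that depends only on the mismatch between the halves. This is purely illustrative; the paper's actual error model is richer.

```python
# Toy demonstration of input cancellation in the split ADC. With gain
# errors g_a, g_b and offsets o_a, o_b:
#   code_a - code_b = (g_a - g_b) * vin + (o_a - o_b)
# so the common term vin cancels and only mismatch information remains.

def half_adc(vin: float, gain: float, offset: float) -> float:
    return gain * vin + offset  # idealized linear half-converter

for vin in (0.1, 0.5, 0.9):
    a = half_adc(vin, gain=1.001, offset=0.002)
    b = half_adc(vin, gain=0.999, offset=-0.001)
    print(f"vin={vin}: avg={(a + b) / 2:.4f}, diff={a - b:.4f}")
# diff = 0.002*vin + 0.003 stays tiny even as the signal swings, so it
# can drive a background calibration loop without knowing the input.
```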