Channel linearity mismatch effects in time-interleaved ADC systems
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Summary: A time-interleaved ADC system is an effective way to implement a high-sampling-rate ADC with relatively slow circuits. In such a system, several channel ADCs operate at interleaved sampling times as if they were effectively a single ADC operating at a much higher sampling rate. However, mismatches among the channel ADCs degrade the S/N of the ADC system as a whole, and the effects of offset, gain and bandwidth mismatches, as well as timing skew of the clocks distributed to the channels, have been well investigated. This paper investigates the channel linearity mismatch effects in the time-interleaved ADC system, which are of great practical importance but have not previously been investigated. We consider two cases: differential nonlinearity mismatch and integral nonlinearity mismatch. Our numerical simulations show their distinct features, especially in the frequency domain. The derived results can be useful for calibration algorithms that compensate for the channel mismatch effects.
DOI: 10.1109/ISCAS.2001.921882
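The mismatch mechanisms described in the summary can be reproduced qualitatively with a short numerical simulation. The sketch below is not the paper's simulation setup; it is a minimal NumPy illustration assuming a 4-channel interleaved ADC, a coherently sampled sine input, and small per-channel offset, gain and cubic ("INL-like") errors whose magnitudes are arbitrary illustrative values.

```python
import numpy as np

# Minimal sketch (not the paper's setup): a 4-channel time-interleaved ADC
# with hypothetical per-channel offset, gain and cubic nonlinearity errors,
# driven by a coherently sampled sine wave.

M = 4                  # number of interleaved channel ADCs
N = 4096               # number of output samples (FFT length)
k = 401                # input tone bin (coprime with N -> coherent sampling)

n = np.arange(N)
x = np.sin(2 * np.pi * k * n / N)      # ideal sampled input, tone at bin k

# Hypothetical per-channel static errors (arbitrary small values).
offset = np.array([0.0,  1.0e-3, -0.5e-3,  0.7e-3])
gain   = np.array([1.0,  1.001,   0.999,   1.0005])
cubic  = np.array([0.0,  2.0e-3, -1.0e-3,  1.5e-3])   # 3rd-order term

# Sample n is converted by channel (n mod M); apply that channel's
# static transfer characteristic.
ch = n % M
y = offset[ch] + gain[ch] * x + cubic[ch] * x ** 3

# Spectrum of the interleaved output: offset mismatch produces lines at
# m*fs/M, gain mismatch at fin +/- m*fs/M, and the nonlinearity mismatch
# additionally spreads the distortion harmonics to their interleaving images.
Y = np.abs(np.fft.rfft(y))
Y_db = 20.0 * np.log10(Y / Y.max() + 1e-20)

# List the strongest spectral lines (bin, normalized frequency, level).
for b in sorted(np.argsort(Y_db)[-12:]):
    print(f"bin {b:5d}   f = {b / N:.4f} * fs   {Y_db[b]:7.1f} dBc")
```

Coherent sampling (an integer tone bin k over N samples) is used so that no window is needed and each mismatch spur lands on an exact FFT bin, which makes the distinct frequency-domain signatures of the different mismatch types easy to pick out from the printed list.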