
IEICE TRANSACTIONS on Fundamentals

Channel Linearity Mismatch Effects in Time-Interleaved ADC Systems

Naoki KUROSAWA, Haruo KOBAYASHI, Kensuke KOBAYASHI


Summary:

A time-interleaved ADC system is an effective way to implement a high-sampling-rate ADC with relatively slow circuits. In such a system, several channel ADCs operate at interleaved sampling times so that together they function as a single ADC with a much higher sampling rate. Mismatches among the channel ADCs degrade the SNR and SFDR of the ADC system as a whole; the effects of offset, gain, and bandwidth mismatches, as well as timing skew of the clocks distributed to the channels, have been well investigated. This paper investigates channel linearity mismatch effects in the time-interleaved ADC system, which are very important in practice but have not been investigated previously. We consider two cases: differential nonlinearity (DNL) mismatch and integral nonlinearity (INL) mismatch. Our numerical simulations reveal distinct features of such mismatches, especially in the frequency domain. The derived results can be useful for developing calibration algorithms to compensate for these channel mismatch effects.
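The abstract describes how mismatches among the channel ADCs create spurious tones in the output spectrum. As a rough illustration of this mechanism (a minimal sketch, not the paper's simulation: the channel count, mismatch values, and input frequency below are all hypothetical), the following Python snippet models a 4-channel time-interleaved ADC with small per-channel offset and gain errors and inspects the resulting spectrum:

import numpy as np

M = 4                      # number of interleaved channel ADCs (hypothetical)
N = 4096                   # total output samples
fs = 1.0                   # normalized overall sampling rate
fin = 101 / N              # input frequency, chosen on a coherent FFT bin

n = np.arange(N)
x = np.sin(2 * np.pi * fin * n / fs)   # ideal sampled input

# Hypothetical per-channel mismatches: small offset and gain errors.
offset = np.array([0.00, 0.01, -0.005, 0.002])
gain = np.array([1.000, 1.005, 0.997, 1.002])

# Sample n is converted by channel (n mod M), as in a time-interleaved ADC.
ch = n % M
y = gain[ch] * x + offset[ch]

# Channel mismatch makes the system periodically time-varying with period M,
# producing spurs at k*fs/M (offset) and at fin +/- k*fs/M (gain),
# which degrade the SNR and SFDR of the overall ADC.
spec = 20 * np.log10(np.abs(np.fft.rfft(y * np.hanning(N))) + 1e-12)
spec -= spec.max()          # normalize to the input tone

for k in range(1, M // 2 + 1):
    bin_k = int(round(k * N / M))  # offset-spur location near k*fs/M
    print(f"spur level near {k}/{M} * fs: {spec[bin_k - 2:bin_k + 3].max():.1f} dBc")

In this toy model, offset mismatch appears as input-independent tones at k·fs/M, while gain mismatch creates images at fin ± k·fs/M; the paper extends this kind of mismatch analysis to DNL and INL differences among the channels.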

Publication: IEICE TRANSACTIONS on Fundamentals, Vol.E85-A, No.4, pp.749-756
Publication Date: 2002/04/01
Type of Manuscript: Special Section PAPER (Special Section of Selected Papers from the 14th Workshop on Circuits and Systems in Karuizawa)
