An adaptive technique for correcting timing errors in Time-Interleaved A/D Converter

Date
2001-01-01
Authors
Mande, Geetanjali
Abstract

In many communication systems, high-speed (≥75 MS/s) and high-resolution (≥12-bit) A/D converters with low distortion and high dynamic range (e.g., SFDR ≥75-90 dB) are usually required. Time interleaving is a technique that can be used to increase the sampling rate of an A/D converter. However, mismatches among the parallel channels limit the performance of time-interleaved architectures, namely channel parameter mismatches (e.g., gain and offset) and channel sampling timing errors. These non-ideal effects introduce unwanted tones and noise and hence reduce the Spurious-Free Dynamic Range (SFDR) as well as the Signal-to-Noise Ratio (SNR). In this thesis, an adaptive technique is proposed to minimize such effects. It is based on the Least Mean Square (LMS) algorithm, which updates a control voltage so as to correct the clock-skew errors. A significant improvement in SFDR (over 50-85 dB) was achieved with a large reduction in digital hardware complexity, at the expense of a slight increase in analog circuit complexity. In some cases, these A/D converters are used for sub-sampling operations in which the input frequency can be much higher than the Nyquist frequency. The most significant advantage of the proposed technique is that the improvement in SFDR was also obtained for sub-sampling operations. Finally, the proposed technique also worked well for any zero-mean input signal, along with superimposed periodic signals.
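As a concrete illustration of the adaptive idea, the following is a minimal behavioral sketch in Python of an LMS-based clock-skew correction loop for a two-channel time-interleaved ADC. It is not the analog circuit technique of the thesis: the sinusoidal test input, the difference-energy error signal, the block averaging, the 5 ps skew, and the step size mu are assumptions made for this demo, and the variable d_hat stands in for the control voltage that the LMS update adjusts.

import math

# Two-channel time-interleaved ADC model (behavioral; all parameters assumed)
fs = 150e6            # aggregate sample rate: two channels at 75 MS/s each
Ts = 1.0 / fs         # aggregate sample period
f_in = 10e6           # zero-mean sinusoidal test input (assumed)
w = 2 * math.pi * f_in
skew = 5e-12          # true channel-1 clock skew: 5 ps (assumed)
mu = 2e-9             # LMS step size (assumed; small enough for stability)
block = 150           # error samples averaged per control-voltage update

def x(t):
    # Ideal continuous-time input seen by both sample-and-hold circuits
    return math.sin(w * t)

d_hat = 0.0           # timing correction; stands in for the control voltage
for b in range(300):
    e_acc = 0.0
    for k in range(block):
        n = b * block + k
        t0a = (2 * n) * Ts        # channel-0 sample before the channel-1 sample
        t1 = (2 * n + 1) * Ts     # nominal channel-1 sampling instant
        t0b = (2 * n + 2) * Ts    # channel-0 sample after the channel-1 sample

        x0a, x0b = x(t0a), x(t0b)
        x1 = x(t1 + skew - d_hat) # channel 1 samples late by the residual skew

        # Error signal: energy imbalance between the backward and forward
        # first differences around the channel-1 sample; its average is
        # zero only when the residual skew is zero.
        e_acc += (x1 - x0a) ** 2 - (x0b - x1) ** 2

    d_hat += mu * e_acc / block   # LMS update of the timing correction

print(f"true skew: {skew:.3e} s, estimated correction: {d_hat:.3e} s")

Driving the averaged error to zero drives (skew - d_hat) to zero, so d_hat converges to the true skew; in the thesis, the analogous LMS update steers an analog control voltage rather than a software variable.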

Type
dissertation
Copyright
2001