A digital-circuit-based evolutionary-computation algorithm for time-interleaved ADC background calibration
Format: Conference Proceeding
Language: English
Summary: Evolutionary computation, learning theory, neural networks, and fuzzy logic are just a few of the disciplines known collectively as computational intelligence. Computational-intelligence techniques are widely used in today's science and technology: they exploit the storage capacity and speed of computers to address complex mathematical problems that are difficult to solve by conventional mathematical reasoning. In this paper, we introduce the design of a complex digital system implementing an evolutionary-computation algorithm to calibrate the mismatches affecting the performance of a time-interleaved analog-to-digital converter (TIADC). An error function (EF) is devised by modeling the three main issues limiting TIADC performance: gain mismatches, offset mismatches, and timing skews. The digital system is implemented on a field-programmable gate array (FPGA), and its digital logic and functionality are verified by matching its simulation results against a Verilog-A behavioral model of the complete TIADC.
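The calibration idea described in the abstract — devise an error function from a parametric model of the gain, offset, and timing-skew mismatches, then minimize it with an evolutionary search — can be sketched in software. The sketch below is an illustrative assumption, not the paper's actual algorithm or FPGA design: it uses a two-channel TIADC, a single-tone stimulus, made-up mismatch values, and a simple (1+1) evolution strategy, none of which are specified in this record.

```python
import math
import random

M = 2                        # two interleaved sub-ADCs (assumed)
TRUE = (1.03, 0.02, 0.015)   # hypothetical gain, offset, skew of sub-ADC 1
FIN = 1.0 / 37.0             # normalized input-tone frequency (assumed)
N = 512                      # number of observed samples

def measured(n):
    """Observed TIADC output: sub-ADC 0 is the reference, sub-ADC 1
    suffers the hypothetical gain/offset/skew mismatches above."""
    if n % M == 0:
        return math.sin(2 * math.pi * FIN * n)
    g, o, s = TRUE
    return g * math.sin(2 * math.pi * FIN * (n + s)) + o

def model(n, p):
    """Candidate mismatch model for sub-ADC 1 with parameters p = (g, o, s)."""
    if n % M == 0:
        return math.sin(2 * math.pi * FIN * n)
    g, o, s = p
    return g * math.sin(2 * math.pi * FIN * (n + s)) + o

def ef(p):
    """Error function: squared mismatch between model and observations."""
    return sum((model(n, p) - measured(n)) ** 2 for n in range(N))

def evolve(gens=2000, seed=1):
    """(1+1) evolution strategy: mutate the parameter vector with Gaussian
    noise, keep the mutant only if it lowers the error function, and adapt
    the mutation step size on success/failure."""
    rng = random.Random(seed)
    best = (1.0, 0.0, 0.0)          # start from the "no mismatch" hypothesis
    best_ef = ef(best)
    step = 0.05
    for _ in range(gens):
        cand = tuple(x + rng.gauss(0, step) for x in best)
        c = ef(cand)
        if c < best_ef:
            best, best_ef = cand, c
            step *= 1.1             # reward success with a larger step
        else:
            step *= 0.995           # shrink the step after a failure
    return best, best_ef

if __name__ == "__main__":
    est, res = evolve()
    print("estimated (gain, offset, skew):", est, "residual EF:", res)
```

Once the mismatch parameters are estimated, a calibration stage would invert them (divide out the gain, subtract the offset, and compensate the skew) on the mismatched channel; in the paper this logic runs as digital hardware on the FPGA rather than in software.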
ISSN: 2164-1706
DOI: 10.1109/SOCC.2016.7905422