Acceleration of Gradient-Based Algorithms for Array Antenna Synthesis With Far-Field or Near-Field Constraints
Published in: IEEE Transactions on Antennas and Propagation, Oct. 2018, Vol. 66, No. 10, pp. 5239-5248
Main Authors: , , , ,
Format: Article
Language: English
Summary: This paper presents a technique for accelerating gradient-based algorithms that use finite differences to compute the gradient in the optimization of array antennas. It is based on differential contributions and exploits the fact that, when an array is optimized, each element is analyzed independently of the rest. Thus the computation of the gradient of the cost function, typically the most time-consuming operation of the algorithm, can be accelerated. A time-cost study is presented, and the technique is implemented, as an example, in the generalized intersection approach algorithm for array optimization in the near and far fields. Several syntheses are performed to assess the improvement offered by this technique. In the far field, the technique is compared for periodic and aperiodic arrays using different approaches to gradient computation, including the analytic derivative. A reflectarray is also optimized in the near field with the goal of improving its quiet zone. The differential-contributions technique yields substantial reductions in the time per iteration in all three syntheses, especially for aperiodic arrays and near-field optimization, where the time saved in evaluating the gradient exceeds 99%.
ISSN: 0018-926X, 1558-2221
DOI: 10.1109/TAP.2018.2859915
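The differential-contributions idea described in the summary can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a uniform linear array of isotropic elements and a simple quadratic pattern cost, whereas the paper uses the generalized intersection approach with far- or near-field constraints. The key point carries over: when one element's excitation is perturbed for a finite difference, only that element's contribution to the field changes, so the perturbed field is the stored field plus one term rather than a full re-summation.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): N-element linear array,
# half-wavelength spacing, far-field pattern sampled at M angles.
N = 16
k = 2 * np.pi                                # wavenumber for unit wavelength
x = 0.5 * np.arange(N)                       # element positions in wavelengths
theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
steer = np.exp(1j * k * np.outer(np.sin(theta), x))   # per-element field terms

target = np.abs(np.sinc(4 * np.sin(theta)))  # hypothetical target pattern

def cost(E):
    """Quadratic deviation of the pattern magnitude from the target."""
    return np.sum((np.abs(E) - target) ** 2)

w = np.ones(N, dtype=complex)                # current excitations
E0 = steer @ w                               # field computed once and reused
c0 = cost(E0)
delta = 1e-6                                 # finite-difference step

# Naive finite differences: rebuild the full field for every perturbation.
grad_naive = np.empty(N)
for n in range(N):
    wp = w.copy()
    wp[n] += delta
    grad_naive[n] = (cost(steer @ wp) - c0) / delta

# Differential contributions: only element n's term changes, so add just it.
grad_fast = np.empty(N)
for n in range(N):
    grad_fast[n] = (cost(E0 + delta * steer[:, n]) - c0) / delta

assert np.allclose(grad_naive, grad_fast)
```

Each gradient component thus costs one O(M) field update instead of an O(N·M) re-summation, which is consistent with the large savings the abstract reports for aperiodic arrays and near-field optimization, where per-element field evaluations are the dominant expense.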