Design methodology for highly accurate approximate multipliers for error resilient applications

Bibliographic Details
Published in: Computers & Electrical Engineering, 2023-09, Vol. 110, p. 108798, Article 108798
Main Authors: Guturu, Sahith; Kumar, Uppugunduru Anil; Bharadwaj, S. Vignesh; Ahmed, Syed Ershad
Format: Article
Language: English
Description
Summary: Approximate computing has become a widely adopted paradigm for applications such as neural networks and image processing, where exact computation is not required and the goal is to improve area, power, and speed. This paper proposes new multiplier architectures based on an algorithm that adaptively assigns the most suitable approximate compressor, chosen from an existing set of compressors, to each partial product column in order to improve accuracy. Experimental results show that the proposed designs are more accurate than existing unsigned multiplier architectures. To quantify their performance, the proposed designs were assessed using image processing and neural network applications. The results show that the proposed architectures achieve up to a 43% increase in PSNR over existing designs, at the cost of up to a 14.3% increase in power consumption.
•A new methodology based on an algorithm has been proposed to form the multiplier.
•Various multipliers are designed using the proposed methodology and existing 4:2 compressors.
•The optimal design is selected based on hardware and error analysis of the existing designs.
•The performance of the new design is evaluated through benchmarking applications.
ISSN: 0045-7906; 1879-0755
DOI: 10.1016/j.compeleceng.2023.108798
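Below is a minimal sketch of the adaptive per-column compressor assignment described in the summary above, assuming a simplified model in which each 4:2 compressor is treated as a function from four input bits to an approximate value. The compressor designs (approx_a, approx_b), their notional hardware costs, the error_budget parameter, and the selection rule (cheapest compressor whose weight-scaled mean error fits the budget) are illustrative assumptions; the paper's actual compressor circuits and assignment algorithm are not detailed in this record.

```python
from itertools import product

def exact_value(bits):
    """Exact result of compressing four partial-product bits: their arithmetic sum."""
    return sum(bits)

# Two hypothetical approximate 4:2 compressors (illustrative only, not the
# designs from the paper), each paired below with a notional hardware cost.
def approx_a(bits):
    # Saturates at 2 (drops the cout term), so it errs when three or four inputs are 1.
    return min(sum(bits), 2)

def approx_b(bits):
    # sum = x1 xor x2, carry = x3 or x4: cheaper logic, larger error.
    x1, x2, x3, x4 = bits
    return (x1 ^ x2) + 2 * (x3 | x4)

# (name, function, notional relative hardware cost)
COMPRESSORS = [
    ("exact", exact_value, 1.0),
    ("approx_a", approx_a, 0.7),
    ("approx_b", approx_b, 0.5),
]

def mean_error(fn):
    """Mean absolute error of a compressor over all 16 input patterns."""
    errors = [abs(fn(b) - exact_value(b)) for b in product((0, 1), repeat=4)]
    return sum(errors) / len(errors)

def assign_compressors(n_columns, error_budget):
    """For each partial-product column, pick the cheapest compressor whose
    mean error, scaled by the column's binary weight 2**col, stays within
    the budget; fall back to the exact compressor otherwise."""
    assignment = {}
    for col in range(n_columns):
        weight = 2 ** col
        feasible = [(name, fn, cost) for name, fn, cost in COMPRESSORS
                    if weight * mean_error(fn) <= error_budget]
        assignment[col] = min(feasible, key=lambda t: t[2])[0] if feasible else "exact"
    return assignment

if __name__ == "__main__":
    print({name: round(mean_error(fn), 3) for name, fn, _ in COMPRESSORS})
    # Low-weight columns tolerate cheap compressors; high-weight columns stay exact.
    print(assign_compressors(n_columns=8, error_budget=2.0))
```

Under these assumptions the low-significance columns receive the cheapest (least accurate) compressor while the high-significance columns keep the exact one, which illustrates the general idea of trading a bounded accuracy loss for area and power savings on a per-column basis.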