A New Deep Spiking Architecture for Reconstruction of Compressed Data in Cognitive Radio Networks

Bibliographic Details
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main Authors: Amr, Reem, Zaher, Nawal A., Gasser, Safa M., Eldiasty, Sherif K.
Format: Article
Language: English
Description
Summary: Cognitive Radio (CR) offers a spectrum-sharing solution to handle the massive number of devices operating in the same spectrum. In this work, a sub-Nyquist compressive sensing technique is proposed that allows secondary users to sense and utilize idle spectrum. Reconstruction of the compressed sparse data is achieved through a two-stage reconstruction algorithm: a classical fast Orthogonal Matching Pursuit (OMP) stage, followed by a new spiking deep Residual Neural Network (ResNet) architecture. The proposed architecture is obtained through a novel distributed conversion technique for converting deep architectures into spiking neural networks. The reconstructed data is compared with the compressed data and the ground truth in terms of Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), and Structural Similarity (SSIM). A Super-Resolution Convolutional Neural Network (SRCNN) and a deep ResNet are also used for reconstruction. The proposed algorithm outperforms SRCNN and the unconverted ResNet, especially at low Channel SNR (CSNR). In addition, the proposed algorithm yields a 68% reduction in both storage and energy requirements, making it suitable for implementation on User Equipment (UE).
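
For readers unfamiliar with the classical first stage of the pipeline summarized above, the following is a minimal NumPy sketch (not the authors' implementation) of Orthogonal Matching Pursuit recovering a k-sparse vector from sub-Nyquist random measurements, together with a simple MSE/PSNR check against the ground truth. The sensing matrix, dimensions, sparsity level, and peak-value convention are illustrative assumptions; the spiking ResNet refinement stage and the SSIM comparison described in the abstract are omitted.

    import numpy as np

    def omp(Phi, y, sparsity):
        """Greedy Orthogonal Matching Pursuit: find a sparse x with y ≈ Phi @ x."""
        n = Phi.shape[1]
        residual = y.copy()
        support = []
        x_hat = np.zeros(n)
        for _ in range(sparsity):
            # Select the dictionary column most correlated with the residual.
            correlations = np.abs(Phi.T @ residual)
            correlations[support] = 0.0           # do not reselect chosen atoms
            support.append(int(np.argmax(correlations)))
            # Least-squares fit of y on the currently selected columns.
            coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coeffs
        x_hat[support] = coeffs
        return x_hat

    def mse_psnr(x_true, x_rec):
        """Reconstruction quality versus ground truth (peak taken as max |x_true|)."""
        mse = float(np.mean((x_true - x_rec) ** 2))
        peak = float(np.max(np.abs(x_true)))
        psnr = 10.0 * np.log10(peak ** 2 / mse) if mse > 0 else float("inf")
        return mse, psnr

    # Toy example: m sub-Nyquist measurements of a k-sparse length-n signal.
    rng = np.random.default_rng(0)
    n, m, k = 256, 64, 8
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
    y = Phi @ x                                      # compressed measurements
    x_hat = omp(Phi, y, sparsity=k)
    print(mse_psnr(x, x_hat))

In the paper's pipeline, the OMP estimate would then be refined by the converted spiking ResNet before PSNR, MSE, and SSIM are computed.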
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3213816