
A deep learning scheme for efficient multimedia IoT data compression

Bibliographic Details
Published in:Ad hoc networks 2023-01, Vol.138, p.102998, Article 102998
Main Authors: Noura, Hassan N., Azar, Joseph, Salman, Ola, Couturier, Raphaël, Mazouzi, Kamel
Format: Article
Language:English
Description
Summary:Multimedia Internet of Things (MIoT) devices and networks face severe power and communication-overhead constraints given the volume of multimedia sensed data. One classic approach to handling large-scale data is lossy compression. However, current lossy compression algorithms must limit their compression rate to maintain acceptable perceived image quality, a limit commonly referred to as the image quality-compression ratio trade-off. Motivated by recent breakthroughs in computer vision, this article proposes recovering high-quality decompressed images at the application-server level using a deep learning-based super-resolution model. Consequently, the trade-off between image quality and size can be sidestepped: the device applies a lossy compressor together with downscaling, reducing the payload further and conserving energy. The experimental study demonstrates that the proposed technique effectively improves the visual quality of compressed and downscaled images. The proposed solution was evaluated on resource-constrained microcontrollers; the results show that transmission latency and energy consumption can be decreased by up to 10% compared with conventional lossy compression techniques.
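The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: 2x2 block averaging stands in for the device-side downscaling step, and nearest-neighbour pixel repetition is a placeholder for the deep learning-based super-resolution model that the paper applies at the application server. The function names and the scaling factor are illustrative assumptions.

```python
import numpy as np

def device_downscale(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Sensor-side reduction: average non-overlapping factor x factor blocks,
    shrinking the payload before lossy compression and transmission."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor  # crop to a multiple of the factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

def server_upscale(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Server-side reconstruction: placeholder for the super-resolution
    model -- here each pixel is simply repeated along both axes."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# Toy 8x8 "image": the device transmits a quarter of the pixels,
# and the server restores the original resolution.
img = np.arange(64, dtype=float).reshape(8, 8)
small = device_downscale(img)     # 4x4 payload
restored = server_upscale(small)  # back to 8x8 at the server
```

In the paper's setting, the reduced payload is what lowers transmission latency and energy on the microcontroller; the learned super-resolution model then compensates for the quality loss that a larger compression ratio would otherwise cause.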
ISSN:1570-8705
1570-8713
DOI:10.1016/j.adhoc.2022.102998