Performance evaluation of edge-computing platforms for the prediction of low temperatures in agriculture using deep learning
Published in: | The Journal of Supercomputing, 2021, Vol. 77 (1), pp. 818–840 |
Main Authors: | , , , , , , |
Format: | Article |
Language: | English |
Summary: | The Internet of Things (IoT) is driving the digital revolution. Almost all economic sectors are becoming “Smart” thanks to the analysis of data generated by the IoT. This analysis is carried out by advanced artificial intelligence (AI) techniques that provide insights never before imagined. The combination of IoT and AI is giving rise to an emerging trend, called AIoT, which is opening up new paths to bring digitization into the new era. However, there is still a big gap between AI and IoT, which lies basically in the computational power required by the former and the scarcity of computational resources offered by the latter. This is particularly true in rural IoT environments, where the lack of connectivity (or low-bandwidth connections) and of power supply forces the search for “efficient” alternatives that provide computational resources to IoT infrastructures without increasing power consumption. In this paper, we explore edge computing as a solution for bridging the gap between AI and IoT in rural environments. We evaluate the training and inference stages of a deep-learning-based precision-agriculture application for frost prediction on the modern Nvidia Jetson AGX Xavier in terms of performance and power consumption. Our experimental results reveal that cloud approaches are still a long way ahead in terms of performance, but the inclusion of GPUs in edge devices offers new opportunities for scenarios where connectivity remains a challenge. |
ISSN: | 0920-8542, 1573-0484 |
DOI: | 10.1007/s11227-020-03288-w |