NILM-LANN: A Lightweight Attention-based Neural Network in Non-Intrusive Load Monitoring
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: Non-Intrusive Load Monitoring (NILM) has attracted much attention as a promising method of identifying electrical appliances. It discriminates electrical devices based on changes in load characteristics to enable device scheduling strategies for optimal energy utilization. Existing NILM methods mainly rely on high-frequency electrical signals and have high computational and memory requirements, which makes them difficult to implement on resource-constrained devices. Therefore, we propose NILM-LANN, a novel lightweight neural network with an attention mechanism, in which several techniques, including Convolutional Long Short-Term Memory (ConvLSTM), 1-D convolutional operations, and DenseNet, are combined and balanced to reduce the number of model parameters and deepen the network while avoiding gradient explosion or vanishing. A Squeeze-and-Excitation block (SE-block) is also used to capture channel-wise dependencies based on the aggregated information. The NILM-LANN model is evaluated on the public UK-DALE and LIT datasets and a self-built CAE dataset, achieving a maximum classification accuracy of 99.9% and an F1-score of 99.9% while reducing the number of model parameters by over 90% compared to existing methods such as AlexNet and Light-LSTM. Finally, NILM-LANN is deployed on an NVIDIA Jetson Nano B01 embedded device to verify its lightweight design and efficiency.
ISSN: 2768-1904
DOI: 10.1109/CSCWD61410.2024.10580201
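
The summary mentions an SE-block that captures channel-wise dependencies from aggregated information. The sketch below is a minimal, generic Squeeze-and-Excitation block for 1-D load signals, assuming a PyTorch-style implementation; the record does not give the paper's actual layer sizes or reduction ratio, so the values here (32 channels, reduction of 4, class name `SEBlock1D`) are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of an SE-block for 1-D load-feature maps.
# Assumptions: PyTorch, channel count and reduction ratio are illustrative only.
import torch
import torch.nn as nn


class SEBlock1D(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Squeeze: aggregate each channel's temporal information into one value.
        self.squeeze = nn.AdaptiveAvgPool1d(1)
        # Excitation: learn channel-wise attention weights from the aggregated descriptor.
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        b, c, _ = x.shape
        w = self.squeeze(x).view(b, c)      # (batch, channels)
        w = self.excite(w).view(b, c, 1)    # channel-wise weights in [0, 1]
        return x * w                        # reweight each channel of the input


if __name__ == "__main__":
    dummy = torch.randn(8, 32, 256)         # batch of hypothetical 1-D feature maps
    out = SEBlock1D(32)(dummy)
    print(out.shape)                        # torch.Size([8, 32, 256])
```

In a lightweight design such as the one described, a block like this adds only two small fully connected layers per stage, which is consistent with the paper's stated goal of capturing channel-wise dependencies at low parameter cost.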