
Memory-efficient DRASiW Models

Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2024-12, Vol. 610, p. 128443, Article 128443
Main Authors: Napoli, Otávio Oliveira; de Almeida, Ana Maria; Borin, Edson; Breternitz, Mauricio
Format: Article
Language: English
Description
Summary: Weightless Neural Networks (WNN) are ideal for Federated Learning due to their robustness and computational efficiency. These scenarios require models with a small memory footprint and the ability to aggregate knowledge from multiple models. In this work, we demonstrate the effectiveness of using Bloom filter variations to implement DRASiW models—an adaptation of WNN that records both the presence and frequency of patterns—with minimized memory usage. Across various datasets, DRASiW models show competitive performance compared to models like Random Forest, k-Nearest Neighbors, Multi-layer Perceptron, and Support Vector Machines, with an acceptable space trade-off. Furthermore, our findings indicate that Bloom filter variations, such as the Count-Min Sketch, can reduce the memory footprint of DRASiW models by up to 27% while maintaining performance and enabling distributed and federated learning strategies.
Highlights:
• DRASiW achieves performance competitive with standard machine learning models.
• Counting Bloom filter variants can be used as RAMs to reduce the memory footprint.
• DRASiW models can be aggregated, enabling Distributed and Federated Learning scenarios.
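To make the "presence and frequency" idea concrete, the following is a minimal, hypothetical Python sketch of a DRASiW-style RAM node backed by a Count-Min Sketch. It is not the authors' implementation: the hash scheme (SHA-256 with a per-row salt), the sketch dimensions, the bleaching-threshold read, and the counter-summing merge used for federated-style aggregation are all illustrative assumptions.

import hashlib


class CountMinRAM:
    """Illustrative DRASiW-style RAM node that stores pattern frequencies
    in a Count-Min Sketch instead of a dense address table (assumption)."""

    def __init__(self, depth=3, width=64):
        self.depth = depth                                # number of hash rows
        self.width = width                                # counters per row
        self.table = [[0] * width for _ in range(depth)]

    def _cells(self, address):
        # Map a tuple of input bits to one counter per row (hash choice is an assumption).
        key = "".join(map(str, address)).encode()
        for row in range(self.depth):
            digest = hashlib.sha256(key + bytes([row])).digest()
            yield row, int.from_bytes(digest[:8], "big") % self.width

    def write(self, address):
        # Training: increment counters, so frequency (not just presence) is recorded.
        for row, col in self._cells(address):
            self.table[row][col] += 1

    def read(self, address, bleach=0):
        # Count-Min estimate is the minimum counter across rows; a bleaching
        # threshold turns the estimated frequency into a 0/1 response.
        estimate = min(self.table[row][col] for row, col in self._cells(address))
        return 1 if estimate > bleach else 0

    def merge(self, other):
        # Federated-style aggregation (assumed rule): element-wise counter sum.
        for row in range(self.depth):
            for col in range(self.width):
                self.table[row][col] += other.table[row][col]


# Usage: two "clients" train locally, then their RAM nodes are aggregated.
client_a, client_b = CountMinRAM(), CountMinRAM()
client_a.write((1, 0, 1, 1))
client_b.write((1, 0, 1, 1))
client_b.write((0, 1, 0, 0))
client_a.merge(client_b)
print(client_a.read((1, 0, 1, 1), bleach=1))  # prints 1: pattern seen twice overall

Summing counters on merge preserves pattern frequencies across clients, which is what allows a bleaching threshold to still be applied after aggregation; the aggregation strategy and memory accounting actually used in the study should be taken from the article itself.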
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2024.128443