Deep Learning Model Reuse and Composition in Knowledge Centric Networking
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: Machine learning has inadvertently pioneered the transition of big data into big knowledge. Machine learning models absorb and incorporate knowledge from large-scale data through training and can be regarded as a representation of the knowledge learnt. There are a multitude of use cases where this acquired knowledge can be used to enhance future applications or speed up the training of new models. Yet, the efficient sharing, exploitation and reusability of this knowledge remain a challenge. In this paper we propose a framework for deep learning models that facilitates the reuse of model architectures, transfers coefficients between models for knowledge composition and updates, and applies compression and pruning techniques for efficient storage and communication. We discuss the framework and its application in the context of Knowledge Centric Networking (KCN) and demonstrate the framework's potential through various experiments, e.g. when knowledge has to be updated to accommodate new (raw) data or to reduce complexity.
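The summary mentions two mechanisms: transferring coefficients from an existing model into a new one, and pruning coefficients so the resulting knowledge is cheaper to store and communicate. The following is a minimal, hypothetical sketch of those two ideas (it is not the paper's implementation, and the function names `transfer_coefficients` and `magnitude_prune` are illustrative), using a flat list of weights in place of a real network layer:

```python
# Illustrative sketch only: coefficient transfer and magnitude pruning
# on a flat list of weights, standing in for a real model layer.

def transfer_coefficients(source_weights, target_weights):
    """Copy trained source coefficients into the target where they overlap,
    keeping the target's remaining (e.g. freshly initialized) weights."""
    n = min(len(source_weights), len(target_weights))
    return source_weights[:n] + target_weights[n:]

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of the weights."""
    k = int(len(weights) * sparsity)  # how many weights to drop
    threshold = sorted(abs(w) for w in weights)[k] if k else 0.0
    return [0.0 if abs(w) < threshold else w for w in weights]

# Reuse knowledge from a trained model, then compress it for transport.
source = [0.9, -0.05, 0.4, 0.01, -0.7]   # trained coefficients
target = [0.0, 0.0, 0.0, 0.0, 0.0]       # new model, same layer shape
shared = transfer_coefficients(source, target)
pruned = magnitude_prune(shared, sparsity=0.4)
# pruned -> [0.9, 0.0, 0.4, 0.0, -0.7]
```

A zeroed (sparse) weight list of this kind compresses well, which is the storage-and-communication angle the abstract alludes to; real frameworks would additionally quantize or entropy-code the surviving coefficients.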
ISSN: 2637-9430
DOI: 10.1109/ICCCN49398.2020.9209668