SHiFT: an efficient, flexible search engine for transfer learning
| Published in: | Proceedings of the VLDB Endowment, 2022-10, Vol. 16 (2), p. 304-316 |
|---|---|
| Main Authors: | Renggli, Cedric; Yao, Xiaozhe; Kolar, Luka; Rimanic, Luka; Klimovic, Ana; Zhang, Ce |
| Format: | Article |
| Language: | English |
| Summary: | Transfer learning can be seen as a data- and compute-efficient alternative to training models from scratch. The emergence of rich model repositories, such as TensorFlow Hub, enables practitioners and researchers to unleash the potential of these models across a wide range of downstream tasks. As these repositories keep growing exponentially, efficiently selecting a good model for the task at hand becomes paramount. However, a single generic search strategy (e.g., taking the model with the highest linear classifier accuracy) does not lead to optimal model selection for diverse downstream tasks. In fact, using hybrid or mixed strategies can often be beneficial. Therefore, we propose SHiFT, the first downstream task-aware, flexible, and efficient model search engine for transfer learning. Users interface with SHiFT using the SHiFT-QL query language, which gives users the flexibility to customize their search criteria. We optimize SHiFT-QL queries using a cost-based decision maker and evaluate them on a wide range of tasks. Motivated by the iterative nature of machine learning development, we further support efficient incremental executions of our queries, which requires a special implementation when jointly used with our optimizations. |
| ISSN: | 2150-8097 |
| DOI: | 10.14778/3565816.3565831 |
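
Below is a minimal Python sketch of the generic strategy the summary refers to: ranking candidate models by the accuracy of a linear classifier trained on each model's frozen features. It is offered purely as an illustration, not SHiFT's implementation or the SHiFT-QL API; the model registry and `embed` functions are hypothetical placeholders for, e.g., TensorFlow Hub feature extractors.

```python
# Illustrative sketch of the "highest linear classifier accuracy" baseline
# mentioned in the summary; not SHiFT's actual implementation or API.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def rank_by_linear_probe(models, X_train, y_train):
    """Rank candidate pretrained models by linear-probe accuracy.

    `models` maps a model name to a feature-extraction function
    (hypothetical stand-ins for, e.g., TensorFlow Hub modules).
    Returns model names sorted best-first.
    """
    scores = {}
    for name, embed in models.items():
        features = embed(X_train)  # frozen features from the pretrained model
        probe = LogisticRegression(max_iter=1000)
        # Mean cross-validated accuracy of the probe is the model's score.
        scores[name] = cross_val_score(probe, features, y_train, cv=3).mean()
    return sorted(scores, key=scores.get, reverse=True)
```

A task-aware engine like SHiFT generalizes this single fixed criterion: per the summary, it lets users combine or swap such scoring strategies per downstream task via SHiFT-QL queries.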