Bag similarity network for deep multi-instance learning
Published in: Information Sciences, 2019-12, Vol. 504, pp. 578-588
Main Authors: , , , ,
Format: Article
Language: English
Summary: The effectiveness of multi-instance learning (MIL) has been demonstrated by its wide spectrum of applications in computer vision, biometrics, and natural language processing. Recently, solving MIL problems using deep neural networks has proven to be highly effective. However, in current multi-instance neural networks, the feature representation of each bag is learned individually, and the relations between bags are not considered. In this study, we propose a novel neural network for MIL that emphasizes modeling the affinities between bags. It achieves a more effective bag representation than previous methods. Specifically, a bag with multiple instances is modeled by its similarity to other bags, and the similarity calculation is carried out in a novel neural network, termed the bag similarity network (BSN). Training the BSN involves two representation learning problems: instance feature learning and bag similarity learning. To avoid the complex interdependence of these problems, we decouple the BSN training process by first training an instance feature learning network and then constructing a bag similarity network, each of which is optimized end-to-end by back-propagation. Experiments on various MIL datasets clearly demonstrate the advantage of the proposed method over other state-of-the-art methods.
ISSN: 0020-0255, 1872-6291
DOI: 10.1016/j.ins.2019.07.071