MetaConcept: Learn to Abstract via Concept Graph for Weakly-Supervised Few-Shot Learning


Abstract

Meta-learning has proven to be an effective framework for addressing few-shot learning problems. The key challenge is how to minimize the generalization error of the base learner across tasks. In this paper, we explore concept hierarchy knowledge by leveraging a concept graph, and we take the concept graph as explicit meta-knowledge for the base learner, instead of learning implicit meta-knowledge, so as to boost the classification performance of meta-learning on weakly-supervised few-shot learning problems. To this end, we propose a novel meta-learning framework, called MetaConcept, which learns to abstract concepts via the concept graph. Specifically, we first propose a novel regularization with multi-level conceptual abstraction to constrain a meta-learner to learn to abstract concepts via the concept graph (i.e., identifying concepts from low to high levels). Then, we propose a meta concept inference network as the meta-learner for the base learner, aiming to quickly adapt to a novel task through the joint inference of the abstract concepts and a few annotated samples. We have conducted extensive experiments on two weakly-supervised few-shot learning benchmarks, namely WS-ImageNet-Pure and WS-ImageNet-Mix. The experimental results show that 1) the proposed MetaConcept outperforms state-of-the-art methods with an improvement of 2% to 6% in classification accuracy, and 2) the proposed MetaConcept can yield good performance even when trained only with weakly-labeled datasets.
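To make the multi-level conceptual abstraction idea more concrete, the sketch below (not the authors' code) shows one plausible way to train a shared feature extractor with a separate classifier head per level of a concept hierarchy, summing the per-level losses so the model is pushed to identify concepts from low to high levels. The network architecture, class counts, and the `level_weights` parameter are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of multi-level concept classification with a per-level loss.
# All layer sizes, class counts, and weights are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiLevelConceptNet(nn.Module):
    def __init__(self, feat_dim=64, classes_per_level=(100, 20, 5)):
        super().__init__()
        # Shared embedding network (stand-in for the base feature extractor).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One linear head per level of the concept hierarchy:
        # fine-grained classes at level 0, coarser concepts above.
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, c) for c in classes_per_level
        )

    def forward(self, x):
        z = self.encoder(x)
        return [head(z) for head in self.heads]


def multi_level_loss(logits_per_level, labels_per_level, level_weights=None):
    """Weighted sum of cross-entropy losses over concept levels; one plausible
    form of a multi-level conceptual-abstraction regularization."""
    if level_weights is None:
        level_weights = [1.0] * len(logits_per_level)
    return sum(
        w * F.cross_entropy(logits, labels)
        for w, logits, labels in zip(level_weights, logits_per_level, labels_per_level)
    )


if __name__ == "__main__":
    model = MultiLevelConceptNet()
    x = torch.randn(8, 3, 32, 32)  # toy batch of images
    labels = [torch.randint(0, c, (8,)) for c in (100, 20, 5)]  # labels per level
    loss = multi_level_loss(model(x), labels)
    loss.backward()
    print(float(loss))
```

In a full MetaConcept-style setup, the per-level concept predictions would additionally feed a meta concept inference network that adapts the base learner to a novel task from a few annotated samples; that component is omitted here for brevity.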
