"Out of the box" active learning results on 12 of the remaining OMNIGLOT languages.

"Out of the box" active learning results on 12 of the remaining OMNIGLOT languages.

Source publication
Preprint
Modern deep learning requires large-scale, extensively labelled datasets for training. Few-shot learning aims to alleviate this issue by learning effectively from only a few labelled examples. Previously proposed few-shot visual classifiers assume that the feature manifold on which classifier decisions are made has uncorrelated feature dimensions...
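As a hedged illustration of the assumption this abstract refers to, the sketch below (illustrative names, not the paper's code) scores query embeddings by squared Euclidean distance to class prototypes; using an identity-covariance distance implicitly treats the feature dimensions as uncorrelated.

```python
# Minimal sketch of a metric-based few-shot classifier that assumes
# uncorrelated feature dimensions (isotropic squared Euclidean distance).
import numpy as np

def euclidean_prototype_classifier(support_feats, support_labels, query_feats):
    """support_feats: (N, D), support_labels: (N,), query_feats: (M, D)."""
    classes = np.unique(support_labels)
    # Class prototype = mean embedding of that class's support examples.
    prototypes = np.stack([support_feats[support_labels == c].mean(axis=0)
                           for c in classes])
    # Squared Euclidean distance from every query to every prototype.
    dists = ((query_feats[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    # Predict the nearest prototype (equivalently, softmax over negative distances).
    return classes[np.argmin(dists, axis=1)]
```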

Citations

... Meta agent teaming active learning [52] employs optimization-based methods to train the model so that it adapts to an iteratively expanded labeled dataset during deployment. In contrast, Simple CNAPS [53], which is based on conditional neural adaptive processes, applies metric-based meta-learning, combining a neural adaptive few-shot classifier with a regularized Mahalanobis-based classifier to achieve strong performance. In a different vein, Wanyan et al. [54] investigated multimodal complementarity for few-shot action recognition, extracting multimodal representations for query samples and prototypes of the different actions from support samples during each episode of meta-training. ...
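For concreteness, here is a minimal sketch of a regularized Mahalanobis-based classifier in the spirit of the Simple CNAPS description above; the shrinkage scheme and function names are assumptions for illustration, not the cited implementation.

```python
# Hypothetical sketch: per-class covariances estimated from support embeddings,
# shrunk towards a task-level estimate (the blend weight grows with the number
# of shots), then queries scored by squared Mahalanobis distance.
import numpy as np

def mahalanobis_classifier(support_feats, support_labels, query_feats, eps=1.0):
    classes = np.unique(support_labels)
    D = support_feats.shape[1]
    task_cov = np.cov(support_feats, rowvar=False) + eps * np.eye(D)
    scores = []
    for c in classes:
        feats_c = support_feats[support_labels == c]
        mu_c = feats_c.mean(axis=0)
        n_c = len(feats_c)
        if n_c > 1:
            lam = n_c / (n_c + 1.0)  # assumed shrinkage weight
            cov_c = lam * np.cov(feats_c, rowvar=False) + (1 - lam) * task_cov
        else:
            cov_c = task_cov
        cov_c = cov_c + eps * np.eye(D)  # regularizer keeps the estimate invertible
        diff = query_feats - mu_c
        # Squared Mahalanobis distance via a linear solve (no explicit inverse).
        scores.append(np.einsum('ij,ij->i', diff, np.linalg.solve(cov_c, diff.T).T))
    return classes[np.argmin(np.stack(scores, axis=1), axis=1)]
```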
Article
Active learning (AL) is an effective sample selection approach that addresses the challenge of data annotation by labelling only a subset of the training data, while deep learning (DL) is data-intensive and relies on abundant training data. Deep active learning (DeepAL) benefits from the integration of AL and DL, offering an efficient solution that balances model performance and annotation costs. The importance of DeepAL has been increasingly recognized with the emergence of large foundation models that depend heavily on substantial computational resources and extensive training data. This survey endeavors to provide a comprehensive overview of DeepAL. Specifically, we first analyze and summarize various sample query strategies, data querying considerations, model training paradigms, and real-world applications of DeepAL. In addition, we discuss the challenges that arise in the era of foundation models and propose potential directions for future AL research. The survey aims to bridge a gap in the existing literature by organizing and summarizing current approaches, offering insights into DeepAL and highlighting the necessity of developing specialized DeepAL techniques tailored to foundation models. By critically examining the current state of DeepAL, this survey contributes to a more profound understanding of the field and serves as a guide for researchers and practitioners interested in DeepAL techniques.
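As a concrete, hedged illustration of the sample query strategies the survey discusses, the sketch below implements one round of least-confidence uncertainty sampling; the model and oracle interfaces are assumed, sklearn-style placeholders rather than any specific DeepAL framework's API.

```python
# One DeepAL round with least-confidence sampling: train on the labeled pool,
# score the unlabeled pool by predictive uncertainty, and move the k most
# uncertain samples to the labeled set after querying the annotator (oracle).
import numpy as np

def active_learning_round(model, X_labeled, y_labeled, X_unlabeled, oracle, k=100):
    model.fit(X_labeled, y_labeled)               # retrain on the current labels
    probs = model.predict_proba(X_unlabeled)      # (N, C) class probabilities
    uncertainty = 1.0 - probs.max(axis=1)         # least-confidence score
    query_idx = np.argsort(-uncertainty)[:k]      # k most uncertain samples
    new_labels = oracle(X_unlabeled[query_idx])   # annotator provides labels
    X_labeled = np.concatenate([X_labeled, X_unlabeled[query_idx]])
    y_labeled = np.concatenate([y_labeled, new_labels])
    X_unlabeled = np.delete(X_unlabeled, query_idx, axis=0)
    return model, X_labeled, y_labeled, X_unlabeled
```

Least-confidence sampling is only one of the query strategies the survey covers; margin- or entropy-based scores drop into the same loop by replacing the uncertainty line.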