Article Details
Citation
Johnston P, Nogueira K & Swingler K (2023) GMM-IL: Image Classification Using Incrementally Learnt, Independent Probabilistic Models for Small Sample Sizes. IEEE Access, 11, pp. 25492-25501. https://doi.org/10.1109/access.2023.3255795
Abstract
When deep-learning classifiers learn new classes through supervised learning, they suffer from catastrophic forgetting. In this paper we propose the Gaussian Mixture Model - Incremental Learner (GMM-IL), a novel two-stage architecture that couples unsupervised visual feature learning with supervised probabilistic models to represent each class. The key novelty of GMM-IL is that each class is learnt independently of the other classes. New classes can be incrementally learnt using a small set of annotated images, with no requirement to relearn data from existing classes. This enables the incremental addition of classes to a model that can be indexed by visual features and reasoned over based on perception. Using Gaussian Mixture Models to represent the independent classes, we outperform a benchmark of an equivalent network with a Softmax head, obtaining increased accuracy for sample sizes smaller than 12 and increased weighted F1 score for 3 imbalanced class profiles in that sample range. This novel method enables new classes to be added to a system with access to only a few annotated images of the new class.
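The core idea described above — modelling each class with its own independent probabilistic density over extracted visual features, so new classes can be added without retraining existing ones — can be sketched as follows. This is a minimal illustration, not the paper's implementation: for brevity each class here gets a single Gaussian (a 1-component mixture), whereas GMM-IL fits full Gaussian Mixture Models, and the feature extractor is replaced by raw vectors.

```python
import numpy as np


class IncrementalGaussianClassifier:
    """Sketch of the per-class, independently-learnt model idea.

    Each class is represented by its own Gaussian density over
    feature vectors; adding a class never touches other classes.
    (GMM-IL uses multi-component GMMs over learned visual features.)
    """

    def __init__(self):
        self.models = {}  # label -> (mean, covariance)

    def add_class(self, label, features):
        # Fit this class's density from its own samples only --
        # no relearning of previously added classes is required.
        mean = features.mean(axis=0)
        cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
        self.models[label] = (mean, cov)

    def log_likelihood(self, x, label):
        # Multivariate Gaussian log-density of x under this class's model.
        mean, cov = self.models[label]
        d = x - mean
        _, logdet = np.linalg.slogdet(cov)
        quad = d @ np.linalg.solve(cov, d)
        return -0.5 * (len(x) * np.log(2 * np.pi) + logdet + quad)

    def predict(self, x):
        # Classify by the class whose density assigns the highest likelihood.
        return max(self.models, key=lambda lbl: self.log_likelihood(x, lbl))


rng = np.random.default_rng(0)
clf = IncrementalGaussianClassifier()
clf.add_class("cat", rng.normal(0.0, 1.0, size=(10, 4)))
clf.add_class("dog", rng.normal(5.0, 1.0, size=(10, 4)))
# A new class is added later from a handful of samples, independently:
clf.add_class("fox", rng.normal(-5.0, 1.0, size=(10, 4)))
print(clf.predict(np.full(4, -5.0)))  # classified as "fox"
```

Because each class's parameters are estimated only from that class's samples, the Softmax-style coupling between classes (and hence the catastrophic-forgetting pressure on shared weights) is avoided at the classifier stage.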
Keywords
Task analysis; Visualization; Image classification; Probabilistic logic; Neural networks; Statistics; Gaussian mixture model
Journal
IEEE Access: Volume 11
| Status | Published |
| --- | --- |
| Publication date | 31/12/2023 |
| Publication date online | 19/03/2023 |
| Date accepted by journal | 11/03/2023 |
| URL | http://hdl.handle.net/1893/35184 |
| Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
| eISSN | 2169-3536 |
People (2)
Lecturer, Computing Science
Professor, Computing Science