TY - JOUR
T1 - Task-free continual learning with expansion-based granular CNN
T2 - Gradual partitioning of manifolds in image stream classification
AU - Leite, Daniel
N1 - Publisher Copyright:
© 2026 The Author(s)
PY - 2026/3/28
Y1 - 2026/3/28
N2 - This paper introduces an Expansion-based Granular Convolutional Neural Network (EG-CNN) framework for task-free continual learning in both domain-incremental (Domain-IL) and class-incremental (Class-IL) classification of image streams. EG-CNN comprises three components – a feature extractor, a refiner, and a granular classifier – each capable of independent adaptation over time. Unlike replay-based, regularization-based, or task-specific expansion methods that rely on task identifiers, multiple classifier heads, or episodic memory buffers, EG-CNN evolves a single classifier from scratch throughout lifelong learning, with no access to task boundaries or stored images. This design directly targets severe nonstationarity by addressing concept drift, class emergence, and inter-class interference through structural evolution rather than memory or task supervision. Guided dimensionality reduction is performed through geodesic and spectral refiners based on variants of Uniform Manifold Approximation and Projection (UMAP) and direct incremental Linear Discriminant Analysis (iLDA), informed by a Levina-Bickel maximum likelihood estimator of intrinsic manifold dimensionality. These refiners generate latent bottleneck representations, allowing a compact and interpretable Evolving Granular Neural Network classifier (EGNN-C2+) to be incrementally built on top of the CNN. The framework supports gradual partitioning of the sufficient manifold, dynamic information fusion, dynamic deferral for uncertainty handling, and structural plasticity without catastrophic forgetting.
Experiments on three streaming scenarios – nonstationary object recognition with CIFAR-10 (Domain-IL), scene recognition with missing classes in Intel-6 (Class-IL under stream-induced drift), and large-scale scene understanding with SUN-397 (Class-IL under compounded domain drift with 397 classes) – demonstrate stable structural evolution and sustained accuracy under increasing levels of novelty and nonstationarity.
AB - This paper introduces an Expansion-based Granular Convolutional Neural Network (EG-CNN) framework for task-free continual learning in both domain-incremental (Domain-IL) and class-incremental (Class-IL) classification of image streams. EG-CNN comprises three components – a feature extractor, a refiner, and a granular classifier – each capable of independent adaptation over time. Unlike replay-based, regularization-based, or task-specific expansion methods that rely on task identifiers, multiple classifier heads, or episodic memory buffers, EG-CNN evolves a single classifier from scratch throughout lifelong learning, with no access to task boundaries or stored images. This design directly targets severe nonstationarity by addressing concept drift, class emergence, and inter-class interference through structural evolution rather than memory or task supervision. Guided dimensionality reduction is performed through geodesic and spectral refiners based on variants of Uniform Manifold Approximation and Projection (UMAP) and direct incremental Linear Discriminant Analysis (iLDA), informed by a Levina-Bickel maximum likelihood estimator of intrinsic manifold dimensionality. These refiners generate latent bottleneck representations, allowing a compact and interpretable Evolving Granular Neural Network classifier (EGNN-C2+) to be incrementally built on top of the CNN. The framework supports gradual partitioning of the sufficient manifold, dynamic information fusion, dynamic deferral for uncertainty handling, and structural plasticity without catastrophic forgetting.
Experiments on three streaming scenarios – nonstationary object recognition with CIFAR-10 (Domain-IL), scene recognition with missing classes in Intel-6 (Class-IL under stream-induced drift), and large-scale scene understanding with SUN-397 (Class-IL under compounded domain drift with 397 classes) – demonstrate stable structural evolution and sustained accuracy under increasing levels of novelty and nonstationarity.
KW - Continual learning
KW - Granular neural networks
KW - Image stream classification
KW - Intrinsic manifold dimension
KW - Task-free learning
UR - https://www.scopus.com/pages/publications/105027254832
U2 - 10.1016/j.neucom.2026.132665
DO - 10.1016/j.neucom.2026.132665
M3 - Article
AN - SCOPUS:105027254832
SN - 0925-2312
VL - 671
JO - Neurocomputing
JF - Neurocomputing
M1 - 132665
ER -