Task-free continual learning with expansion-based granular CNN: Gradual partitioning of manifolds in image stream classification

  • Daniel Leite

Research output: Contribution to journal › Article › peer-review

Abstract

This paper introduces an Expansion-based Granular Convolutional Neural Network (EG-CNN) framework for task-free continual learning in both domain-incremental (Domain-IL) and class-incremental (Class-IL) classification of image streams. EG-CNN comprises three components – a feature extractor, a refiner, and a granular classifier – each capable of independent adaptation over time. Unlike replay-based, regularization-based, or task-specific expansion methods that rely on task identifiers, multiple classifier heads, or episodic memory buffers, EG-CNN evolves a single classifier from scratch throughout lifelong learning, with no access to task boundaries or stored images. This design directly targets severe nonstationarity by addressing concept drift, class emergence, and inter-class interference through structural evolution rather than memory or task supervision. Guided dimensionality reduction is performed through geodesic and spectral refiners based on variants of Uniform Manifold Approximation and Projection (UMAP) and direct incremental Linear Discriminant Analysis (iLDA), informed by a Levina-Bickel maximum likelihood estimator of intrinsic manifold dimensionality. These refiners generate latent bottleneck representations, allowing a compact and interpretable Evolving Granular Neural Network classifier (EGNN-C2+) to be built incrementally on top of the CNN. The framework supports gradual partitioning of the sufficient manifold, dynamic information fusion, dynamic deferral for uncertainty handling, and structural plasticity without catastrophic forgetting. Experiments in three streaming scenarios – nonstationary object recognition with CIFAR-10 (Domain-IL), scene recognition with missing classes in Intel-6 (Class-IL under stream-induced drift), and large-scale scene understanding with SUN-397 (Class-IL under compounded domain drift with 397 classes) – demonstrate stable structural evolution and sustained accuracy under increasing levels of novelty and nonstationarity.
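The Levina-Bickel estimator mentioned in the abstract has a compact closed form: for each point, the inverse local dimension estimate is the mean of log(T_k / T_j) over the distances T_j to its j-th nearest neighbors, j = 1..k-1. Below is a minimal NumPy sketch of that estimator – not the paper's implementation – where averaging the inverse per-point estimates (rather than the estimates themselves) follows the MacKay-Ghahramani correction to the original formula; the function name and signature are illustrative.

```python
import numpy as np

def levina_bickel_dimension(X, k=10):
    """Maximum likelihood estimate of intrinsic dimensionality (Levina-Bickel).

    Per-point estimate: m_k(x) = [ (1/(k-1)) * sum_{j<k} log(T_k/T_j) ]^{-1},
    where T_j is the distance from x to its j-th nearest neighbor.
    Inverse estimates are averaged over all points before inverting.
    """
    X = np.asarray(X, dtype=float)
    # pairwise Euclidean distance matrix (fine for small n; use a KD-tree at scale)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    d.sort(axis=1)                         # row-wise ascending; column 0 is self (0.0)
    T = d[:, 1:k + 1]                      # distances to the k nearest neighbors
    logs = np.log(T[:, -1:] / T[:, :-1])   # log(T_k / T_j) for j = 1..k-1
    m_inv = logs.sum(axis=1) / (k - 1)     # inverse local dimension per point
    return 1.0 / m_inv.mean()              # global estimate
```

For data sampled from a 2-D plane linearly embedded in a higher-dimensional ambient space, the estimate should come out close to 2 regardless of the ambient dimension, which is what makes it useful for sizing a latent bottleneck.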

Original language: English
Article number: 132665
Journal: Neurocomputing
Volume: 671
State: Published - 28 Mar 2026
Externally published: Yes

Keywords

  • Continual learning
  • Granular neural networks
  • Image stream classification
  • Intrinsic manifold dimension
  • Task-free learning

