# DREAM-C2L: Continual Learning Framework
An open-source framework for continual learning research, designed for reproducibility and scalability on HPC clusters.
## What Is Continual Learning?
Traditional neural networks suffer from catastrophic forgetting — when trained on new data, they lose performance on previously learned tasks. Continual learning aims to solve this: how can a model learn new things without forgetting old ones?
## The DREAM Framework
DREAM-C2L (Difficulty-aware REplay And Memory for Curriculum-to-Lifelong learning) introduces a principled approach to ordering and replaying training samples.
### Core Ideas
- Curriculum-aware sample ordering: Instead of random shuffling, order training examples by difficulty. Easy samples come first to build a strong foundation; harder samples later refine decision boundaries.
- Replay buffer management: Maintain a balanced memory of past experiences, selected to maximize coverage of the learned distribution.
- Regularization: Constrain how much model weights can change when learning new tasks, preventing catastrophic forgetting.
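The three ideas above can be sketched in a few lines of NumPy. This is a minimal illustration of the general techniques, not DREAM-C2L's actual API; the names `difficulty_order`, `BalancedReplayBuffer`, and `l2_drift_penalty` are hypothetical.

```python
import numpy as np

def difficulty_order(losses):
    """Order sample indices from easy (low loss) to hard (high loss)."""
    return np.argsort(losses)

class BalancedReplayBuffer:
    """Keep up to `per_class` stored examples for each class seen so far,
    so replay coverage stays balanced across the learned distribution."""
    def __init__(self, per_class=2):
        self.per_class = per_class
        self.store = {}  # class label -> list of stored samples

    def add(self, sample, label):
        bucket = self.store.setdefault(label, [])
        if len(bucket) < self.per_class:
            bucket.append(sample)

    def sample_all(self):
        """Return every stored (sample, label) pair for replay."""
        return [(s, c) for c, bucket in self.store.items() for s in bucket]

def l2_drift_penalty(weights, anchor, strength=0.5):
    """Quadratic penalty on movement away from weights learned on
    earlier tasks (a simple stand-in for importance-weighted schemes)."""
    return strength * float(np.sum((weights - anchor) ** 2))
```

In a training loop, one would sort each new task's samples with `difficulty_order`, mix in `buffer.sample_all()` at each step, and add `l2_drift_penalty` to the loss.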
## Key Features
- Modular pipeline: Swap replay strategies, regularization methods, and difficulty metrics independently
- HPC-ready: Built-in SLURM job management, multi-GPU training, checkpointing
- Reproducible: Full experiment tracking with Weights & Biases integration
- PyTorch Lightning backbone: Clean training loops with automatic mixed precision
## Research Applications
The framework supports multiple continual learning scenarios:
- Class-incremental: New classes appear over time
- Task-incremental: New tasks with explicit boundaries
- Domain-incremental: Same task, shifting data distributions
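As a concrete example of the first scenario, a class-incremental stream can be built by grouping the label set into successive steps, each introducing a fixed number of new classes. The helper below is a hypothetical sketch, not part of the framework:

```python
def class_incremental_splits(labels, classes_per_step):
    """Partition the distinct class labels in `labels` into successive
    steps; each step introduces `classes_per_step` new classes.
    (Hypothetical helper for illustration, not DREAM-C2L API.)"""
    classes = sorted(set(labels))
    return [classes[i:i + classes_per_step]
            for i in range(0, len(classes), classes_per_step)]
```

Task-incremental differs only in that step boundaries are given to the model at test time; domain-incremental keeps one label set but changes the input distribution between steps.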
## Results
Our difficulty-aware approach shows consistent improvements over random-ordering baselines across the CIFAR-100, TinyImageNet, and ImageNet-subset benchmarks.
The key insight: the order in which a model sees data matters as much as what data it sees. By presenting examples in a curriculum-informed order, the model builds more robust internal representations that resist forgetting.