Learning Equi-angular Representations for Online Continual Learning
Abstract: Online continual learning suffers from underfitted solutions because each model update receives insufficient training (e.g., a single epoch over the streamed data). To address this challenge, we propose an efficient online continual learning method that exploits the neural collapse phenomenon. Specifically, we induce neural collapse so that the representation space forms a simplex equiangular tight frame (ETF), allowing a model trained continuously with a single epoch to better fit the streamed data; to this end, we propose preparatory data training and residual correction in the representation space. Through extensive empirical validation on CIFAR-10/100, TinyImageNet, ImageNet-200, and ImageNet-1K, we show that the proposed method outperforms state-of-the-art methods by a noticeable margin in various online continual learning scenarios, including disjoint and Gaussian-scheduled continuous (i.e., boundary-free) data setups.
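The simplex ETF structure referenced in the abstract can be made concrete with a short sketch. The standard construction (not code from this paper; `simplex_etf`, its arguments, and the random-orthonormal-basis choice are illustrative assumptions) builds K classifier vectors in a d-dimensional feature space that are unit-norm and pairwise equiangular with cosine similarity −1/(K−1), the maximally separated configuration that neural collapse converges to:

```python
import numpy as np

def simplex_etf(d: int, K: int, seed: int = 0) -> np.ndarray:
    """Return a d x K matrix whose columns form a simplex ETF (requires d >= K)."""
    rng = np.random.default_rng(seed)
    # Orthonormal basis U (d x K) via QR of a random Gaussian matrix.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # Standard simplex ETF formula: sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T).
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M  # columns: K equiangular, unit-norm class vectors

M = simplex_etf(d=16, K=5)
G = M.T @ M  # Gram matrix: 1 on the diagonal, -1/(K-1) off the diagonal
```

Because such a classifier is fixed rather than learned, methods built on it only need the backbone to align features to these pre-assigned directions, which is one reason the structure is attractive under single-epoch training budgets.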