Systematic investigation of model size effects relative to pretraining on forgetting
Systematically investigate how model size influences forgetting and negative backward transfer in Vision-Language-Action (VLA) models for continual robot learning, and determine how model size interacts with pretraining to shape resistance to forgetting.
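Measuring this requires concrete forgetting metrics. A minimal sketch below computes the two quantities named above from a task-accuracy matrix, using the standard continual-learning definitions (backward transfer as the average change on earlier tasks after training the final task; forgetting as the average drop from each task's best earlier accuracy). The matrix `R` and both helper functions are illustrative, not taken from the referenced paper.

```python
def backward_transfer(R):
    # R[i][j]: accuracy on task j, evaluated after training on task i.
    # BWT = mean over j < T-1 of (R[T-1][j] - R[j][j]).
    # Negative values indicate forgetting (negative backward transfer).
    T = len(R)
    diffs = [R[T - 1][j] - R[j][j] for j in range(T - 1)]
    return sum(diffs) / len(diffs)


def average_forgetting(R):
    # Forgetting on task j: best accuracy achieved on j at any earlier
    # stage minus the final accuracy on j; averaged over earlier tasks.
    T = len(R)
    drops = [
        max(R[l][j] for l in range(T - 1)) - R[T - 1][j]
        for j in range(T - 1)
    ]
    return sum(drops) / len(drops)


# Hypothetical 3-task run: rows are training stages, columns are tasks.
R = [
    [0.90, 0.00, 0.00],
    [0.80, 0.90, 0.00],
    [0.70, 0.85, 0.90],
]
print(backward_transfer(R))   # -0.125 (accuracy lost on tasks 1 and 2)
print(average_forgetting(R))  # 0.125
```

Running this per model size (and with/without pretraining) would yield the comparison the study calls for.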
References
While this serves as an initial investigation, we leave a more systematic study of model size (in relation to pretraining) to future work.
— Pretrained Vision-Language-Action Models are Surprisingly Resistant to Forgetting in Continual Learning
(2603.03818 - Liu et al., 4 Mar 2026) in Appendix C (Study on Other Factors that Contribute to VLA's Continual Learning Behavior)