
Resilient Practical Test-Time Adaptation: Soft Batch Normalization Alignment and Entropy-driven Memory Bank

Published 26 Jan 2024 in cs.LG (arXiv:2401.14619v1)

Abstract: Test-time domain adaptation adjusts a source-domain model to unseen distribution shifts in the target domain during inference. In practice, however, model performance can be significantly impaired by continuous distribution changes in the target domain and by non-independent and identically distributed (non-i.i.d.) test samples. While existing memory-bank methods store samples to mitigate non-i.i.d. effects, they do not inherently prevent model degradation. To address this issue, we propose a resilient practical test-time adaptation (ResiTTA) method focused on parameter resilience and data quality. Specifically, we develop a resilient batch normalization that estimates normalization statistics and applies soft alignment to mitigate overfitting and model degradation. We use an entropy-driven memory bank that accounts for timeliness, the persistence of over-confident samples, and sample uncertainty, so that only high-quality data are used for adaptation. Our framework periodically adapts the source-domain model through a teacher-student scheme with a self-training loss on the memory samples, incorporating soft alignment losses on batch normalization statistics. We empirically validate ResiTTA across various benchmark datasets, demonstrating state-of-the-art performance.
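The two components named in the abstract can be caricatured in a few lines of plain Python. This is a minimal illustrative sketch, not the authors' implementation: the class name `EntropyMemoryBank`, the linear age penalty and its weight, and the closed-form Gaussian KL used to stand in for the soft batch-normalization alignment are all assumptions made for exposition.

```python
import itertools
import math


def entropy(probs):
    """Shannon entropy of a discrete probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0)


def bn_soft_alignment(mu_s, var_s, mu_t, var_t):
    """Closed-form KL( N(mu_t, var_t) || N(mu_s, var_s) ) for 1-D Gaussians:
    one hypothetical way to softly pull estimated target BN statistics
    toward the source statistics instead of overwriting them."""
    return 0.5 * (math.log(var_s / var_t)
                  + (var_t + (mu_t - mu_s) ** 2) / var_s - 1.0)


class EntropyMemoryBank:
    """Toy entropy-driven memory bank: prefers confident (low-entropy)
    samples but penalizes stale entries, so over-confident samples cannot
    persist in the bank forever. Scoring choices are illustrative."""

    def __init__(self, capacity=64, age_weight=0.1):
        self.capacity = capacity
        self.age_weight = age_weight
        self.items = []  # (sample, entropy, insertion_time)
        self.clock = itertools.count()

    def _score(self, entry, now):
        _, ent, t = entry
        # lower is better: low uncertainty and recent insertion
        return ent + self.age_weight * (now - t)

    def add(self, sample, probs):
        now = next(self.clock)
        self.items.append((sample, entropy(probs), now))
        if len(self.items) > self.capacity:
            # evict the entry with the worst (highest) score
            worst = max(self.items, key=lambda e: self._score(e, now))
            self.items.remove(worst)

    def samples(self):
        return [s for s, _, _ in self.items]
```

Under this sketch, a confident prediction such as `[0.99, 0.01]` outlives a maximally uncertain one such as `[0.5, 0.5]` once the bank is full, while the age term gradually evicts samples that are merely old and over-confident.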
