Memory-Efficient Prompt Tuning for Incremental Histopathology Classification
Abstract: Recent studies have made remarkable progress in histopathology classification. Building on these successes, contemporary works propose to further improve model generalizability and robustness by incrementally learning from sequentially delivered domains. Unlike previous parameter-isolation-based approaches, which usually demand massive computational resources during model updating, we present a memory-efficient prompt tuning framework that cultivates the model's generalization potential at economical memory cost. For each incoming domain, we reuse the existing parameters of the initial classification model and attach lightweight trainable prompts to it for customized tuning. To account for domain heterogeneity, we perform decoupled prompt tuning: a domain-specific prompt for each domain independently investigates its distinctive characteristics, while a single domain-invariant prompt shared across all domains continually explores the common content embedding over time. Each domain-specific prompt is appended to a prompt bank and isolated from further changes to prevent forgetting the distinctive features of early-seen domains, whereas the domain-invariant prompt is passed on and iteratively evolves through style-augmented prompt refining to improve model generalization capability over time. Specifically, we construct a graph over the existing prompts and build a style-augmented graph attention network that guides the domain-invariant prompt toward the overlapped latent embedding among all delivered domains, yielding more domain-generic representations. We extensively evaluated our framework on two histopathology tasks, i.e., breast cancer metastasis classification and epithelium-stroma tissue classification, where our approach yielded superior performance and memory efficiency over the competing methods.
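The prompt bookkeeping the abstract describes — a frozen backbone, one frozen domain-specific prompt per seen domain, and a single shared domain-invariant prompt that keeps evolving — can be sketched as below. This is a minimal illustration, not the paper's actual implementation: the class and names (`PromptBank`, `EMBED_DIM`, `PROMPT_LEN`, `assemble_input`) are assumptions, and the real framework would update prompts by gradient descent inside a transformer rather than store plain Python lists.

```python
# Minimal sketch (assumed names/shapes) of decoupled prompt tuning for
# domain-incremental learning: prompts are prepended to the patch tokens
# of a frozen backbone; domain-specific prompts are frozen after training.
import random

EMBED_DIM = 8    # token embedding size (illustrative assumption)
PROMPT_LEN = 4   # prompt tokens per prompt (illustrative assumption)

def new_prompt():
    """A trainable prompt: PROMPT_LEN tokens of EMBED_DIM floats."""
    return [[random.gauss(0.0, 0.02) for _ in range(EMBED_DIM)]
            for _ in range(PROMPT_LEN)]

class PromptBank:
    """One frozen domain-specific prompt per seen domain, plus a single
    domain-invariant prompt shared across all domains."""
    def __init__(self):
        self.domain_specific = {}          # domain id -> prompt
        self.frozen = set()                # domains whose prompts are fixed
        self.domain_invariant = new_prompt()

    def start_domain(self, domain_id):
        # Fresh lightweight prompt for the incoming domain; the backbone
        # parameters are reused untouched.
        self.domain_specific[domain_id] = new_prompt()

    def finish_domain(self, domain_id):
        # Isolate the prompt from further changes to prevent forgetting
        # this domain's distinctive features (in a real system: mark its
        # tensors requires_grad=False).
        self.frozen.add(domain_id)

    def assemble_input(self, domain_id, patch_tokens):
        # Prepend the shared and the domain-specific prompt to the image
        # patch tokens before the frozen encoder processes them.
        return (self.domain_invariant
                + self.domain_specific[domain_id]
                + patch_tokens)

bank = PromptBank()
bank.start_domain("hospital_A")
tokens = [[0.0] * EMBED_DIM for _ in range(16)]   # dummy patch tokens
seq = bank.assemble_input("hospital_A", tokens)
bank.finish_domain("hospital_A")
print(len(seq))  # 2 * PROMPT_LEN + 16 = 24 tokens
```

The domain-invariant prompt is deliberately not frozen here, mirroring the abstract: it would be passed on to the next domain and refined (in the paper, via a style-augmented graph attention network over the prompt graph).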