Optimal adaptation strategy for SAM-style models to microscopy

Determine the optimal strategy for adapting Segment Anything family models (e.g., SAM, SAM2, SAM3) to microscopy instance segmentation, comparing approaches such as automatic prompt derivation, training custom decoders on microscopy data, and fine-tuning for promptable segmentation, in order to maximize segmentation accuracy across diverse microscopy modalities.

Background

Segment Anything (SAM) and its successors (SAM2, SAM3) are general-purpose segmentation foundation models trained primarily on natural images. Multiple microscopy-specific adaptations exist: fine-tuning SAM for promptable segmentation, adding custom decoders trained on microscopy annotations, or deriving automatic prompts. Despite these alternatives, it remains unresolved which adaptation pathway yields the best performance for microscopy instance segmentation across varied imaging modalities.

This paper benchmarks several SAM-based microscopy models and proposes Automatic Prompt Generation (APG), underscoring the need to formally establish the most effective adaptation strategy for microscopy.
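To make the automatic-prompt pathway concrete, the sketch below derives point prompts for a SAM-style model from a foreground probability map: threshold the map, label connected components, and use each component's centroid as a positive point prompt. This is an illustrative minimal example, not APG's actual implementation; the function name and the use of a probability map as input are assumptions.

```python
import numpy as np
from scipy import ndimage


def derive_point_prompts(foreground_prob, threshold=0.5):
    """Derive one point prompt per candidate object from a foreground
    probability map (hypothetical helper, not the paper's APG method)."""
    mask = foreground_prob > threshold
    labels, n = ndimage.label(mask)  # connected components = object candidates
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    # SAM-style predictors expect prompts as (x, y) pixel coordinates.
    points = np.array([(c[1], c[0]) for c in centroids])
    point_labels = np.ones(len(points), dtype=int)  # 1 = foreground click
    return points, point_labels


# Toy example: two blobs in a 10x10 probability map.
prob = np.zeros((10, 10))
prob[1:3, 1:3] = 0.9  # blob 1
prob[6:9, 6:9] = 0.8  # blob 2
pts, lbls = derive_point_prompts(prob)
print(len(pts))  # 2 prompts, one per blob
```

In practice the resulting points would be passed to the model's promptable interface (e.g., as point prompts with positive labels), yielding one mask per candidate object without manual interaction.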

References

These developments open up the following question: (i) What is the best strategy for adapting a SAM-style model to microscopy?

Revisiting foundation models for cell instance segmentation  (2603.17845 - Archit et al., 18 Mar 2026) in Section 1, Introduction