VINNA for Neonates -- Orientation Independence through Latent Augmentations

Published 29 Nov 2023 in cs.CV (arXiv:2311.17546v1)

Abstract: Fast and accurate segmentation of neonatal brain images is highly desired to better understand and detect changes during development and disease. Yet, the limited availability of ground-truth datasets, the lack of standardized acquisition protocols, and wide variations in head positioning pose challenges for method development. A few automated image analysis pipelines exist for newborn brain MRI segmentation, but they often rely on time-consuming procedures and require resampling to a common resolution, which risks loss of information through interpolation and down-sampling. Without registration and image resampling, variations in head position and voxel resolution must be addressed differently. In deep learning, external augmentations are traditionally used to artificially expand the representation of spatial variability, increasing training dataset size and robustness. However, these transformations in image space still require resampling, reducing accuracy specifically in the context of label interpolation. We recently introduced the concept of resolution-independence with the Voxel-size Independent Neural Network framework, VINN. Here, we extend this concept by additionally shifting all rigid transforms into the network architecture with a four-degree-of-freedom (4-DOF) transform module, enabling resolution-aware internal augmentations (VINNA). In this work, we show that VINNA (i) significantly outperforms state-of-the-art external augmentation approaches, (ii) effectively addresses the head variations present specifically in newborn datasets, and (iii) retains high segmentation accuracy across a range of resolutions (0.5-1.0 mm). The 4-DOF transform module is a powerful, general approach to implementing spatial augmentation without requiring image or label interpolation. The specific network application to newborns will be made publicly available as VINNA4neonates.
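The core idea of internal augmentation can be illustrated with a short sketch: a rigid-plus-scale warp is applied to continuous feature maps inside the network rather than to images and hard labels, so no label interpolation ever occurs. The NumPy sketch below assumes the four degrees of freedom are in-plane rotation, two translations, and isotropic scale; the paper's exact parameterization and module placement may differ, and all names here are illustrative.

```python
import numpy as np

def affine_4dof(angle_deg, tx, ty, scale):
    """Build a 2x3 affine matrix from 4 degrees of freedom:
    in-plane rotation, x/y translation, and isotropic scale."""
    a = np.deg2rad(angle_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[scale * c, -scale * s, tx],
                     [scale * s,  scale * c, ty]])

def warp_features(fmap, matrix):
    """Bilinearly resample a (C, H, W) feature map with an inverse-mapped
    affine transform about the image center. Because the warp acts on
    continuous latent features, discrete labels are never interpolated."""
    C, H, W = fmap.shape
    cy, cx = (H - 1) / 2.0, (W - 1) / 2.0
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # Output coordinates, centered on the image midpoint.
    coords = np.stack([xs - cx, ys - cy], axis=0).reshape(2, -1).astype(float)
    A, t = matrix[:, :2], matrix[:, 2:]
    # Inverse warp: find the source location for each output pixel.
    src = np.linalg.inv(A) @ (coords - t)
    sx, sy = src[0] + cx, src[1] + cy
    x0, y0 = np.floor(sx).astype(int), np.floor(sy).astype(int)
    wx, wy = sx - x0, sy - y0
    out = np.zeros_like(fmap)
    for dy in (0, 1):          # bilinear interpolation over the 4 neighbors
        for dx in (0, 1):
            xi = np.clip(x0 + dx, 0, W - 1)
            yi = np.clip(y0 + dy, 0, H - 1)
            w = (wx if dx else 1 - wx) * (wy if dy else 1 - wy)
            out += fmap[:, yi, xi].reshape(C, H, W) * w.reshape(1, H, W)
    return out
```

In a VINNA-style setup, such a warp would be sampled randomly per training example and applied to an internal feature layer; the segmentation loss is then computed on unwarped (or consistently warped) soft network outputs, so ground-truth label maps never pass through an interpolation step.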
