
EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion

Published 29 Sep 2023 in hep-ph and cs.LG (arXiv:2310.00049v1)

Abstract: Jets at the LHC, typically consisting of a large number of highly correlated particles, are a fascinating laboratory for deep generative modeling. In this paper, we present two novel methods that generate LHC jets as point clouds efficiently and accurately. We introduce EPiC-JeDi, which combines score-matching diffusion models with the Equivariant Point Cloud (EPiC) architecture based on the deep sets framework. This model offers a much faster alternative to previous transformer-based diffusion models without reducing the quality of the generated jets. In addition, we introduce EPiC-FM, the first permutation-equivariant continuous normalizing flow (CNF) for particle cloud generation. This model is trained with flow-matching, a scalable and easy-to-train objective based on optimal transport that directly regresses the vector fields connecting the Gaussian noise prior to the data distribution. Our experiments demonstrate that EPiC-JeDi and EPiC-FM both achieve state-of-the-art performance on the top-quark JetNet datasets whilst maintaining fast generation speed. Most notably, we find that EPiC-FM consistently outperforms all the other generative models considered here across every metric. Finally, we introduce two new particle cloud performance metrics: the first based on the Kullback-Leibler divergence between feature distributions, the second on the negative log-posterior of a multi-model ParticleNet classifier.
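The flow-matching objective described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: `toy_vector_field` is a hypothetical linear model standing in for the actual EPiC network, and the particle-cloud data are replaced by generic points. The sketch assumes the standard optimal-transport (linear) interpolation path from the flow-matching literature, for which the regression target is the constant velocity connecting a noise sample to a data sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_vector_field(params, xt, t):
    # Hypothetical linear model in place of the EPiC network:
    # predicts the velocity v(x_t, t) from the point and the time.
    W, b = params
    inp = np.concatenate([xt, t], axis=-1)
    return inp @ W + b

def flow_matching_loss(params, x1):
    """Conditional flow matching with the linear (OT) probability path:
    x_t = (1 - t) * x0 + t * x1, regression target u_t = x1 - x0."""
    x0 = rng.standard_normal(x1.shape)   # sample from the Gaussian noise prior
    t = rng.random((x1.shape[0], 1))     # t ~ Uniform(0, 1), one per sample
    xt = (1 - t) * x0 + t * x1           # point on the interpolating path
    target = x1 - x0                     # constant velocity of the linear path
    pred = toy_vector_field(params, xt, t)
    return np.mean((pred - target) ** 2) # mean squared regression error

dim = 3                                  # toy feature dimension per particle
params = (rng.standard_normal((dim + 1, dim)) * 0.1, np.zeros(dim))
x1 = rng.standard_normal((128, dim))     # a batch standing in for real jet data
loss = flow_matching_loss(params, x1)
```

Training would minimize this loss over batches by gradient descent on the network parameters; generation then integrates the learned ODE from Gaussian noise to the data distribution, which is what makes the objective simulation-free and easy to train compared with maximum-likelihood CNF training.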
