
Lensless speckle reconstructive spectrometer via physics-aware neural network

Published 24 Dec 2024 in physics.optics and physics.app-ph (arXiv:2412.18238v1)

Abstract: Speckle fields produced by disordered media are widely exploited for spectral measurements. Existing speckle-based reconstructive spectrometers (RSs) implemented with neural networks rely primarily on supervised learning, which requires large-scale spectrum-speckle pairs. However, beyond the system-stability demands of prolonged data collection, generating diverse high-resolution spectra and labeling them finely is particularly difficult, and a lack of dataset variety hinders the generalization of neural networks to new spectrum types. Here we sidestep this limitation with PhyspeNet, an untrained spectrum-reconstruction framework that combines a convolutional neural network (CNN) with a physical model of a chaotic optical cavity. Without pre-training or prior knowledge of the spectrum under test, PhyspeNet requires only a single captured speckle for a variety of multi-wavelength reconstruction tasks. Experimentally, we demonstrate a lens-free, snapshot RS system by leveraging the one-to-many mapping between the spatial and spectral domains of a random medium. Dual-wavelength peaks separated by 2 pm can be distinguished, and a maximum working bandwidth of 40 nm is achieved with high measurement accuracy. This approach establishes a new paradigm for neural-network-based RS systems: it eliminates reliance on datasets entirely while keeping the computational results highly generalizable and physically explainable.
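The core idea — fitting an untrained network against a physical forward model using only the single captured speckle, rather than training on spectrum-speckle pairs — can be sketched in a deep-image-prior-style loop. Everything below is a hypothetical stand-in, not the paper's implementation: a random Gaussian matrix `T` plays the role of the chaotic-cavity forward model, and a single linear layer with a softplus output plays the role of the CNN.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in forward model T: maps an n_wl-bin spectrum to an n_px-pixel
# speckle. (In the paper this role is played by a physical model of a
# chaotic optical cavity; a Gaussian matrix is used here for illustration.)
n_wl, n_px = 32, 256
T = rng.standard_normal((n_px, n_wl))

# Ground-truth dual-peak spectrum and the single "measured" speckle.
s_true = np.zeros(n_wl)
s_true[[10, 21]] = 1.0
y_meas = T @ s_true

# Untrained "network": one linear layer applied to a fixed random latent
# code z, passed through softplus so the spectrum stays non-negative.
z = rng.standard_normal(8)
W = 0.01 * rng.standard_normal((n_wl, 8))

def softplus(u):
    return np.log1p(np.exp(u))

lr = 5e-5
for _ in range(8000):
    u = W @ z
    s = softplus(u)          # current spectrum estimate
    r = T @ s - y_meas       # residual against the one captured speckle
    # Manual backprop of 0.5*||r||^2: dL/du = (T^T r) * sigmoid(u)
    grad_u = (T.T @ r) / (1.0 + np.exp(-u))
    W -= lr * np.outer(grad_u, z)

s_hat = softplus(W @ z)
peaks = sorted(np.argsort(s_hat)[-2:].tolist())
print(peaks)  # indices of the two strongest reconstructed bins
```

Because the loss measures only consistency with the single measured speckle, no labeled dataset is needed; the physical forward model supplies all the supervision, which is what makes the result generalize to spectrum types never seen during any training phase.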
