Re-Interpreting the Step-Response Probability Curve to Extract Fundamental Physical Parameters of Event-based Vision Sensors

Published 11 Apr 2024 in eess.IV (arXiv:2404.07656v1)

Abstract: Biologically inspired event-based vision sensors (EVS) are growing in popularity due to performance benefits including ultra-low power consumption, high dynamic range, data sparsity, and fast temporal response. They efficiently encode dynamic information from a visual scene through pixels that respond autonomously and asynchronously when the per-pixel illumination level changes by a user-selectable contrast threshold ratio, $\theta$. Due to their unique sensing paradigm and complex analog pixel circuitry, characterizing EVS is non-trivial. The step-response probability curve (S-curve) is a key measurement technique that has emerged as the standard for measuring $\theta$. In this work, we detail the method for generating accurate S-curves by applying an appropriate stimulus and sensor configuration to decouple 2nd-order effects from the parameter being studied. We use an EVS pixel simulation to demonstrate how noise and other physical constraints can lead to error in the measurement, and develop two techniques that are robust enough to obtain accurate estimates. We then apply best practices derived from our simulation to generate S-curves for the latest-generation Sony IMX636 and interpret the resulting family of curves to correct the apparent anomalous result of previous reports suggesting that $\theta$ changes with illumination. Further, we demonstrate that with correct interpretation, fundamental physical parameters such as dark current and RMS noise can be accurately inferred from a collection of S-curves, leading to more accurate parameterization for high-fidelity EVS simulations.
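The S-curve idea described in the abstract can be sketched with a small Monte Carlo simulation: a pixel fires when the log-illumination step exceeds its contrast threshold $\theta$, but additive noise on the signal smears the ideal step response into a sigmoid whose width reflects the RMS noise. This is a minimal illustrative sketch, not the authors' measurement procedure; the function name, the Gaussian noise model, and the contrast units are assumptions for illustration.

```python
import random

def s_curve_probability(step_contrast, theta, noise_rms, n_trials=20000):
    """Monte Carlo estimate of the probability that a simulated EVS pixel
    emits an event for a log-illumination step of size `step_contrast`.

    Assumes a simple model: the pixel compares the step (plus zero-mean
    Gaussian noise of standard deviation `noise_rms`, in the same
    log-contrast units) against the threshold `theta`.
    """
    events = 0
    for _ in range(n_trials):
        perceived = step_contrast + random.gauss(0.0, noise_rms)
        if perceived >= theta:
            events += 1
    return events / n_trials
```

Under this model, sweeping `step_contrast` traces out a cumulative-Gaussian S-curve: the 50% crossing point estimates $\theta$, and the transition width estimates the RMS noise, which is one way a family of S-curves can be read to infer physical parameters such as noise, as the paper discusses.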
