A physics-informed deep learning liquid crystal camera with data-driven diffractive guidance

Published in: Communications Engineering, vol. 3, no. 1 (Dec 2024), p. 46
Main author: Shi, Jiashuo
Other authors: Liu, Taige; Zhou, Liang; Yan, Pei; Wang, Zhe; Zhang, Xinyu
Published by: Springer Nature B.V.
Description
Abstract: Whether in the realms of computer vision, robotics, or environmental monitoring, the ability to monitor and follow specific targets amidst intricate surroundings is essential for numerous applications. However, achieving rapid and efficient target tracking remains a challenge. Here we propose an optical implementation for rapid tracking with negligible digital post-processing, leveraging all-optical information processing. This work combines a diffractive optical neural network with a layered liquid crystal electrical addressing architecture, synergizing the parallel processing capabilities inherent in light propagation with the dynamic adaptation mechanism of liquid crystals. Through a one-time training effort, the trained network enables accurate prediction of the desired arrangement of liquid crystal molecules, as confirmed through numerical blind testing. We then establish an experimental camera architecture that synergistically combines an electrically tuned functional liquid crystal layer with a materialized optical neural network. By integrating this architecture into the optical imaging path of a detector plane, the optical computing camera offers data-driven diffractive guidance, enabling the identification of targets within complex backgrounds and highlighting its capability for high-level vision tasks and problem solving.

Jiashuo Shi and colleagues build an integrated camera capable of tracking objects of interest. They use optical computing to arrange molecules in the liquid crystal mask for enhanced distinction between the object and background.
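The abstract describes light propagating through a stack of diffractive (phase-modulating) layers that performs the computation optically. As a rough illustration of that diffractive-computing idea only, and not the authors' implementation, the Python sketch below simulates a scalar field passing through phase-only layers using the angular-spectrum propagation method. All function names (angular_spectrum_propagate, diffractive_forward) and parameter values (wavelength, pixel pitch, layer spacing) are illustrative assumptions, and the random phase masks stand in for the trained, liquid-crystal-realized masks of the paper.

    import numpy as np

    def angular_spectrum_propagate(field, wavelength, pitch, distance):
        """Propagate a complex scalar field by `distance` (angular-spectrum method)."""
        n = field.shape[0]
        fx = np.fft.fftfreq(n, d=pitch)            # spatial frequencies (1/m)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 / wavelength**2 - FX**2 - FY**2  # squared longitudinal frequency
        kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * distance) * (arg > 0)  # drop evanescent components
        return np.fft.ifft2(np.fft.fft2(field) * H)

    def diffractive_forward(field, phase_masks, wavelength, pitch, spacing):
        """Pass a field through a stack of phase-only diffractive layers."""
        for phase in phase_masks:
            field = angular_spectrum_propagate(field, wavelength, pitch, spacing)
            field = field * np.exp(1j * phase)     # phase-only modulation per layer
        field = angular_spectrum_propagate(field, wavelength, pitch, spacing)
        return np.abs(field)**2                    # a detector records intensity

    # Example: two random phase layers acting on a square aperture.
    n = 256
    wavelength, pitch, spacing = 532e-9, 8e-6, 0.05   # illustrative values only
    aperture = np.zeros((n, n)); aperture[96:160, 96:160] = 1.0
    masks = [np.random.uniform(0, 2 * np.pi, (n, n)) for _ in range(2)]
    intensity = diffractive_forward(aperture.astype(complex), masks,
                                    wavelength, pitch, spacing)

In the paper's setting, the per-layer phase profiles would be learned in simulation and then realized physically by the electrically addressed liquid crystal layers; here they are placeholders meant only to show where trainable parameters enter the forward model.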
ISSN: 2731-3395
DOI: 10.1038/s44172-024-00191-7
Source: Science Database