Single-Shot Wavefront Sensing With Deep Neural Networks For Free-Space Optical Communications

Optics Express (2021)

Abstract
Applying deep neural networks to image-based wavefront sensing allows for non-iterative regression of the aberrated phase in real time. Because the mapping from phase to intensity is nonlinear, it is common to use two focal-plane images in the manner of phase diversity, while algorithms based on a single focal-plane image generally yield less accurate estimates. In this paper, we demonstrate that the wavefront can be retrieved with high accuracy from a single image of the pupil-plane intensity pattern. In the context of free-space optical communications (FSOC), a compact dataset containing considerable low-order aberrations is generated to train an EfficientNet, which learns to regress the Zernike polynomial coefficients from the intensity frame. ResNet-50 and Inception-V3 are also tested on the same task; both are outperformed by EfficientNet by a large margin. To validate the proposed method, the models are fine-tuned and tested with experimental data collected on an adaptive optics platform. (C) 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
Key words
deep neural networks, single-shot, free-space
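To illustrate the regression target described in the abstract, the following is a minimal sketch of how a training sample might be generated: a phase map is built from low-order Zernike coefficients over a circular pupil, and an intensity frame is derived from the aberrated field. This is not the paper's dataset pipeline; the mode definitions are simplified (unnormalized), and the intensity here is a far-field pattern computed by FFT rather than the pupil-plane intensity produced by atmospheric propagation in the actual FSOC setting.

```python
import numpy as np

def zernike_phase(coeffs, n=64):
    """Phase map (radians) over a unit circular pupil from five low-order
    Zernike-like modes: [tip, tilt, defocus, astig-0, astig-45].
    Simplified, unnormalized polynomial forms for illustration only."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r2 = x**2 + y**2
    pupil = (r2 <= 1.0).astype(float)          # circular aperture mask
    modes = np.stack([
        x,              # tip
        y,              # tilt
        2 * r2 - 1,     # defocus
        x**2 - y**2,    # astigmatism, 0 deg
        2 * x * y,      # astigmatism, 45 deg
    ])
    phase = pupil * np.tensordot(coeffs, modes, axes=1)
    return phase, pupil

def intensity_frame(coeffs, n=64):
    """Far-field intensity |FFT(field)|^2 of the aberrated pupil field,
    standing in for the sensor image fed to the CNN regressor."""
    phase, pupil = zernike_phase(coeffs, n)
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.max()                     # normalize to [0, 1]

rng = np.random.default_rng(0)
coeffs = rng.uniform(-1.0, 1.0, 5)   # ground-truth regression target
frame = intensity_frame(coeffs)      # network input (one training pair)
```

A network such as EfficientNet would then be trained to map `frame` back to `coeffs`, e.g. with a mean-squared-error loss over the coefficient vector.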