Boosting the deep learning wavefront sensor for real-time applications [Invited]

Esteban Vera, Felipe Guzmán, Camilo Weinberger

Research output: Contribution to journal › Article › peer-review



The deep learning wavefront sensor (DLWFS) allows the direct estimation of the Zernike coefficients of aberrated wavefronts from intensity images. The main drawback of this approach is its reliance on massive convolutional neural networks (CNNs) that are slow to train and to evaluate. In this paper, we explore several options to reduce both the training and estimation time. First, we develop a CNN that can be rapidly trained without compromising accuracy. Second, we explore the effects of smaller input image sizes and of varying the number of Zernike modes to be estimated. Our simulation results demonstrate that the proposed network, using images of either 8×8, 16×16, or 32×32 pixels, dramatically reduces training time and can even boost the estimation accuracy of the Zernike coefficients. Our experimental results confirm that a 16×16 DLWFS can be quickly trained and is able to estimate the first 12 Zernike coefficients (skipping piston, tip, and tilt) without sacrificing accuracy, while significantly speeding up the prediction time to facilitate low-cost, real-time adaptive optics systems.
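As a rough illustration of what the estimated quantities represent (not the authors' code), the sketch below reconstructs a wavefront phase value as a weighted sum of a few low-order, Noll-indexed Zernike modes; the mode normalizations follow the standard Noll convention, and the specific modes and coefficient values chosen here are purely illustrative assumptions.

```python
import math

# A handful of low-order Zernike polynomials in Noll indexing, skipping
# piston, tip, and tilt as the paper does. Evaluated in polar coordinates
# over the unit pupil (r in [0, 1], theta in radians).
def zernike(j, r, t):
    if j == 4:
        return math.sqrt(3) * (2 * r**2 - 1)                   # defocus
    if j == 5:
        return math.sqrt(6) * r**2 * math.sin(2 * t)           # oblique astigmatism
    if j == 6:
        return math.sqrt(6) * r**2 * math.cos(2 * t)           # vertical astigmatism
    if j == 7:
        return math.sqrt(8) * (3 * r**3 - 2 * r) * math.sin(t)  # vertical coma
    if j == 8:
        return math.sqrt(8) * (3 * r**3 - 2 * r) * math.cos(t)  # horizontal coma
    raise ValueError("mode not implemented in this sketch")

def phase(coeffs, r, t):
    """Wavefront phase at (r, theta): weighted sum of Zernike modes."""
    return sum(a * zernike(j, r, t) for j, a in coeffs.items())

# Hypothetical aberration: 0.5 rad of defocus plus 0.2 rad of horizontal coma.
coeffs = {4: 0.5, 8: 0.2}
print(round(phase(coeffs, 0.0, 0.0), 4))  # at the pupil center only defocus contributes
```

A DLWFS-style network would invert this mapping: given an intensity image produced by such an aberrated wavefront, it regresses the coefficient vector (here `coeffs`) directly.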

Original language: English
Pages (from-to): B119-B124
Journal: Applied Optics
Issue number: 10
State: Published - 1 Apr 2021


