Deep residual learning for low-order wavefront sensing in high-contrast imaging systems

Gregory Allan, Iksung Kang, Ewan S. Douglas, George Barbastathis, Kerri Cahoy

Research output: Contribution to journal › Article › peer-review

15 Scopus citations


Sensing and correction of low-order wavefront aberrations is critical for high-contrast astronomical imaging. State-of-the-art coronagraph systems typically use image-based sensing methods that exploit the rejected on-axis light, such as Lyot-based low-order wavefront sensors (LLOWFS); these methods rely on linear least-squares fitting to recover Zernike basis coefficients from intensity data. However, the dynamic range of linear recovery is limited. We propose the use of deep neural networks with residual learning techniques for non-linear wavefront sensing. The deep residual learning approach extends the usable range of the LLOWFS sensor by more than an order of magnitude compared to the conventional methods, and can improve closed-loop control of systems with large initial wavefront error. We demonstrate that the deep learning approach performs well even in low-photon regimes common to coronagraphic imaging of exoplanets.
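The conventional recovery step the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' implementation: the interaction (calibration) matrix `G`, the mode count, and the coefficient values are all synthetic placeholders, and the linearised sensor model `dI ≈ G @ a` is only the small-aberration approximation whose limited dynamic range motivates the paper's deep-learning approach.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: flattened sensor pixels and low-order Zernike modes.
n_pix, n_modes = 256, 5

# Synthetic calibration (interaction) matrix: column k is the sensor's
# intensity response to a unit poke of Zernike mode k.
G = rng.standard_normal((n_pix, n_modes))

# Example "true" low-order mode coefficients (arbitrary units).
a_true = np.array([0.3, -0.1, 0.05, 0.2, -0.25])

# Linearised sensor model: the intensity change is approximately linear in
# the coefficients only for small aberrations.
dI = G @ a_true

# Linear least-squares recovery of the Zernike coefficients from intensity data.
a_hat, *_ = np.linalg.lstsq(G, dI, rcond=None)
print(np.allclose(a_hat, a_true))
```

In this noiseless, perfectly linear toy setting the recovery is exact; the paper's point is that for large wavefront errors the true intensity response is non-linear in the coefficients, so this fit degrades, and a residual network can extend the usable range.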

Original language: English (US)
Pages (from-to): 26267-26283
Number of pages: 17
Journal: Optics Express
Issue number: 18
State: Published - Aug 31 2020

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
