Computational wavefront sensing without a wavefront sensor

Computational wavefront sensing eliminates the need for dedicated wavefront sensing hardware and maintains accurate wavefront estimation—offering a flexible alternative architecture for optical metrology, adaptive optics, and alignment applications.

Wavefront sensing is a critical component of modern optical metrology, adaptive optics, and optical alignment systems. Conventional approaches such as Shack-Hartmann sensors and interferometric systems provide high-performance wavefront measurements, but often require dedicated hardware, careful alignment, and increased system complexity. As optical systems continue to demand faster measurement rates and more flexible deployment, computational approaches to wavefront sensing are becoming increasingly attractive.
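For context, a conventional Shack-Hartmann sensor infers local wavefront slopes from lenslet spot displacements and then fits modal coefficients to those slopes. A minimal least-squares modal fit on toy data might look like the following sketch (the two-mode tilt basis and all values are illustrative, not from any particular instrument):

```python
import numpy as np

def reconstruct_modal(slopes_x, slopes_y, basis_dx, basis_dy):
    """Least-squares modal fit: stack measured x/y slopes and solve for
    the modal coefficients whose analytic derivatives best match them."""
    A = np.vstack([basis_dx, basis_dy])       # (2*nsub, nmodes) design matrix
    s = np.concatenate([slopes_x, slopes_y])  # (2*nsub,) measured slopes
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)
    return coeffs

# Toy example: two modes (tilt-x, tilt-y) sampled over 4 subapertures.
# Tilt-x contributes unit x-slope and zero y-slope; vice versa for tilt-y.
basis_dx = np.array([[1, 0], [1, 0], [1, 0], [1, 0]], float)
basis_dy = np.array([[0, 1], [0, 1], [0, 1], [0, 1]], float)
true_coeffs = np.array([0.3, -0.2])
sx = basis_dx @ true_coeffs
sy = basis_dy @ true_coeffs
c = reconstruct_modal(sx, sy, basis_dx, basis_dy)
```

In a real sensor the basis would contain the analytic x/y derivatives of Zernike polynomials evaluated at each subaperture, and the slopes would come from centroiding the lenslet spots.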

Our image-space wavefront sensing approach uses machine learning (ML)-driven reconstruction from a single defocused image. We have demonstrated that accurate wavefront estimation can be achieved with standard imaging hardware, which reduces hardware complexity and enables real-time operation. Unlike conventional architectures that require dedicated wavefront sensing hardware, our method operates directly on standard imaging data, reconstructing the wavefront through computational analysis alone. This single-image approach simplifies optical integration while maintaining practical wavefront reconstruction capability.

First, our approach acquires a single defocused image of a source illuminating the optic under test. An ML-based reconstruction pipeline then estimates the wavefront directly from this image-space information and generates outputs such as Zernike coefficients, 2D and 3D wavefront maps, point spread function (PSF) analysis, modulation transfer function (MTF) estimation, and optical aberration metrics.
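Once a wavefront map has been estimated, outputs such as the PSF and MTF follow from standard Fourier optics: the PSF is the squared magnitude of the Fourier transform of the complex pupil function, and the MTF is the normalized magnitude of the PSF's Fourier transform. A minimal sketch, assuming a unit circular pupil and a few hand-coded Zernike terms (the term names, normalization, and grid size are illustrative, not the article's actual pipeline):

```python
import numpy as np

def wavefront_from_zernikes(coeffs, n=128):
    """Build a 2D wavefront map (in waves) on a unit pupil from a few
    low-order Zernike terms. `coeffs` maps term name -> amplitude in waves."""
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    pupil = r <= 1.0
    terms = {
        "defocus":   2 * r**2 - 1,                      # Z(2, 0)
        "astig_0":   r**2 * np.cos(2 * theta),          # Z(2, 2)
        "coma_x":    (3 * r**3 - 2 * r) * np.cos(theta),  # Z(3, 1)
        "spherical": 6 * r**4 - 6 * r**2 + 1,           # Z(4, 0)
    }
    w = np.zeros((n, n))
    for name, amplitude in coeffs.items():
        w += amplitude * terms[name]
    return np.where(pupil, w, 0.0), pupil

def psf_and_mtf(wavefront, pupil, pad=4):
    """Fourier-optics PSF from the complex pupil function, and the MTF as
    the normalized magnitude of the PSF's Fourier transform."""
    n = wavefront.shape[0]
    field = np.zeros((pad * n, pad * n), dtype=complex)  # zero-pad for sampling
    field[:n, :n] = pupil * np.exp(2j * np.pi * wavefront)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    otf = np.fft.fft2(np.fft.ifftshift(psf))
    mtf = np.abs(otf) / np.abs(otf).max()
    return psf, mtf

w, pupil = wavefront_from_zernikes({"defocus": 0.5, "coma_x": 0.1})
psf, mtf = psf_and_mtf(w, pupil)
```

The learned part of any such pipeline is the inverse step (defocused image to wavefront); the forward step above is deterministic physics, which is also how training data for such models is typically simulated.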

Since it operates directly from imaging data, our approach can be integrated into a wide variety of optical configurations without the need for dedicated wavefront sensing hardware (see Fig. 1).

Benchmarked against established methods

To evaluate accuracy and repeatability, we benchmarked the method against interferometric and Shack-Hartmann wavefront sensing systems.1

The comparisons demonstrated strong agreement between the reconstructed wavefronts and measurements obtained using conventional interferometric and Shack-Hartmann metrology systems. Beyond matching key aberration terms, Zernike coefficients, and overall wavefront structure, our computational image-space approach delivered stable and repeatable reconstruction performance across varying optical conditions using only standard imaging hardware.

These comparisons were repeated across multiple optical configurations and experimental setups, consistently confirming the validity and repeatability of the computational reconstruction approach (see Fig. 2). The results indicate that computational wavefront sensing can deliver practical wavefront estimation while significantly reducing hardware complexity and alignment requirements.

Our methodology has also been used operationally for several years by astronomers to support telescope alignment and optical performance evaluation under real observing conditions. This demonstrates that the approach is effective not only in laboratory environments, but also in practical field applications under atmospheric seeing (the image blurring caused by turbulence in Earth's atmosphere).

Beyond accuracy, the computational architecture also offers several practical advantages over conventional wavefront sensing methods, including reduced hardware complexity, improved robustness to vibration, high dynamic range operation, simplified system integration, and the ability to perform full-field aberration analysis in a single-shot measurement.

Real-time adaptive optics operation

One of the most significant demonstrations of our approach involved real-time adaptive optics operation at up to 2 kHz.2-4 In this implementation, the computational wavefront reconstruction engine operated directly within the adaptive optics control loop to enable high-speed wavefront estimation and correction for dynamic optical environments.
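The closed-loop behavior can be illustrated with a simple modal integrator, the classic controller in adaptive optics loops: each cycle, the sensor estimates the residual aberration and the corrector is updated by a fraction of that estimate. This sketch is generic, not the authors' implementation; the noisy pass-through "sensor" stands in for the ML reconstruction, and the gain, noise level, and mode count are illustrative:

```python
import numpy as np

def run_ao_loop(true_aberration, gain=0.5, steps=50, noise=0.01, seed=0):
    """Closed-loop integrator correction on a vector of modal (e.g., Zernike)
    coefficients. Returns the RMS residual at each loop iteration."""
    rng = np.random.default_rng(seed)
    correction = np.zeros_like(true_aberration)
    residuals = []
    for _ in range(steps):
        residual = true_aberration - correction  # aberration the optics see
        # Stand-in for the wavefront sensing step: residual plus read noise.
        measured = residual + noise * rng.standard_normal(residual.shape)
        correction += gain * measured            # integrator update
        residuals.append(np.sqrt(np.mean(residual**2)))
    return residuals

rms = run_ao_loop(np.array([0.5, -0.3, 0.2, 0.1]))
```

At a 2 kHz frame rate, the entire sensing-plus-update cycle must complete in under 500 microseconds, which is why the speed of the reconstruction step dominates the loop budget.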

These results demonstrate that computational image-space wavefront sensing is not limited to static or offline metrology applications, but can also support real-time adaptive optics operation traditionally dominated by dedicated hardware wavefront sensors. The ability to perform wavefront reconstruction at kilohertz rates while relying only on standard imaging hardware highlights the potential of computational sensing architectures for adaptive optics, astronomical instrumentation, and other high-speed optical control applications.

Integration with optical metrology workflows

Our computational wavefront sensing framework has been integrated into a range of practical optical metrology workflows through AI4Wave and SkyWave, software platforms developed by Innovations Foresight. Because the methodology relies primarily on image-space information rather than dedicated wavefront sensing hardware, it can be adapted to a wide variety of optical configurations and imaging systems.

The approach has been applied to optical alignment, wavefront metrology, adaptive optics, optical testing, telescope performance evaluation, and autocollimator-based optical setups using both laboratory and field imaging configurations. More generally, the framework can be integrated into any system where defocused imaging data is available from the optical path under test, enabling wavefront reconstruction without substantial modification of the existing optical configuration.

One practical implementation integrates AI4Wave with the Point Source Microscope (PSM) from Optical Perspectives Group to extend the system from alignment functionality into wavefront metrology and optical performance analysis.5

Toward computational optical metrology

The increasing maturity of computational wavefront reconstruction methods opens new possibilities in optical metrology, adaptive optics, and optical instrumentation. By shifting part of the sensing architecture from specialized hardware into computational image reconstruction, these approaches can reduce system complexity and expand deployment flexibility for environments where conventional wavefront sensing hardware may be difficult to integrate, sensitive to vibration, cost prohibitive, or impractical to operate.6

Unlike traditional wavefront sensing architectures that depend on dedicated optical components, computational image-space wavefront sensing can leverage existing imaging hardware and standard optical configurations to extract wavefront information directly from a defocused image. This creates opportunities for more compact, scalable, and field-deployable optical systems across research, industrial, and astronomical applications.

As optical systems continue to evolve toward faster, software-defined, and more integrated architectures, computational image-space wavefront sensing represents a promising direction for next-generation optical metrology and adaptive optics instrumentation.

REFERENCES

1. G. Baudat and J. B. Hayes, Proc. SPIE, 11490, 114900U (Aug. 21, 2020); https://doi.org/10.1117/12.2568018.
2. G. Baudat, D. Lavanchy, and G. Müller, Proc. SPIE, 13373, 133730G (2025); https://doi.org/10.1117/12.3043339.
3. G. Baudat et al., SPIE Astronomical Telescopes + Instrumentation (Jul. 6, 2026).
4. See www.innovationsforesight.com/Wavefront/SubMillisecondAI4Wave_Wavefront_Sensing_1080pHD.mp4.
5. See https://youtu.be/mIlzEisrEMc.
6. G. Baudat and R. E. Parks, Opt. Eng., 64, 4, 044101 (Apr. 2025); https://doi.org/10.1117/1.oe.64.4.044101.

About the Authors

Fatiha Anouar

Dr. Fatiha Anouar is cofounder and chief AI officer at Innovations Foresight LLC, where she leads the development and commercialization of computational optics and physics-informed AI technologies for optical metrology, wavefront sensing, and adaptive optics applications. Her work focuses on translating ML-based wavefront reconstruction and computational imaging methods into deployable engineering solutions for scientific, industrial, and optical manufacturing environments.

Gaston Baudat

Dr. Gaston Baudat is cofounder of Innovations Foresight LLC and an adjunct professor at the Wyant College of Optical Sciences at the University of Arizona. His work focuses on computational image-space wavefront sensing, optical metrology, adaptive optics, and ML-based optical measurement methods, including the development of the patented computational wavefront reconstruction approach. He has contributed to multiple SPIE publications about wavefront reconstruction and high-speed adaptive optics systems for scientific and industrial applications.
