Document Type

Conference Paper

Publication Date


Publication Source

Proceedings of SPIE (SPIE Defense + Commercial Sensing, 2020)


Abstract

In long-range imaging regimes, atmospheric turbulence degrades image quality. In addition to blurring, the turbulence causes geometric distortion effects that introduce apparent motion in acquired video. This is problematic for image processing tasks, including image enhancement and restoration (e.g., super-resolution) and aided target recognition (e.g., vehicle trackers). To mitigate these warping effects from turbulence, it is necessary to distinguish between actual in-scene motion and apparent motion caused by atmospheric turbulence. Previously, the current authors generated a synthetic video by injecting moving objects into a static scene and then applying a well-validated anisoplanatic atmospheric optical turbulence simulator. With known per-pixel truth of all moving objects, a per-pixel Gaussian mixture model (GMM) was developed as a baseline technique. In this paper, the baseline technique has been modified to improve performance while decreasing computational complexity. Additionally, the technique is extended to patches such that spatial correlations are captured, which results in further performance improvement.
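To make the baseline concrete, the following is a minimal NumPy sketch of per-pixel Gaussian mixture background modeling in the spirit of Stauffer-Grimson adaptive background subtraction. It is an illustration only: the class name, parameter values, and the simple low-weight foreground rule are generic assumptions, not the specific model or the patch-based extension described in the paper.

```python
import numpy as np

class PixelGMM:
    """Per-pixel Gaussian mixture background model (illustrative sketch).

    Each pixel keeps k Gaussian components; established high-weight
    components are treated as background, so persistent apparent motion
    (e.g., turbulence jitter) is absorbed while new intensities are
    flagged as foreground. Hypothetical parameters, not the paper's.
    """

    def __init__(self, shape, k=3, lr=0.05, var_init=225.0,
                 match_sigma=2.5, bg_weight=0.25):
        h, w = shape
        self.k, self.lr = k, lr
        self.match_sigma, self.bg_weight = match_sigma, bg_weight
        self.var_init = var_init
        self.w_ = np.full((h, w, k), 1.0 / k)                # mixture weights
        self.mu = np.random.uniform(0.0, 255.0, (h, w, k))   # component means
        self.var = np.full((h, w, k), var_init)              # component variances

    def update(self, frame):
        """Absorb one grayscale frame; return a boolean foreground mask."""
        x = frame.astype(float)[..., None]                   # (h, w, 1)
        d2 = (x - self.mu) ** 2
        matched = d2 < (self.match_sigma ** 2) * self.var    # within match radius
        # The highest-weight matching component absorbs the pixel.
        best = np.where(matched, self.w_, -1.0).argmax(axis=-1)
        has_match = matched.any(axis=-1)
        onehot = np.eye(self.k, dtype=bool)[best] & has_match[..., None]
        # Exponential-forgetting updates of weights, means, and variances.
        self.w_ = (1.0 - self.lr) * self.w_ + self.lr * onehot
        self.mu = np.where(onehot, (1.0 - self.lr) * self.mu + self.lr * x, self.mu)
        self.var = np.where(onehot, (1.0 - self.lr) * self.var + self.lr * d2, self.var)
        # Unmatched pixels: re-seed the weakest component as a wide Gaussian at x.
        repl = np.eye(self.k, dtype=bool)[self.w_.argmin(axis=-1)] & ~has_match[..., None]
        self.mu = np.where(repl, x, self.mu)
        self.var = np.where(repl, self.var_init, self.var)
        self.w_ /= self.w_.sum(axis=-1, keepdims=True)
        # Foreground: no component matched, or the match is a low-weight
        # (transient) component rather than an established background mode.
        best_w = np.take_along_axis(self.w_, best[..., None], axis=-1)[..., 0]
        return ~(has_match & (best_w > self.bg_weight))
```

The paper's patch-based extension would replace the scalar per-pixel intensity with a small patch feature so that spatial correlations influence the mixture; that step is not shown here.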



Document Version

Published Version


The document is provided in compliance with the publisher's policy on self-archiving. Permission documentation is on file. To view the paper on the publisher's website, use the DOI:




Keywords

Anisoplanatic imaging, Atmospheric turbulence simulator, Long range imaging, Motion segmentation, Patch-based Gaussian mixture model, Turbulence mitigation, University of Dayton Electro-optics and Photonics