MegaStitch: Robust Large-Scale Image Stitching

Ariyan Zarei, Emmanuel Gonzalez, Nirav Merchant, Duke Pauli, Eric Lyons, Kobus Barnard

Research output: Contribution to journal › Article › peer-review

Abstract

We address fast image stitching for large image collections while remaining robust to the drift caused by chaining transformations and to minimal overlap between images. We focus on scientific applications where ground-truth accuracy is far more important than visual appearance or projection error, which can be misleading. For common large-scale image stitching use cases, transformations between images are often restricted to similarity or translation. When homography is used in these cases, the odds of being trapped in a poor local minimum and producing unnatural results increase. Thus, for transformations up to affine, we cast stitching as minimizing reprojection error globally using linear least-squares with a few simple constraints. For homography, we observe that the global affine solution provides better initialization for bundle adjustment, at lower computational cost, than an alternative that initializes with a homography-based scaffolding. We evaluate our methods on a very large translation dataset with limited overlap as well as four drone datasets. We show that our approach outperforms alternative methods such as MGRAPH in computational cost, scaling to large numbers of images, and robustness to drift. We also contribute ground-truth datasets for this endeavor.
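The core idea in the abstract, solving for all per-image affine transforms at once by minimizing reprojection error as a linear least-squares problem with an anchoring constraint, can be sketched as follows. This is a minimal illustration of global least-squares alignment, not the paper's implementation; the function name, the match format, and the choice of a heavily weighted identity constraint on an anchor image are assumptions for the example.

```python
import numpy as np

def stitch_affine(n_images, matches, anchor=0):
    """Globally estimate one 2x3 affine transform per image by linear least squares.

    matches: list of (i, j, p, q) where point p in image i and point q in
    image j depict the same scene point. We minimize sum ||A_i p - A_j q||^2
    over all matches, with the anchor image softly pinned to the identity so
    the solution is not trivially zero. Solving all images jointly avoids the
    drift that accumulates when transformations are chained pairwise.
    """
    # 6 unknowns per image: [a, b, tx, c, d, ty] so that
    # A(x, y) = (a*x + b*y + tx, c*x + d*y + ty)
    rows, rhs = [], []

    for i, j, (px, py), (qx, qy) in matches:
        # x-coordinate agreement: (a_i*px + b_i*py + tx_i) - (a_j*qx + b_j*qy + tx_j) = 0
        rx = np.zeros(6 * n_images)
        rx[6 * i: 6 * i + 3] = [px, py, 1.0]
        rx[6 * j: 6 * j + 3] = [-qx, -qy, -1.0]
        rows.append(rx)
        rhs.append(0.0)
        # y-coordinate agreement, same structure on the second row of each affine
        ry = np.zeros(6 * n_images)
        ry[6 * i + 3: 6 * i + 6] = [px, py, 1.0]
        ry[6 * j + 3: 6 * j + 6] = [-qx, -qy, -1.0]
        rows.append(ry)
        rhs.append(0.0)

    # Soft constraint: pin the anchor image to the identity transform
    # (heavily weighted equations rather than hard constraints, for simplicity)
    w = 1e6
    for k, t in enumerate([1.0, 0.0, 0.0, 0.0, 1.0, 0.0]):
        r = np.zeros(6 * n_images)
        r[6 * anchor + k] = w
        rows.append(r)
        rhs.append(w * t)

    sol, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return sol.reshape(n_images, 2, 3)  # one 2x3 affine matrix per image
```

Because every equation is linear in the unknown affine parameters, the whole collection is solved in one pass, which is what makes the affine case fast and drift-free compared with chaining, and why its solution is a good initializer for homography bundle adjustment.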

Original language: English (US)
Article number: 4408309
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 60
State: Published - 2022
Externally published: Yes

Keywords

  • Bundle adjustment
  • image geocorrection
  • image stitching
  • linear least squares
  • remote sensing

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Earth and Planetary Sciences(all)
