NVThermIP modeling of super-resolution algorithms

Eddie Jacobs, Ronald G. Driggers, Susan Young, Keith Krapels, Gene Tener, Jennifer Park

Research output: Contribution to journal › Conference article › peer-review

9 Scopus citations

Abstract

Undersampled imager performance enhancement has been demonstrated using super-resolution reconstruction techniques. In these techniques, the optical flow of the scene or the relative sub-pixel shift between frames is calculated, and a high-resolution grid is populated with spatial data based on scene motion. Increases in performance have been demonstrated for observers viewing static images obtained from super-resolving a sequence of frames in a dynamic scene and for dynamic framing sensors. In this paper, we provide explicit guidance on how to model super-resolution reconstruction algorithms within existing thermal analysis models such as NVThermIP. The guidance in this paper will be restricted to static target/background scenarios. Background is given on the interaction of sensitivity and resolution in the context of a super-resolution process and how to relate these characteristics to parameters within the model. We then show results from representative algorithms modeled with NVThermIP. General guidelines for analyzing the effects of super-resolution in models are then presented.
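
The registration-and-grid-population step described in the abstract can be illustrated with a short sketch. The example below is not the algorithm evaluated in the paper; it is a minimal shift-and-add reconstruction, assuming phase correlation with a parabolic peak fit for sub-pixel registration and nearest-bin accumulation onto a finer grid. The function names (`estimate_shift`, `shift_and_add`) and parameters are illustrative only.

```python
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the (dy, dx) sub-pixel shift of `frame` relative to `ref`
    via phase correlation, refined with a parabolic fit around the peak.
    (Illustrative only; optical-flow registration is another option.)"""
    cross = np.fft.fft2(frame) * np.conj(np.fft.fft2(ref))
    corr = np.abs(np.fft.ifft2(cross / (np.abs(cross) + 1e-12)))
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    shift = peak.astype(float)
    for axis in (0, 1):  # parabolic sub-pixel refinement along each axis
        n = corr.shape[axis]
        lo, hi = peak.copy(), peak.copy()
        lo[axis] = (peak[axis] - 1) % n
        hi[axis] = (peak[axis] + 1) % n
        c_m, c_0, c_p = corr[tuple(lo)], corr[tuple(peak)], corr[tuple(hi)]
        denom = c_m - 2.0 * c_0 + c_p
        if denom != 0.0:
            shift[axis] += 0.5 * (c_m - c_p) / denom
    dims = np.array(corr.shape, dtype=float)
    shift[shift > dims / 2] -= dims[shift > dims / 2]  # wrap to signed shifts
    return shift  # (dy, dx) in low-resolution pixels

def shift_and_add(frames, scale=2):
    """Populate a `scale`-times finer grid with samples from each frame,
    offset by that frame's estimated shift, then average per bin."""
    ref = frames[0]
    h, w = ref.shape
    hi_sum = np.zeros((h * scale, w * scale))
    hi_cnt = np.zeros_like(hi_sum)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame in frames:
        dy, dx = estimate_shift(ref, frame)
        # Map each low-resolution sample back into the reference coordinates
        hy = np.round((ys - dy) * scale).astype(int)
        hx = np.round((xs - dx) * scale).astype(int)
        ok = (hy >= 0) & (hy < h * scale) & (hx >= 0) & (hx < w * scale)
        np.add.at(hi_sum, (hy[ok], hx[ok]), frame[ok])
        np.add.at(hi_cnt, (hy[ok], hx[ok]), 1)
    filled = hi_cnt > 0
    hi_sum[filled] /= hi_cnt[filled]
    return hi_sum  # empty bins would normally be filled by interpolation
```

Averaging many frames into each high-resolution bin also reduces temporal noise (roughly as the square root of the number of contributing samples, for uncorrelated noise), which is the sensitivity side of the sensitivity/resolution interaction the abstract mentions.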

Original language: English (US)
Article number: 19
Pages (from-to): 125-135
Number of pages: 11
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 5784
DOIs
State: Published - 2005
Externally published: Yes
Event: Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XVI - Orlando, FL, United States
Duration: Mar 30, 2005 - Apr 1, 2005

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
