Abstract
The optical absorption of thin-film thermal infrared detectors was calculated as a function of wavelength, pixel size, and area fill factor by use of the finite-difference time-domain (FDTD) method. The results indicate that smaller pixels absorb a significantly higher percentage of incident energy than larger pixels with the same fill factor. A polynomial approximation to the FDTD results was derived for use in system models.
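The paper's closing step, replacing expensive FDTD sweeps with a polynomial surrogate for system models, can be sketched generically. The numbers below are purely illustrative placeholders, not data from the paper, and the quadratic order and pixel-pitch range are assumptions for the sake of the example:

```python
import numpy as np

# Hypothetical FDTD-derived absorption fractions (illustrative values only,
# NOT results from the paper): absorption vs. pixel pitch at a fixed fill factor.
pixel_pitch_um = np.array([25.0, 35.0, 50.0, 75.0, 100.0])
absorption = np.array([0.82, 0.74, 0.65, 0.55, 0.48])

# Fit a low-order polynomial so a system model can evaluate absorption
# cheaply at any pixel size without re-running the FDTD simulation.
coeffs = np.polyfit(pixel_pitch_um, absorption, deg=2)
poly = np.poly1d(coeffs)

# Evaluate the surrogate at an intermediate pixel size.
print(round(float(poly(40.0)), 3))
```

The fitted polynomial captures the trend the abstract reports: absorption falls as pixel size grows at constant fill factor, and the surrogate interpolates smoothly between the simulated design points.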
Original language | English (US) |
---|---|
Pages (from-to) | 280-282 |
Number of pages | 3 |
Journal | Optics Letters |
Volume | 26 |
Issue number | 5 |
State | Published - Mar 1 2001 |
Externally published | Yes |
ASJC Scopus subject areas
- Atomic and Molecular Physics, and Optics