Stationary probability model for microscopic parallelism in JPEG2000

Francesc Auli-Llinas, Michael W. Marcellin

Research output: Contribution to journal › Article › peer-review


Abstract

Parallel processing is key to augmenting the throughput of image codecs. Despite numerous efforts to parallelize wavelet-based image coding systems, most attempts fail to parallelize the bitplane coding engine, which is the most computationally intensive stage of the coding pipeline. The main reason for this failure is the causal nature of current coding strategies, which assume that coefficients are coded one after another. This work analyzes the mechanisms employed in bitplane coding and proposes alternatives that enhance opportunities for parallelism. We describe a stationary probability model that, without sacrificing the advantages of current approaches, removes the main obstacle to the parallelization of most coding strategies. Experimental tests evaluate the coding performance achieved by the proposed method in the framework of JPEG2000 when coding different types of images. Results indicate that the stationary probability model achieves similar coding performance, with slight gains or losses depending on the image type and the desired level of parallelism.
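The abstract contrasts adaptive, causally updated probability estimation, in which each coded symbol modifies the estimates used for the next one, with a stationary model whose probabilities do not depend on coding order. The sketch below is only a rough, hypothetical illustration of that distinction, not the authors' method or JPEG2000's actual context formation: the toy contexts, the probability values, and the `adaptive_pass`/`stationary_pass` functions are invented for exposition.

```python
# Conceptual sketch (assumed, simplified): adaptive vs. stationary probability
# estimation for one bitplane of a block of wavelet coefficients.
import numpy as np

def significance(coeffs, bitplane):
    """True where a coefficient is already significant above this bitplane."""
    return np.abs(coeffs) >= (1 << (bitplane + 1))

def context(sig, i, j):
    """Toy context: number of significant 4-neighbors (0..4). Not JPEG2000's."""
    h, w = sig.shape
    neigh = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return int(sum(sig[a, b] for a, b in neigh if 0 <= a < h and 0 <= b < w))

def adaptive_pass(coeffs, bitplane):
    """Sequential: the estimate for each context is updated after every coded
    symbol, so symbol k depends on symbols 0..k-1 (the causality obstacle)."""
    sig = significance(coeffs, bitplane)
    counts = {c: [1, 1] for c in range(5)}           # Laplace-smoothed counts
    probs_used = []
    for i in range(coeffs.shape[0]):
        for j in range(coeffs.shape[1]):
            c = context(sig, i, j)
            probs_used.append(counts[c][1] / sum(counts[c]))  # estimate before update
            bit = (abs(int(coeffs[i, j])) >> bitplane) & 1
            counts[c][bit] += 1                       # state update forces ordering
    return probs_used

def stationary_pass(coeffs, bitplane, table):
    """Parallel-friendly: probabilities come from a fixed table indexed by
    context, so every coefficient's estimate is independent of coding order."""
    sig = significance(coeffs, bitplane)
    h, w = coeffs.shape
    ctx = np.array([[context(sig, i, j) for j in range(w)] for i in range(h)])
    return table[ctx]                                 # vectorized, order-free lookup

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coeffs = rng.integers(-64, 64, size=(8, 8))
    table = np.array([0.05, 0.2, 0.4, 0.6, 0.8])      # hypothetical fixed probabilities
    print(adaptive_pass(coeffs, 3)[:5])
    print(stationary_pass(coeffs, 3, table)[:2])
```

In the stationary variant, the per-coefficient probability lookups share no mutable state, which is the property that permits coefficient-level (microscopic) parallelism.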

Original language: English (US)
Article number: 6746176
Pages (from-to): 960-970
Number of pages: 11
Journal: IEEE Transactions on Multimedia
Volume: 16
Issue number: 4
DOIs
State: Published - Jun 2014

Keywords

  • Bitplane image coding
  • JPEG2000
  • parallel architectures
  • probability models

ASJC Scopus subject areas

  • Signal Processing
  • Media Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering

