Hyperspectral image compression with modified 3D SPECK

Ruzelita Ngadiran, Said Boussakta, Ahmed Bouridane, B. Syarif

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

A hyperspectral image consists of a set of contiguous image bands collected by a hyperspectral sensor. The large volume of data in hyperspectral images makes efficient compression important for storage and transmission. This paper proposes a simplified version of the three-dimensional Set Partitioned Embedded bloCK (3D SPECK) algorithm for lossy compression of hyperspectral images. A three-dimensional discrete wavelet transform (3D DWT) can fully exploit the interband correlation in a volumetric block, which makes it a natural way to process hyperspectral data. The 3D structure of the SPECK algorithm extends this exploitation of interband dependence and correlation. The proposed algorithm retains 3D SPECK but modifies the implementation so that no lists are used to store significance information, reducing the memory requirement of the embedded coder and improving the coding time.
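The abstract combines two ideas: a 3D DWT over the hyperspectral cube and a SPECK-style bitplane coder that avoids the dynamic significance lists of the classic algorithm. The following is a minimal Python sketch of those two ideas using NumPy and PyWavelets; the wavelet choice, decomposition depth, data shape, and per-coefficient state array are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a 3D DWT over a hyperspectral cube followed by
# a bitplane significance scan that keeps per-coefficient state in a fixed
# array instead of the dynamic lists (LIS/LSP) used by classic SPECK.
# Wavelet, depth, and data shape are assumptions, not taken from the paper.
import numpy as np
import pywt

def dwt3d(cube, wavelet="bior4.4", levels=2):
    """3D dyadic wavelet transform of a (bands, rows, cols) cube."""
    coeffs = pywt.wavedecn(cube, wavelet, level=levels)
    # Pack the multilevel decomposition into one array so the bitplane
    # coder can address coefficients by position.
    arr, slices = pywt.coeffs_to_array(coeffs)
    return arr, slices

def listless_significance_scan(coeffs, n_planes=6):
    """Scan bitplanes from the most significant downward.

    A uint8 flag per coefficient (0 = still insignificant, 1 = already
    significant) replaces SPECK's lists, so memory use is fixed by the
    transform size rather than growing as coding proceeds.
    """
    state = np.zeros(coeffs.shape, dtype=np.uint8)
    n_max = int(np.floor(np.log2(np.abs(coeffs).max())))
    symbols = []
    for n in range(n_max, max(n_max - n_planes, -1), -1):
        newly = (np.abs(coeffs) >= 2.0 ** n) & (state == 0)
        # Emit (position, bitplane, sign) for coefficients that just
        # crossed the current threshold.
        for idx in map(tuple, np.argwhere(newly)):
            symbols.append((idx, n, 1 if coeffs[idx] >= 0 else -1))
        state[newly] = 1
    return symbols

# Usage with a synthetic cube standing in for hyperspectral data.
cube = np.random.randn(64, 64, 64)
arr, _ = dwt3d(cube)
stream = listless_significance_scan(arr)
print(f"{len(stream)} coefficients became significant over the scanned planes")
```

A full listless coder would also encode the quadtree set-partitioning decisions and entropy-code the resulting bitstream; the sketch isolates only the state-array idea behind the abstract's claim of reduced memory and coding time.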
Original language: English
Title of host publication: Proceedings of the 7th International Symposium on Communication Systems Networks and Digital Signal Processing (CSNDSP 2010)
Editors: Fary Ghassemlooy, Wai Pang Ng
Place of Publication: Piscataway, NJ
Publisher: IEEE
Pages: 806-810
ISBN (Print): 9781424488582
Publication status: Published - 2010
Event: CSNDSP 2010: 7th International Symposium on Communication Systems, Networks and Digital Signal Processing - Northumbria University, Newcastle upon Tyne, UK
Duration: 1 Jul 2010 → …

Conference

Conference: CSNDSP 2010: 7th International Symposium on Communication Systems, Networks and Digital Signal Processing
Period: 1/07/10 → …
