Abstract
In recent work, we described an adaptive entropy-coded predictive vector quantization (PVQ) scheme for images which was shown to be capable of excellent rate-distortion performance and to be surprisingly robust when applied to images outside the training set used in its design. This scheme made use of several entropy-constrained vector quantizers (ECVQs), each with a corresponding Huffman encoder/decoder pair, embedded in a vector predictive feedback loop. The particular ECVQ in effect for any input image block depended upon the instantaneous occupancy state of a buffer used to interface the resulting variable-length codewords to a fixed-rate transmission or storage channel. This entropy-coded PVQ scheme is a vector extension of previous work on adaptive entropy-coded predictive scalar quantization (PSQ), in particular 2-D DPCM. The embedded ECVQ in this adaptive entropy-coded PVQ scheme made use of a modification of a recently introduced clustering-based design algorithm, which resulted in unstructured codebooks. Unfortunately, the computational complexity associated with this unstructured embedded ECVQ can be substantial. In this paper we describe much simpler versions of this adaptive entropy-coded PVQ scheme in which the embedded ECVQ is replaced by a pruned tree-structured VQ (PTSVQ). The resulting encoding scheme is shown to offer drastically reduced complexity at only a small cost in performance. We demonstrate coding results on selected real-world images.
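The buffer-driven adaptation described in the abstract can be illustrated with a short sketch. The Python fragment below is a minimal, hypothetical illustration of the general idea rather than the paper's actual design: it assumes random placeholder codebooks, fixed stand-in code lengths in place of true Huffman tables, a simple scaled vector predictor, and arbitrary buffer and channel parameters.

```python
import numpy as np

# Hypothetical sketch of a buffer-adaptive entropy-coded PVQ loop: several
# ECVQ codebooks (random placeholders here), each with per-codeword code
# lengths standing in for a Huffman table, are switched according to the
# occupancy of a channel buffer. All names, sizes, and the trivial predictor
# are illustrative assumptions, not the scheme proposed in the paper.

rng = np.random.default_rng(0)
BLOCK = 4                                              # 2x2 image blocks, vector dimension 4
CODEBOOKS = [rng.normal(size=(n, BLOCK)) for n in (64, 32, 16)]   # higher-rate to lower-rate
CODE_LEN = [np.full(len(cb), np.log2(len(cb))) for cb in CODEBOOKS]  # stand-in code lengths (bits)
BUFFER_SIZE, CHANNEL_BITS = 512.0, 4.0                 # buffer capacity, bits drained per block

def pick_codebook(occupancy):
    """Map buffer occupancy to a codebook index: fuller buffer -> lower-rate ECVQ."""
    frac = occupancy / BUFFER_SIZE
    return 2 if frac > 0.66 else (1 if frac > 0.33 else 0)

def encode(blocks, a=0.95, lam=0.1):
    occupancy, prev_rec, indices = 0.0, np.zeros(BLOCK), []
    for x in blocks:
        pred = a * prev_rec                            # simple vector predictor (assumed form)
        resid = x - pred
        k = pick_codebook(occupancy)
        cb, lens = CODEBOOKS[k], CODE_LEN[k]
        # entropy-constrained (Lagrangian) codeword selection: distortion + lambda * rate
        cost = ((cb - resid) ** 2).sum(axis=1) + lam * lens
        i = int(np.argmin(cost))
        prev_rec = pred + cb[i]                        # reconstruction the decoder can track
        occupancy = max(0.0, occupancy + lens[i] - CHANNEL_BITS)
        indices.append((k, i))
    return indices

print(encode(rng.normal(size=(8, BLOCK)))[:4])
```

Replacing the exhaustive codeword search above with a greedy descent through a pruned tree-structured codebook is what yields the complexity reduction claimed in the abstract, since only one branch pair is compared at each tree level instead of scanning the full codebook.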
| Original language | English |
| --- | --- |
| Pages (from-to) | 171-185 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Communications |
| Volume | 41 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1993 |