A novel lossless data compression scheme based on the error correcting Hamming codes

Abstract

This paper introduces a novel lossless binary data compression scheme based on the error-correcting Hamming codes, namely the HCDC scheme. In this scheme, the binary sequence to be compressed is divided into blocks of n bits. To utilize the Hamming codes, each block is treated as a Hamming codeword consisting of p parity bits and d data bits (n = d + p). Each block is then tested to determine whether it is a valid Hamming codeword. For a valid block, only the d data bits, preceded by a 1, are written to the compressed file; for a non-valid block, all n bits, preceded by a 0, are written. These extra 1 and 0 flag bits distinguish the valid from the non-valid blocks during decompression. An analytical formula is derived for the compression ratio as a function of the block size and the fraction of valid blocks in the sequence. The performance of the HCDC scheme is analyzed, and the results obtained are presented in tables and graphs. Finally, conclusions and recommendations for future work are given.

© 2007 Elsevier Ltd. All rights reserved.
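The block-level procedure described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's implementation, using the Hamming(7,4) code (p = 3, d = 4, n = 7); all function names are my own, and the compression-ratio formula is simply derived from the per-block bit counts stated above (a valid block costs d + 1 bits, a non-valid block n + 1 bits), so the paper's exact expression may be written differently.

```python
# Sketch of the HCDC idea with the Hamming(7,4) code.
# Bit positions are numbered 1..7; positions 1, 2, 4 hold parity bits
# and positions 3, 5, 6, 7 hold the d = 4 data bits.

def syndrome(block):
    """Syndrome of a 7-bit block; zero means a valid codeword."""
    s = 0
    for pos, bit in enumerate(block, start=1):
        if bit:
            s ^= pos
    return s

def data_bits(block):
    """Extract the 4 data bits (positions 3, 5, 6, 7)."""
    return [block[i - 1] for i in (3, 5, 6, 7)]

def hamming_encode(d):
    """Rebuild a valid 7-bit codeword from 4 data bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4          # parity over positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hcdc_compress(bits):
    """Compress a bit list whose length is a multiple of 7.

    Valid block  -> flag 1 + its 4 data bits  (d + 1 = 5 bits)
    Invalid block -> flag 0 + all 7 bits      (n + 1 = 8 bits)
    """
    out = []
    for i in range(0, len(bits), 7):
        block = bits[i:i + 7]
        if syndrome(block) == 0:
            out += [1] + data_bits(block)
        else:
            out += [0] + block
    return out

def hcdc_decompress(bits):
    """Invert hcdc_compress: re-encode valid blocks, copy the rest."""
    out, i = [], 0
    while i < len(bits):
        if bits[i] == 1:
            out += hamming_encode(bits[i + 1:i + 5])
            i += 5
        else:
            out += bits[i + 1:i + 8]
            i += 8
    return out

def compression_ratio(n, p, r):
    """C = original bits / compressed bits, with r the fraction of
    valid blocks: C = n / (r*(d + 1) + (1 - r)*(n + 1))."""
    d = n - p
    return n / (r * (d + 1) + (1 - r) * (n + 1))
```

For example, a sequence of one valid and one non-valid block (14 bits) compresses to 5 + 8 = 13 bits, and decompression reproduces the original exactly; compression is only achieved when the fraction of valid blocks is high enough for the 5-bit encodings to outweigh the 8-bit ones.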

DOI: 10.1016/j.camwa.2007.11.043


Cite this paper

@article{AlBahadili2008ANL,
  title   = {A novel lossless data compression scheme based on the error correcting Hamming codes},
  author  = {Hussein Al-Bahadili},
  journal = {Computers \& Mathematics with Applications},
  year    = {2008},
  volume  = {56},
  pages   = {143--150}
}