Calgary corpus
The Calgary corpus is a collection of text and binary data files, commonly used for comparing data compression algorithms. It was created by Ian…
(Source: Wikipedia)
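As a minimal sketch of how such a corpus is used — running several compressors over the same file and comparing compressed-to-original size ratios — here is an illustrative harness using only Python's standard-library codecs (not the algorithms benchmarked in the papers below); the `sample` data stands in for a real corpus file such as `book1`, which must be obtained separately:

```python
# Illustrative benchmark harness; "sample" stands in for a corpus file
# such as book1 or paper1 (the real Calgary corpus files are obtained
# separately). zlib and bz2 are stand-ins for the compressors under test.
import bz2
import zlib

def compression_ratio(data: bytes, compress) -> float:
    """Compressed size as a fraction of the original size (lower is better)."""
    return len(compress(data)) / len(data)

sample = b"the quick brown fox jumps over the lazy dog " * 200

for name, fn in [("zlib", zlib.compress), ("bz2", bz2.compress)]:
    print(f"{name}: {compression_ratio(sample, fn):.3f}")
```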
Topic mentions per year
[Chart: mentions per year, 1995–2016]
Related topics (10 relations)
Binary data
Canterbury corpus
Gzip
Mutual information
Broader (1)
Data compression
Related mentions per year
[Chart: related mentions per year, 1950–2018; series: Calgary corpus, Data compression, Mutual information, Binary data, VAX, SHA-1]
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
2012
Context Tree Switching
Joel Veness, Kee Siong Ng, Marcus Hutter, Michael H. Bowling
2012 Data Compression Conference
This paper describes the Context Tree Switching technique, a modification of Context Tree Weighting for the prediction of binary…
2005
Universal text preprocessing for data compression
Jürgen Abel, William John Teahan
IEEE Transactions on Computers
Several preprocessing algorithms for text files are presented which complement each other and which are performed prior to the…
2005
Recycling Bits in LZ77-Based Compression
Danny Dubé, Vincent Beaudoin
We present a technique that exploits the multiplicity of the ways a text may be encoded using an LZ77-based compression method…
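The "multiplicity" this abstract refers to can be illustrated with a toy LZ77 decoder (a sketch, not the paper's method): two different token sequences decode to the same text, and that freedom of choice is what a recycling technique can exploit to carry extra bits.

```python
# Sketch: LZ77 parsings are not unique. Each token is (offset, length,
# literal); offset/length copy previously decoded bytes, then the literal
# (if any) is appended. offset == 0 with length == 0 emits a bare literal.
def lz77_decode(tokens):
    out = bytearray()
    for offset, length, lit in tokens:
        for _ in range(length):
            out.append(out[-offset])   # byte-by-byte copy allows overlap
        if lit is not None:
            out.append(lit)
    return bytes(out)

# Two different parsings of b"ababab":
a = [(0, 0, ord("a")), (0, 0, ord("b")), (2, 4, None)]               # one long match
b = [(0, 0, ord("a")), (0, 0, ord("b")), (2, 2, None), (2, 2, None)]  # two short matches
assert lz77_decode(a) == lz77_decode(b) == b"ababab"
```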
2004
A nearly-optimal Fano-based coding algorithm
Luis Rueda, B. John Oommen
Inf. Process. Manage.
Statistical coding techniques have been used for a long time in lossless data compression, using methods such as Huffman's…
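For context, classical Shannon–Fano coding (the starting point the title refers to, not the paper's nearly-optimal variant) can be sketched as: sort symbols by frequency, then recursively split the list into two groups of roughly equal total weight, appending 0 or 1 at each split.

```python
# Sketch of classical Shannon-Fano coding, not the paper's algorithm.
def fano_codes(freqs):
    """Assign codewords: sort by frequency, then recursively split into
    two groups of (nearly) equal total weight, extending the prefix."""
    symbols = sorted(freqs, key=freqs.get, reverse=True)
    codes = {}

    def split(group, prefix):
        if len(group) == 1:
            codes[group[0]] = prefix or "0"
            return
        total = sum(freqs[s] for s in group)
        acc, cut = 0, 1
        for i, s in enumerate(group[:-1]):
            acc += freqs[s]
            if acc >= total / 2:   # first point where the left half reaches half the weight
                cut = i + 1
                break
        split(group[:cut], prefix + "0")
        split(group[cut:], prefix + "1")

    split(symbols, "")
    return codes

print(fano_codes({"a": 15, "b": 7, "c": 6, "d": 6, "e": 5}))
```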
2000
PPM performance with BWT complexity: a fast and effective data compression algorithm
Michelle Effros
Proceedings of the IEEE
This paper introduces a new data compression algorithm. The goal underlying this new code design is to achieve a single lossless…
2000
PPM Performance with BWT Complexity: A New Method for Lossless Data Compression
Michelle Effros
Data Compression Conference
This work combines a new fast context-search algorithm with the lossless source coding models of PPM to achieve a lossless data…
1997
Semantically Motivated Improvements for PPM Variants
Suzanne Bunton
Comput. J.
The on-line sequence modelling algorithm `Prediction by Partial Matching' (PPM) has set the performance standard in lossless data…
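PPM's central idea — per-context symbol counts with an "escape" probability that falls through to a lower-order context — can be sketched with PPMC-style counts (an illustrative simplification, not Bunton's variants):

```python
# Sketch of PPM's escape mechanism with method-"C"-style counts: within one
# context, a seen symbol gets count/(total + distinct), and the remaining
# distinct/(total + distinct) mass "escapes" to a lower-order model.
from collections import Counter

def ppmc_probability(context_counts: Counter, symbol) -> float:
    """Probability of `symbol` in this context if seen; otherwise the
    escape probability handed down to the next-lower-order context."""
    total = sum(context_counts.values())
    distinct = len(context_counts)
    if symbol in context_counts:
        return context_counts[symbol] / (total + distinct)
    return distinct / (total + distinct)

counts = Counter("aab")           # context has seen "a" twice, "b" once
print(ppmc_probability(counts, "a"))   # 2 / (3 + 2)
```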
1997
A Generalization and Improvement to PPM's "Blending"
Suzanne Bunton
The best-performing method in the data compression literature for computing probability estimates of sequences on-line using a su…
1997
Text Compression by Context Tree Weighting
Jan Åberg, Yuri M. Shtarkov
Data Compression Conference
The results of an experimental study of different modifications of the context tree weighting algorithm are described. In…
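At the heart of context tree weighting is a per-node binary estimator, typically the Krichevsky–Trofimov (KT) estimator; a sketch of just that component (not the full CTW tree mixing studied in the paper):

```python
# Sketch: the Krichevsky-Trofimov estimator used at each context-tree node.
# After observing `zeros` zeros and `ones` ones, the next bit b is predicted
# with probability (count_b + 1/2) / (zeros + ones + 1).
def kt_probability(zeros: int, ones: int, next_bit: int) -> float:
    count = ones if next_bit == 1 else zeros
    return (count + 0.5) / (zeros + ones + 1.0)

# After 1 zero and 3 ones, a 1 is predicted with probability 3.5/5 = 0.7.
print(kt_probability(1, 3, 1))
```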
1996
LZP: A New Data Compression Algorithm
Charles Bloom
Data Compression Conference
A new compression algorithm is presented which may either be considered an improvement to dictionary coding or an…
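LZP's core idea is that a hash of the recent context predicts a single candidate match position, so only a match length (and a following literal) needs to be coded, with no explicit offsets. A heavily reduced sketch of that idea, not Bloom's actual coder:

```python
# Simplified LZP-style round trip (illustrative only): a table maps the last
# `ctx` bytes to the most recent position where that context occurred; that
# position is the sole match candidate, so tokens are just (length, literal).
def lzp_compress(data: bytes, ctx: int = 3):
    table, out, i = {}, [], 0
    while i < len(data):
        key = data[max(0, i - ctx):i]
        pos = table.get(key)          # predicted match position, if any
        table[key] = i
        length = 0
        if pos is not None:
            while i + length < len(data) and data[pos + length] == data[i + length]:
                length += 1
        lit = data[i + length] if i + length < len(data) else None
        out.append((length, lit))
        i += length + (1 if lit is not None else 0)
    return out

def lzp_decompress(tokens, ctx: int = 3):
    table, out = {}, bytearray()
    for length, lit in tokens:
        key = bytes(out[max(0, len(out) - ctx):])
        pos = table.get(key)          # mirrors the compressor's prediction
        table[key] = len(out)
        for k in range(length):
            out.append(out[pos + k])
        if lit is not None:
            out.append(lit)
    return bytes(out)

msg = b"abcabcabcabc"
assert lzp_decompress(lzp_compress(msg)) == msg
```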