# On the capacity of computer memory with defects

```bibtex
@article{Heegard1983OnTC,
  title   = {On the capacity of computer memory with defects},
  author  = {Chris Heegard and Abbas El Gamal},
  journal = {IEEE Trans. Inf. Theory},
  year    = {1983},
  volume  = {29},
  pages   = {731-739}
}
```

A computer memory with defects is modeled as a discrete memoryless channel with states that are statistically determined. […] Arimoto-Blahut type algorithms are used to compute the storage capacity.
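The Arimoto-Blahut iteration mentioned above can be sketched for a generic discrete memoryless channel. This is a minimal illustration of the classical algorithm, not Heegard and El Gamal's defect-specific variant, and the function name and parameters are my own:

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10_000):
    """Capacity (in bits) of a DMC with transition matrix W[x, y] = P(y|x).

    Classical Blahut-Arimoto fixed-point iteration on the input
    distribution p; converges to the capacity-achieving input.
    """
    p = np.full(W.shape[0], 1.0 / W.shape[0])  # start from the uniform input
    for _ in range(max_iter):
        q = p @ W  # output distribution induced by p
        with np.errstate(divide="ignore", invalid="ignore"):
            # kl[x] = KL( W[x, .] || q ) in nats; zero-probability terms masked
            kl = np.where(W > 0, W * np.log(W / q), 0.0).sum(axis=1)
        p_new = p * np.exp(kl)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # mutual information I(p, W) at the fixed point, in bits
    q = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(W > 0, W * np.log2(W / q), 0.0)
    return float(p @ terms.sum(axis=1))
```

For a binary symmetric channel with crossover probability 0.1 this returns about 1 − H₂(0.1) ≈ 0.531 bits per use, matching the closed-form capacity.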

## 344 Citations

Capacity and coding for memories with real-time noisy defect information at encoder and decoder

- Computer Science
- 1992

The paper investigates the problem of information storage in a defective medium where real-time noisy information about the defects is available at both the encoder and the decoder, and shows that this problem can be transformed into Shannon's channel-with-side-information problem.

Capacity and coding for memories with real-time noisy defect information at both sides

- Computer Science
- [1990] Proceedings. First International Symposium on Uncertainty Modeling and Analysis
- 1990

The paper investigates the problem of information storage in defective media where real-time noisy information about the defects is available at both the encoder and the decoder, and shows that in some cases of interest the capacity can be described without employing Shannon strategies.

Explicit capacity achieving codes for defective memories

- Computer Science
- 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015

It is shown how state-of-the-art capacity-achieving codes, in combination with coset coding and another error-correcting code, can be used to asymptotically achieve the capacity of the binary defective memory.

On coding for 'stuck-at' defects

- Computer Science
- IEEE Trans. Inf. Theory
- 1987

Additive linear codes for use on the defect channel--a model for computer memories with stuck-at defects--are studied and a reasonably practical convolutional coding scheme is described and simulated.

Partitioned linear block codes for computer memory with 'stuck-at' defects

- Computer Science
- IEEE Trans. Inf. Theory
- 1983

It is shown that partitioned linear block codes achieve the Shannon capacity for a computer memory with symmetric defects and errors.
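For context on this capacity claim, a standard background result (due in essence to Kuznetsov and Tsybakov, stated here as an aside rather than taken from this entry): for binary cells of which a fraction $p$ are stuck, with the stuck positions and values known to the encoder only and no additional noise, the storage capacity is

$$C = 1 - p \ \text{bits per cell},$$

i.e., each stuck cell contributes nothing, and defect knowledge at the encoder alone already suffices to achieve this rate asymptotically.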

Coding with Side Information for Radiation-Tolerant Memory Devices

- Computer Science
- 2011

Simulation results show that while coding with complete side information at the encoder offers the largest performance gain over coding without side information, coding with partial side information can close the gap between the optimal and current approaches without incurring much additional overhead.

On the capacity of sticky storage devices

- Computer Science
- AT&T Bell Laboratories Technical Journal
- 1984

The problem of finding the maximum error-free, long-term average capacity per cell and cycle for binary cells with either unilateral or symmetric stickiness is solved.

Coding for memory with stuck-at defects

- Computer Science
- 2013 IEEE International Conference on Communications (ICC)
- 2013

An encoding scheme for partitioned linear block codes (PLBC) which mask the stuck-at defects in memories is proposed and an upper bound and the estimate of the probability that masking fails are derived.

Scrubbing with partial side information for radiation-tolerant memory

- Computer Science
- 2010 IEEE Globecom Workshops
- 2010

Alternative coding schemes for scrubbing are investigated, where the channel model depends on the cell states, defective or not, and the encoder uses channel state information (CSI) or side information.

Information representation and coding for flash memories

- Computer Science
- 2009 IEEE Pacific Rim Conference on Communications, Computers and Signal Processing
- 2009

A focus is placed on rewriting codes and rank modulation in flash memories, with a view to addressing many aspects of a successful storage system.

## References

Showing 1–10 of 20 references

An error correcting scheme for defective memory

- Computer Science
- IEEE Trans. Inf. Theory
- 1978

A scheme for storing information in a memory system with defective memory cells using "additive" codes was proposed by Kuznetsov and Tsybakov, and considerably better bounds on the information rate are presented.

An algorithm for computing the capacity of arbitrary discrete memoryless channels

- Computer Science
- IEEE Trans. Inf. Theory
- 1972

A systematic and iterative method of computing the capacity of arbitrary discrete memoryless channels is presented and a few inequalities that give upper and lower bounds on the capacity are derived.

Computation of channel capacity and rate-distortion functions

- Computer Science
- IEEE Trans. Inf. Theory
- 1972

A simple algorithm for computing channel capacity is suggested that consists of a mapping from the set of channel input probability vectors into itself such that the sequence of probability vectors generated by successive applications of the mapping converges to the vector that achieves the capacity of the given channel.

Noiseless coding of correlated information sources

- Computer Science
- IEEE Trans. Inf. Theory
- 1973

The minimum number of bits per character $R_X$ and $R_Y$ needed to encode these sequences so that they can be faithfully reproduced under a variety of assumptions regarding the encoders and decoders is determined.

A coding theorem for the discrete memoryless broadcast channel

- Computer Science
- IEEE Trans. Inf. Theory
- 1979

A coding theorem for the discrete memoryless broadcast channel is proved for the case where no common message is to be transmitted, and the result is tight for broadcast channels having one deterministic component.

On source coding with side information via a multiple-access channel and related problems in multi-user information theory

- Computer Science
- IEEE Trans. Inf. Theory
- 1983

A coding theorem is introduced and established for another type of source-channel matching problem, i.e., a system of source coding with side information via a MAC, which can be regarded as an extension of the Ahlswede-Körner-Wyner type noiseless coding system.

Coding Theorems of Information Theory

- Computer Science
- Ergebnisse der Mathematik und Ihrer Grenzgebiete
- 1961

This chapter discusses the discrete memoryless channel and the discrete memoryless channel with additive Gaussian noise, together with the coding theorem, whose argument considers message sequences on the periphery of the sphere or within a shell adjacent to the boundary.

Principles of digital communication and coding

- Mathematics
- Proceedings of the IEEE
- 1981

The author has achieved, to a considerable degree, his stated goals of making the book interesting to practicing engineers, useful as a textbook for graduate students, and a starting point for further investigation by researchers.

A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)

- Computer Science, Mathematics
- IEEE Trans. Inf. Theory
- 1975

It is established that the Slepian-Wolf theorem is true without change for arbitrary ergodic processes $\{(X_i,Y_i)\}_{i=1}^{\infty}$ and countably infinite alphabets.