—We develop methods for analyzing and constructing combined modulation/error-correction codes (ECCs), in particular codes that employ some form of reversed concatenation and whose ECC decoding scheme requires easy access to soft information (e.g., turbo codes, low-density parity-check (LDPC) codes, or parity codes). We expand on earlier work of Immink… (More)
We describe a digital holographic storage system for the study of noise sources and the evaluation of modulation and error-correction codes. A precision zoom lens and Fourier transform optics provide pixel-to-pixel matching between any input spatial light modulator and output CCD array over magnifications from 0.8 to 3. Holograms are angle multiplexed in… (More)
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117–122], we derive an asymptotic formula (when the… (More)
—In digital storage systems where the input to the noisy channel is required to satisfy a modulation constraint, the constrained code and error-control code (ECC) are typically designed and decoded independently. The achievable rate for this situation is evaluated as the rate of average intersection of the constraint and the ECC. The gap from the capacity… (More)
This paper provides a self-contained exposition of modulation code design methods based upon the state-splitting algorithm. The techniques are applied to the design of several codes of interest in digital data recording.
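As a small illustration of the setting behind the state-splitting algorithm: the capacity of a constrained system is log2 of the Perron (largest) eigenvalue of the adjacency matrix of its presenting graph, and state splitting constructs codes approaching that rate. The sketch below, not taken from the paper, computes the capacity of the familiar (d,k) = (1,∞) run-length-limited constraint; the two-state graph is the standard presentation.

```python
import numpy as np

# Adjacency matrix of the (d,k) = (1, inf) RLL constraint ("no two
# consecutive 1s"): state 0 may emit 0 or 1, state 1 must emit 0.
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Shannon capacity of a constrained system = log2 of the Perron
# (largest real) eigenvalue of its adjacency matrix.
perron = max(np.linalg.eigvals(A).real)
capacity = np.log2(perron)

print(f"Perron eigenvalue: {perron:.6f}")        # golden ratio, ~1.618034
print(f"capacity: {capacity:.4f} bits/symbol")   # ~0.6942
```

The rate-2/3 (1,7) and rate-1/2 (2,7) codes used in practice sit just below capacities computed exactly this way.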
—We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity… (More)
Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13, 14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate… (More)
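The entropy rate these abstracts study has no closed form in general, but it can be estimated numerically via the forward recursion and the Shannon–McMillan–Breiman theorem: for a long sample, -(1/n) log2 P(y_1^n) converges almost surely to the entropy rate. A minimal sketch, with an illustrative transition matrix and BSC crossover probability chosen here (not from the papers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary Markov chain with transition matrix P, observed through a BSC
# with crossover probability eps (values are illustrative).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
eps = 0.05
n = 200_000

# Sample the hidden chain, then flip each bit with probability eps.
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.random() < P[x[t - 1], 1]
y = np.where(rng.random(n) < eps, 1 - x, x)

# Forward recursion: accumulate log2 P(y_1^n) one symbol at a time,
# normalizing at each step to avoid underflow.
alpha = np.array([0.5, 0.5])      # predictive distribution over hidden state
log_prob = 0.0
for t in range(n):
    emit = np.where(np.arange(2) == y[t], 1 - eps, eps)  # P(y_t | x_t)
    unnorm = alpha * emit
    s = unnorm.sum()              # P(y_t | y_1^{t-1})
    log_prob += np.log2(s)
    alpha = (unnorm / s) @ P      # filter, then predict next hidden state

rate = -log_prob / n
print(f"estimated entropy rate: {rate:.4f} bits/symbol")
```

Sweeping eps toward 0 or 1/2 in this estimator is exactly the "extreme values of the noise" regime where the Taylor-expansion results of Zuk et al. apply.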