Brian H. Marcus

We develop methods for analyzing and constructing combined modulation/error-correction codes (ECC), in particular codes that employ some form of reversed concatenation and whose ECC decoding scheme requires easy access to soft information (e.g., turbo codes, low-density parity-check (LDPC) codes, or parity codes). We expand on earlier work of Immink …
We describe a digital holographic storage system for the study of noise sources and the evaluation of modulation and error-correction codes. A precision zoom lens and Fourier transform optics provide pixel-to-pixel matching between any input spatial light modulator and output CCD array over magnifications from 0.8 to 3. Holograms are angle multiplexed in …
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117–122], we derive an asymptotic formula (when the …
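As a point of reference for the quantity this line of work perturbs, the following is a minimal illustrative sketch (not from the paper): the noiseless capacity of a constrained set presented by a finite labeled graph is log2 of the Perron eigenvalue of its adjacency matrix. The (1,∞)-RLL constraint, which forbids consecutive 1s, is used here only as an assumed example.

```python
import numpy as np

# Noiseless capacity of the (1, inf)-RLL constraint (no "11" substring),
# shown purely to illustrate the constrained capacity that the noisy
# (BSC) setting generalizes.  States: "last bit was 0", "last bit was 1".
A = np.array([[1, 1],
              [1, 0]])
capacity = np.log2(np.max(np.linalg.eigvals(A).real))
print(capacity)  # about 0.6942 bits/symbol (log2 of the golden ratio)
```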
We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity …
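To make the central object concrete, here is a minimal sketch, with assumed parameters not taken from the paper, that estimates the entropy rate of a binary hidden Markov chain (a symmetric Markov chain observed through a BSC) as -(1/n) log2 P(y_1, ..., y_n) via the normalized forward recursion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not from the paper): a symmetric binary
# Markov chain with transition probability p, observed through a BSC with
# crossover probability eps.
p, eps, n = 0.3, 0.1, 200_000
P = np.array([[1 - p, p], [p, 1 - p]])          # Markov transition matrix
E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # noise/emission matrix

# Simulate one long realization of the observation process Y.
x = np.zeros(n, dtype=int)
for t in range(1, n):
    x[t] = rng.random() < P[x[t - 1], 1]
y = np.where(rng.random(n) < eps, 1 - x, x)

# Normalized forward recursion: the log normalizers sum to log2 P(y_1..y_n),
# so -(1/n) times that sum estimates the entropy rate H(Y).
alpha = np.full(2, 0.5) * E[:, y[0]]
c = alpha.sum()
log_prob, alpha = np.log2(c), alpha / c
for t in range(1, n):
    alpha = (alpha @ P) * E[:, y[t]]
    c = alpha.sum()
    log_prob += np.log2(c)
    alpha /= c
print(-log_prob / n)  # Monte Carlo estimate of the entropy rate (bits/symbol)
```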
Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13, 14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate …
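For reference, the "standard upper approximations" are the conditional block entropies H(Y_n | Y_{n-1}, ..., Y_1), which decrease to the entropy rate. Below is a minimal sketch with assumed parameters (not the paper's computation of derivatives) that evaluates a few of them exactly for a binary Markov chain observed through a BSC.

```python
import numpy as np
from itertools import product

# Assumed illustrative parameters (not from the paper).
p, eps = 0.3, 0.1
P = np.array([[1 - p, p], [p, 1 - p]])          # Markov transition matrix
E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # BSC noise as emission matrix

def block_entropy(n):
    """H(Y_1, ..., Y_n) in bits, by exhaustive enumeration of observations."""
    H = 0.0
    for y in product((0, 1), repeat=n):
        alpha = np.full(2, 0.5) * E[:, y[0]]     # stationary start, first emission
        for t in range(1, n):
            alpha = (alpha @ P) * E[:, y[t]]
        prob = alpha.sum()                       # P(y_1, ..., y_n)
        H -= prob * np.log2(prob)
    return H

# H(Y_n | Y_1..Y_{n-1}) = H(Y_1..Y_n) - H(Y_1..Y_{n-1}): a decreasing
# sequence of upper bounds on the entropy rate.
for n in range(2, 7):
    print(n, block_entropy(n) - block_entropy(n - 1))
```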