A connection between the theory of neural networks and cryptography is presented. A new phenomenon, the synchronization of neural networks, leads to a new method for the exchange of secret messages. Numerical simulations show that two artificial networks trained by a Hebbian learning rule on their mutual outputs develop an antiparallel state of their …
Two neural networks that are trained on their mutual output synchronize to an identical time-dependent weight vector. This novel phenomenon can be used to create a secure cryptographic secret key over a public channel. Several models for this cryptographic system have been suggested and tested for their security under different …
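As a rough illustration of the mutual-learning mechanism behind this abstract, here is a minimal Python sketch of two tree parity machines synchronizing over a public channel. The sizes K, N, L and the agreement-gated Hebbian update follow the standard tree-parity-machine construction; they are illustrative assumptions, not necessarily the exact variant tested in the paper.

```python
import numpy as np

K, N, L = 3, 10, 3          # hidden units, inputs per unit, weight bound (illustrative)
rng = np.random.default_rng(0)

def output(w, x):
    """Hidden-unit signs and parity output of one tree parity machine."""
    sigma = np.sign((w * x).sum(axis=1))
    sigma[sigma == 0] = -1  # break ties deterministically
    return sigma, int(sigma.prod())

def hebbian(w, x, sigma, tau):
    """Hebbian rule: only hidden units that agree with the output move."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB) and steps < 100_000:
    x = rng.choice([-1, 1], size=(K, N))  # common public random input
    sA, tA = output(wA, x)
    sB, tB = output(wB, x)
    if tA == tB:                          # learn only when the public outputs agree
        hebbian(wA, x, sA, tA)
        hebbian(wB, x, sB, tB)
    steps += 1

print("synchronized:", np.array_equal(wA, wB), "after", steps, "steps")
```

Once the weight vectors coincide they stay identical forever, so the shared vector can serve as a secret key; an eavesdropper who only observes the exchanged bits synchronizes much more slowly.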
Finding the entropy rate of Hidden Markov Processes is an active research topic of both theoretical and practical importance. A recently used approach is to study the asymptotic behavior of the entropy rate in various regimes. In this paper we generalize and prove a previous conjecture relating the entropy rate to the entropies of finite systems. Building on …
Networks of nonlinear units with time-delayed couplings can synchronize to a common chaotic trajectory. Although the delay time may be very large, the units can synchronize completely without time shift. For networks of coupled Bernoulli maps, analytic results are derived for the stability of the chaotic synchronization manifold. For a single delay time, …
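A small simulation can illustrate the zero-lag synchronization described here. The sketch below couples two Bernoulli maps f(x) = a·x mod 1 through a long delay tau, with each unit mixing delayed self-feedback and the delayed partner signal; the values of a, eps, kappa are illustrative assumptions chosen so that the synchronized state is stable, not the specific networks analyzed in the paper.

```python
import numpy as np

a, eps, kappa, tau, T = 1.5, 0.8, 0.5, 20, 5000  # illustrative parameters
rng = np.random.default_rng(3)

f = lambda v: (a * v) % 1.0       # chaotic Bernoulli map

x = rng.random(tau + 1).tolist()  # independent initial histories
y = rng.random(tau + 1).tolist()

for t in range(tau, tau + T):
    fx, fy = f(x[t]), f(y[t])
    fxd, fyd = f(x[t - tau]), f(y[t - tau])
    # each unit combines its instantaneous drive, delayed self-feedback,
    # and the delayed signal received from its partner
    x.append(((1 - eps) * fx + eps * (kappa * fxd + (1 - kappa) * fyd)) % 1.0)
    y.append(((1 - eps) * fy + eps * (kappa * fyd + (1 - kappa) * fxd)) % 1.0)

d_final = abs(x[-1] - y[-1])
d_final = min(d_final, 1.0 - d_final)   # distance on the unit circle
print("final zero-lag distance:", d_final)
```

Despite the 20-step delay, the two chaotic trajectories collapse onto each other with no time shift; shifting one trajectory by tau instead would not make the distance vanish.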
Discrete-input two-dimensional (2-D) Gaussian channels with memory represent an important class of systems, which appears extensively in communications and storage. In spite of their widespread use, the workings of 2-D channels remain largely unknown. In this work we explore their properties from the perspective of estimation theory and …
The learning time of a simple neural network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second-order properties of the cost function in the space of coupling coefficients. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and …
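The link between the Hessian spectrum and learning time can be seen already on a plain quadratic cost; the random matrix and step size below are illustrative assumptions, not the model analyzed in the paper. Gradient descent contracts each eigenmode by (1 − η·λ), so the slowest mode (smallest eigenvalue) sets the learning time, and the spread of the spectrum (condition number) tells how much rescaling or second-order tricks could accelerate it.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 20))
H = A.T @ A / 50                     # Hessian of the quadratic cost 0.5 * w^T H w
eigs = np.linalg.eigvalsh(H)         # ascending eigenvalue spectrum
lam_min, lam_max = eigs[0], eigs[-1]

eta = 1.0 / lam_max                  # step size limited by the largest eigenvalue
w = rng.standard_normal(20)
for t in range(10000):
    w = w - eta * (H @ w)            # plain gradient descent on the quadratic cost
    if np.linalg.norm(w) < 1e-6:
        break

print("converged in", t + 1, "steps; condition number ~", round(lam_max / lam_min, 1))
```

The observed step count grows roughly like the condition number λ_max/λ_min times a logarithmic accuracy factor, which is the quantity an eigenvalue-spectrum calculation lets one predict analytically.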
A neural network which is capable of recalling without errors any set of linearly independent patterns is studied. The network is based on a Hamiltonian version of the model of Personnaz et al. The energy of a state of N (±1) neurons is the square of the Euclidean distance, in phase space, between the state and the linear subspace spanned by the patterns. …
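This energy function can be sketched directly with a projection matrix onto the pattern subspace (the pseudo-inverse rule); the network size, pattern count, and random patterns below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 16, 4                              # neurons and stored patterns (illustrative)
Xi = rng.choice([-1, 1], size=(N, M))     # columns are the patterns

# Orthogonal projection onto the subspace spanned by the pattern columns
P = Xi @ np.linalg.pinv(Xi)

def energy(s):
    """Squared Euclidean distance from state s to the pattern subspace."""
    r = s - P @ s
    return float(r @ r)

for mu in range(M):
    xi = Xi[:, mu]
    assert energy(xi) < 1e-12                    # stored patterns have zero energy
    assert np.array_equal(np.sign(P @ xi), xi)   # and are fixed points of s -> sign(P s)

print("all stored patterns recalled with zero energy")
```

Since any linearly independent set lies in its own span, every such pattern sits at an energy minimum (zero), which is why recall is error-free regardless of correlations between the patterns.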
A recent result presented the expansion of the entropy rate of a Hidden Markov Process (HMP) as a power series in the noise variable ε. The coefficients of the expansion around the noiseless (ε = 0) limit were calculated up to 11th order, using a conjecture that relates the entropy rate of an HMP to the entropy of a process of finite length (which is …
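The finite-length quantities underlying such conjectures are easy to compute exactly for small blocks. The sketch below takes a binary symmetric Markov chain observed through symmetric flip noise (parameter values are illustrative assumptions), evaluates block probabilities with the forward algorithm, and forms the conditional entropies H(Y_n | Y^{n-1}), which decrease monotonically toward the entropy rate.

```python
import itertools
import numpy as np

p, eps = 0.3, 0.1                            # Markov flip prob and observation noise (illustrative)
P = np.array([[1 - p, p], [p, 1 - p]])       # hidden-chain transition matrix
E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission probabilities (symmetric noise)
pi = np.array([0.5, 0.5])                    # stationary distribution of the symmetric chain

def seq_prob(y):
    """Probability of an observed sequence y via the forward algorithm."""
    alpha = pi * E[:, y[0]]
    for s in y[1:]:
        alpha = (alpha @ P) * E[:, s]
    return alpha.sum()

def block_entropy(n):
    """Exact entropy (bits) of observation blocks of length n."""
    probs = [seq_prob(y) for y in itertools.product((0, 1), repeat=n)]
    return -sum(q * np.log2(q) for q in probs)

H = [block_entropy(n) for n in range(1, 9)]
cond = [H[0]] + [H[n] - H[n - 1] for n in range(1, len(H))]  # H(Y_n | Y^{n-1})
print([round(c, 6) for c in cond])
```

Each conditional entropy is an upper bound on the entropy rate, so tracking how fast this sequence settles is one concrete way to probe the finite-system relation that the abstract refers to.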
Mutual learning of a pair of tree parity machines with continuous and discrete weight vectors is studied analytically. The analysis is based on a mapping procedure that maps the mutual learning in tree parity machines onto mutual learning in noisy perceptrons. The stationary solution of the mutual learning in the case of continuous tree parity machines …