The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches have never risen above a marginal existence. While the application of optics in supercomputing is receiving renewed interest, new concepts, partly neuro-inspired, are being considered …
Finding interdependency relations between (possibly multivariate) time series provides valuable knowledge about the processes that generate the signals. Information theory provides a natural framework for non-parametric measures of several classes of statistical dependencies. However, reliable estimation of information-theoretic functionals is hampered when …
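The estimation difficulty alluded to above can be illustrated with the simplest non-parametric dependency measure, a plug-in (histogram) estimate of mutual information. This is a minimal sketch, not the estimator used in the paper; the function name, bin count, and use of natural logarithms are illustrative choices. Note the finite-sample bias: even for independent data the estimate is slightly positive.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of mutual information in nats.

    A minimal sketch: the finite-sample bias grows with the number of
    bins and shrinks only slowly with sample size, which is one reason
    reliable estimation of such functionals is hard in practice.
    """
    # Joint distribution from a 2-D histogram, normalized to probabilities.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    # Marginals by summing the joint over each axis.
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    # Sum p(x,y) * log( p(x,y) / (p(x) p(y)) ) over non-empty cells only.
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))
```

For independent samples the estimate should be near zero (up to bias), while for identical signals it approaches the entropy of the binned variable, about log(bins).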
Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as the main computational unit. In our approach, the reservoir network can be replaced by a single …
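The reservoir-computing principle described above can be sketched in software with a minimal echo state network: a fixed random recurrent network driven by the input, with only a linear readout trained by ridge regression. This is an illustrative digital sketch, not the paper's analog circuit; the function name, reservoir size, spectral radius, and regularization constant are assumptions.

```python
import numpy as np

def esn_fit(u, y, n_res=100, rho=0.9, seed=0):
    """Train a minimal echo state network on a scalar input/target pair.

    A sketch of the reservoir-computing idea: the recurrent weights W are
    random and fixed; only the linear readout W_out is trained.
    """
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, n_res)            # fixed input weights
    W = rng.standard_normal((n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W))) # set spectral radius < 1
    # Drive the reservoir and collect its states.
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    X = np.array(states)
    # Ridge-regression readout (small regularization for stability).
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    return W_out, X
```

A standard sanity check is a short-term memory task, e.g. reproducing the input delayed by one step, which the fading memory of the reservoir handles almost perfectly.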
Nonlinear photonic delay systems present interesting implementation platforms for machine learning models. They can be extremely fast, offer a high degree of parallelism, and potentially consume far less power than digital processors. So far they have been successfully employed for signal processing using the Reservoir Computing paradigm. In this paper we …
In this paper, an approach to identifying delay phenomena from time series is developed. We show that it is possible to perform a reliable time delay identification by using quantifiers derived from information theory, more precisely, permutation entropy and permutation statistical complexity. These quantifiers show clear extrema when the embedding delay τ of …
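The first of the two quantifiers above, permutation entropy, can be sketched as follows under the standard Bandt-Pompe construction: each length-d window (sampled with embedding delay τ) is mapped to its ordinal pattern, and the Shannon entropy of the pattern distribution is normalized by its maximum, log(d!). The function name and defaults are illustrative, not the paper's implementation.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, dim=3, tau=1):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series.

    Returns a value in [0, 1]: 0 for a fully ordered (monotone) signal,
    near 1 for white noise. Scanning tau and locating extrema is the
    delay-identification idea sketched above.
    """
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau
    counts = {}
    for i in range(n):
        window = x[i : i + dim * tau : tau]        # embedded window
        pattern = tuple(np.argsort(window))        # ordinal pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    h = -np.sum(p * np.log(p))                     # Shannon entropy
    return h / np.log(factorial(dim))              # normalize by log(d!)
```

A monotone ramp yields a single ordinal pattern and hence zero entropy, while a long white-noise series visits all d! patterns almost uniformly.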
Learning and mimicking how the brain processes information has been a major research challenge for decades. Despite these efforts, little is known about how we encode, maintain, and retrieve information. One hypothesis assumes that transient states are generated in our intricate network of neurons when the brain is stimulated by a sensory input. Based on this …
We demonstrate reservoir computing with a physical system using a single autonomous Boolean logic element with time-delay feedback. The system generates a chaotic transient with a window of consistency lasting between 30 and 300 ns, which we show is sufficient for reservoir computing. We then characterize the dependence of computational performance on …
An adapted state-of-the-art information-processing method known as Reservoir Computing is applied to demonstrate its utility on the open and time-consuming problem of heartbeat classification. The MIT-BIH arrhythmia database is used following the guidelines of the Association for the Advancement of Medical Instrumentation. Our approach requires a computationally …
We study how reliably generalized synchronization can be detected and characterized by time-series analysis. To that end, we analyze generalized synchronization of delay-coupled chaotic oscillators in unidirectional ring configurations. The generalized synchronization condition can be verified via the auxiliary system approach; however, in …
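The auxiliary system approach mentioned above can be sketched in a few lines: drive two identical copies of the response system, started from different initial conditions, with the same drive signal; if their states converge, the response depends only on the drive, indicating generalized synchronization. The function name, the contracting response map, and the transient length are illustrative assumptions, not the paper's oscillator model.

```python
import numpy as np

def auxiliary_system_test(drive, f_response, x0_a, x0_b, transient=200):
    """Auxiliary-system test for generalized synchronization (sketch).

    Iterates two identical response systems (original and auxiliary copy)
    under the same drive and returns the mean post-transient distance
    between their states; a value near zero indicates GS.
    """
    xa, xb = x0_a, x0_b
    diffs = []
    for t, d in enumerate(drive):
        xa = f_response(xa, d)   # original response
        xb = f_response(xb, d)   # auxiliary copy, different initial state
        if t >= transient:
            diffs.append(abs(xa - xb))
    return float(np.mean(diffs))
```

For a strongly contracting response map the two copies collapse onto each other regardless of the drive, so the returned distance vanishes; a non-contracting response keeps them apart.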