In this paper, we propose a recurrent kernel algorithm with selectively sparse updates for online learning. The algorithm introduces a linear recurrent term into the estimation of the current output, which makes past information reusable for updating the algorithm in the form of a recurrent gradient term. To ensure that the reuse of this recurrent …
We propose a robust recurrent kernel online learning (RRKOL) algorithm that exploits the kernel trick in an online fashion. The novel RRKOL algorithm achieves guaranteed weight convergence with regularized risk management through the recurrent hyper-parameters, yielding superior generalization performance. To select useful data to be learned …
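The abstract above does not give the RRKOL update equations, but the general pattern of kernel online learning it builds on can be illustrated with a minimal kernel least-mean-square (KLMS) style learner: each incoming sample becomes a kernel center, and its coefficient is set from the instantaneous prediction error. This is a generic sketch only; the recurrent term, sparsification, and regularized risk management of RRKOL are not shown, and all names and parameters here are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, c, sigma=1.0):
    # Gaussian (RBF) kernel between input x and stored center c
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

class KernelOnlineLearner:
    """Minimal KLMS-style online learner (generic sketch, not RRKOL).

    The predictor is f(x) = sum_i a_i * k(x, c_i), where each training
    sample is appended as a center c_i with coefficient a_i = eta * error.
    """

    def __init__(self, step_size=0.3, sigma=0.1):
        self.eta = step_size      # learning rate
        self.sigma = sigma        # kernel bandwidth
        self.centers = []         # stored inputs (growing dictionary)
        self.coeffs = []          # one coefficient per stored input

    def predict(self, x):
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, y):
        err = y - self.predict(x)           # instantaneous prediction error
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(self.eta * err)  # kernel-trick weight update
        return err
```

Note the practical issue this sketch exposes: the dictionary grows with every sample, which is exactly why sparsification rules (as in the abstracts above and below) matter for online use.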
A frame frequency multiplier system based on a Field Programmable Gate Array (FPGA) for time-sequential stereo video is proposed in this paper. The system converts two standard 60 fields/sec interlaced NTSC videos, captured by two cameras from different perspective views, into a 120 frames/sec time-sequential progressive stereo video. Video synchronization …
Training of recurrent neural networks (RNNs) involves considerable computational complexity due to the need for gradient evaluations. Achieving fast convergence with low computational complexity remains a challenging open problem. Moreover, the transient response of the learning process of RNNs is a critical issue, especially for online …
Kernel methods are widely used in nonlinear modeling applications. In this paper, a robust information theoretic sparse kernel algorithm is proposed for online learning. To reduce the computational cost and make the algorithm suitable for online applications, we investigate an information theoretic sparsification rule based on the mutual information …
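The abstract is truncated before the sparsification rule is stated, so the exact mutual-information criterion is not available here. As a stand-in, the sketch below shows a coherence-based sparsification rule, a common alternative in sparse kernel online learning: a new sample is admitted to the dictionary only if its maximum kernel similarity to the stored centers stays below a threshold. The function name, threshold, and bandwidth are illustrative assumptions, not the paper's method.

```python
import numpy as np

def gaussian_kernel(x, c, sigma=1.0):
    # Gaussian (RBF) kernel between input x and stored center c
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

def admit_to_dictionary(x, dictionary, sigma=0.1, threshold=0.9):
    """Coherence-based sparsification rule (stand-in, not the paper's
    mutual-information rule): admit x only if it is sufficiently novel,
    i.e. its max kernel similarity to all stored centers is < threshold."""
    if not dictionary:
        return True  # first sample always starts the dictionary
    coherence = max(gaussian_kernel(x, c, sigma) for c in dictionary)
    return coherence < threshold
```

Because redundant samples are rejected, the dictionary size saturates instead of growing linearly with the number of samples, which is the property that makes such rules attractive for online use.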