Hidenori Naganuma
We propose a digital version of the backpropagation algorithm (DBP) for three-layered neural networks with nondifferentiable binary units. This approach feeds teacher signals to both the middle and output layers, whereas a simple perceptron receives them only at the output layer. The additional teacher signals enable the DBP to update the coupling …
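A minimal sketch of the idea described above, under stated assumptions: a three-layer network of nondifferentiable binary (sign) units, where a teacher signal is supplied to the hidden layer as well as the output layer, so each layer can be updated with a simple perceptron-style rule. The toy task, the choice of hidden teacher signal, and the exact update rule here are illustrative assumptions, not the paper's actual DBP algorithm.

```python
import numpy as np

def sign(x):
    # Binary activation; sign(0) is taken as +1.
    return np.where(x >= 0, 1, -1)

def dbp_step(x, t_hidden, t_out, W1, W2, lr=0.1):
    h = sign(W1 @ x)   # binary hidden activations
    y = sign(W2 @ h)   # binary output
    # Perceptron-style correction per layer; the hidden teacher signal
    # sidesteps the nondifferentiability of the sign units.
    W2 += lr * np.outer(t_out - y, h)
    W1 += lr * np.outer(t_hidden - h, x)
    return W1, W2

# Toy task: learn OR over {-1, +1} inputs, using the identity map as an
# assumed teacher signal for the hidden layer.
patterns = [np.array(p) for p in [(1, 1), (1, -1), (-1, 1), (-1, -1)]]
W1, W2 = np.zeros((2, 2)), np.zeros((1, 2))
for _ in range(10):
    for x in patterns:
        t_out = np.array([-1 if (x == -1).all() else 1])
        W1, W2 = dbp_step(x, x, t_out, W1, W2)

preds = [int(sign(W2 @ sign(W1 @ x))[0]) for x in patterns]
print(preds)   # OR teacher outputs: [1, 1, 1, -1]
```

Because both layers receive explicit targets, no gradient ever has to pass through a sign unit; each layer solves its own linearly separable perceptron problem.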
Error backpropagation through time (BPTT) is a learning method for recurrent neural networks. A network trained with BPTT can solve dynamical problems involving time-series data. However, it cannot directly solve digital problems in which the middle-layer output is intrinsically binary, such as the internal-state inference of the …
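For reference, BPTT itself can be sketched as follows: a minimal scalar RNN (an illustrative setup, not the one in the abstract) is unrolled forward over a sequence, and the squared-error gradient is then propagated backward through every time step. This also shows why intrinsically binary hidden units break the method: the `tanh` derivative term would vanish for a hard binary activation.

```python
import numpy as np

def bptt_grads(w, u, xs, ys):
    """Gradients of sum_t (h_t - y_t)^2 for h_t = tanh(w*h_{t-1} + u*x_t)."""
    hs = [0.0]
    for x in xs:                           # forward pass: unroll in time
        hs.append(np.tanh(w * hs[-1] + u * x))
    dw = du = dh = 0.0                     # dh: gradient flowing back through h
    for t in range(len(xs), 0, -1):
        dh += 2.0 * (hs[t] - ys[t - 1])    # local squared-error term
        da = dh * (1.0 - hs[t] ** 2)       # through tanh (zero for hard units)
        dw += da * hs[t - 1]
        du += da * xs[t - 1]
        dh = da * w                        # continue one step further back
    return dw, du

xs, ys = [0.5, -1.0, 0.3], [0.1, -0.2, 0.4]
dw, du = bptt_grads(0.4, 0.7, xs, ys)
```

The gradients can be checked against finite differences of the unrolled loss, which is a standard sanity test for a hand-written BPTT pass.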