DizzyRNN: Reparameterizing Recurrent Neural Networks for Norm-Preserving Backpropagation

@article{Dorobantu2016DizzyRNNRR,
  title={DizzyRNN: Reparameterizing Recurrent Neural Networks for Norm-Preserving Backpropagation},
  author={Victor D. Dorobantu and Per Andre Stromhaug and Jess Renteria},
  journal={ArXiv},
  year={2016},
  volume={abs/1612.04035}
}
The vanishing and exploding gradient problems are well-studied obstacles that make it difficult for recurrent neural networks to learn long-term time dependencies. We propose a reparameterization of standard recurrent neural networks to update linear transformations in a provably norm-preserving way through Givens rotations. Additionally, we use the absolute value function as an element-wise non-linearity to preserve the norm of backpropagated signals over the entire network. We show that this…
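The norm-preserving property rests on two facts stated in the abstract: a Givens rotation is orthogonal, so it leaves the Euclidean norm of a vector unchanged, and the element-wise absolute value also preserves the norm. A minimal NumPy sketch (the function names and indices here are illustrative, not from the paper):

```python
import numpy as np

def givens_rotation(n, i, j, theta):
    """Build an n x n Givens rotation acting on coordinates i and j.

    The matrix is the identity except for a 2D rotation in the (i, j)
    plane, so it is orthogonal and preserves the 2-norm of any vector.
    """
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i], G[j, j] = c, c
    G[i, j], G[j, i] = -s, s
    return G

rng = np.random.default_rng(0)
x = rng.standard_normal(8)

# Rotating the hidden state leaves its norm unchanged.
G = givens_rotation(8, 2, 5, theta=0.7)
y = G @ x
print(np.allclose(np.linalg.norm(y), np.linalg.norm(x)))  # True

# The element-wise absolute value non-linearity also preserves the norm,
# since squaring removes signs: ||abs(x)||_2 == ||x||_2.
print(np.allclose(np.linalg.norm(np.abs(y)), np.linalg.norm(y)))  # True
```

Because both the linear map and the non-linearity preserve norms, gradients backpropagated through such a layer can neither vanish nor explode through these operations.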
