Deep Complex Networks

@article{Trabelsi2017DeepCN,
  title={Deep Complex Networks},
  author={Chiheb Trabelsi and Olexa Bilaniuk and Dmitriy Serdyuk and Sandeep Subramanian and Jo{\~a}o Felipe Santos and Soroush Mehri and Negar Rostamzadeh and Yoshua Bengio and Christopher Joseph Pal},
  journal={CoRR},
  year={2017},
  volume={abs/1705.09792}
}
At present, the vast majority of building blocks, techniques, and architectures for deep learning are based on real-valued operations and representations. However, recent work on recurrent neural networks and older fundamental theoretical analysis suggests that complex numbers could have a richer representational capacity and could also facilitate noise-robust memory retrieval mechanisms. Despite their attractive properties and potential for opening up entirely new neural architectures, complex…
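The paper's central idea is building deep networks from complex-valued operations. The key primitive can be expressed with purely real arithmetic: for a complex weight W = A + iB and complex input h = x + iy, the product is Wh = (Ax − By) + i(Bx + Ay). A minimal NumPy sketch of a complex linear layer along these lines (the function name `complex_linear` and the shapes are illustrative, not from the paper):

```python
import numpy as np

def complex_linear(x_real, x_imag, W_real, W_imag):
    """Complex-valued linear layer using only real-valued matmuls.

    For weight W = A + iB and input h = x + iy:
        W h = (A x - B y) + i (B x + A y)
    """
    out_real = x_real @ W_real.T - x_imag @ W_imag.T
    out_imag = x_real @ W_imag.T + x_imag @ W_real.T
    return out_real, out_imag

# Sanity check against NumPy's native complex arithmetic.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
W = rng.standard_normal((3, 8)) + 1j * rng.standard_normal((3, 8))
out_r, out_i = complex_linear(x.real, x.imag, W.real, W.imag)
assert np.allclose(out_r + 1j * out_i, x @ W.T)
```

The same decomposition extends to convolutions by replacing the matrix products with real-valued convolutions, which is how complex layers can be implemented on top of standard real-valued deep learning frameworks.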
This paper has 33 citations.

Citations

Publications citing this paper: 20 extracted citations.

References

Publications referenced by this paper: 43 references.

Untersuchungen zu dynamischen neuronalen Netzen [Investigations of dynamic neural networks]. Diploma thesis, Institut für Informatik, Lehrstuhl Prof. Brauer, Technische Universität München

  • Sepp Hochreiter
  • 1991
Highly Influential

A method of solving a convex programming problem with convergence rate O(1/k²)

  • Yurii Nesterov
  • 1983
Highly Influential
