Sparse evolutionary deep learning with over one million artificial neurons on commodity hardware
@article{Liu2020SparseED,
  title   = {Sparse evolutionary deep learning with over one million artificial neurons on commodity hardware},
  author  = {S. Liu and D. Mocanu and Amarsagar Reddy Ramapuram Matavalam and Y. Pei and M. Pechenizkiy},
  journal = {Neural Computing and Applications},
  year    = {2020},
  pages   = {1--16}
}
Artificial neural networks (ANNs) have become a central topic in the research community. Despite their success, training and deploying modern ANNs on commodity hardware is challenging due to ever-increasing model sizes and unprecedented growth in data volumes. Microarray data is particularly difficult for machine learning techniques to handle, given its very high dimensionality and small number of samples. Furthermore, specialized hardware such as graphics processing unit…
Supplemental Code: GitHub repository (via Papers with Code)
Always sparse. Never dense. But never say never. A repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training (SET), to boost deep learning scalability in several respects (e.g. memory and computational-time efficiency, representation and generalization power).
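Sparse Evolutionary Training evolves a sparse topology during training: after each epoch, a fraction of the active connections with the smallest magnitudes is removed, and the same number of new connections is regrown at random empty positions, so the sparsity level stays constant. The following is a minimal NumPy sketch of one such prune-and-regrow step; the function name, the `zeta` fraction, and the initialization scale for regrown weights are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def set_prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One evolutionary step in the style of Sparse Evolutionary Training:
    drop the fraction `zeta` of active connections with the smallest
    magnitudes, then regrow the same number at random inactive positions.

    `weights` is a dense array, `mask` a boolean array of the same shape
    marking active connections. Returns updated (weights, mask)."""
    rng = np.random.default_rng() if rng is None else rng
    active = np.flatnonzero(mask)              # flat indices of live weights
    n_prune = int(zeta * active.size)

    # Find the smallest-magnitude active weights and deactivate them.
    magnitudes = np.abs(weights.ravel()[active])
    prune = active[np.argsort(magnitudes)[:n_prune]]
    new_mask = mask.copy().ravel()
    new_mask[prune] = False

    # Regrow the same number of connections at random empty positions,
    # keeping the total number of connections (hence sparsity) constant.
    empty = np.flatnonzero(~new_mask)
    grow = rng.choice(empty, size=n_prune, replace=False)
    new_mask[grow] = True

    new_weights = weights.ravel() * new_mask   # zero out pruned weights
    new_weights[grow] = rng.normal(0.0, 0.01, size=n_prune)  # small random init
    return new_weights.reshape(weights.shape), new_mask.reshape(mask.shape)
```

Because pruning and regrowth exchange equal numbers of connections, the memory footprint of the layer never changes, which is what makes training networks with over a million neurons feasible on commodity hardware.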
11 Citations
- Artificial Neural Networks Training Acceleration Through Network Science Strategies. NUMTA, 2019.
- Artificial neural networks training acceleration through network science strategies. Soft Computing, 2020.
- On improving deep learning generalization with adaptive sparse connectivity. arXiv, 2019.
- Exposing Hardware Building Blocks to Machine Learning Frameworks. arXiv, 2020.
- Deep Learning on Computational-Resource-Limited Platforms: A Survey. Mobile Information Systems, 2020.