Distributed Training of Deep Neuronal Networks: Theoretical and Practical Limits of Parallel Scalability

Janis Keuper

2016 · DBLP: journals/corr/Keuper16
arXiv.org · 1 Citation

TLDR

A theoretical analysis and practical evaluation of the main bottlenecks towards a scalable distributed solution for the training of Deep Neuronal Networks show that the current state-of-the-art approach, data-parallelized Stochastic Gradient Descent, quickly turns into a vastly communication-bound problem.
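The scheme the TLDR refers to can be illustrated with a minimal sketch (not the paper's code): in data-parallel SGD, each worker computes a gradient on its own data shard, and an all-reduce-style average synchronizes gradients before every update. This synchronized gradient exchange is the communication the paper identifies as the bottleneck, since its volume grows with the model size and the update cannot proceed until it completes. All names and parameters below are illustrative assumptions.

```python
import numpy as np

# Hypothetical setup: a linear least-squares model trained with
# data-parallel SGD across several simulated workers.
rng = np.random.default_rng(0)
n_workers, n_features, shard_size = 4, 8, 32

# Ground-truth weights and one data shard per worker.
w_true = rng.normal(size=n_features)
shards = []
for _ in range(n_workers):
    X = rng.normal(size=(shard_size, n_features))
    y = X @ w_true
    shards.append((X, y))

w = np.zeros(n_features)
lr = 0.05
for step in range(500):
    # Each worker computes a local gradient on its shard.
    local_grads = [2 * X.T @ (X @ w - y) / shard_size for X, y in shards]
    # All-reduce: average the gradients across workers. In a real cluster
    # this step moves one full gradient (model-sized) per worker per
    # iteration -- the communication cost that dominates at scale.
    g = np.mean(local_grads, axis=0)
    w -= lr * g

err = np.max(np.abs(w - w_true))
```

Because the averaged gradient equals the gradient over the union of all shards, adding workers here only changes how the same update is computed; the per-step communication, however, scales with the number of workers times the model size.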
