Wednesday, 26 September 2018

Asynchronous decentralized accelerated stochastic gradient descent. (arXiv:1809.09258v1 [math.OC])

In this work, we introduce an asynchronous, decentralized, accelerated stochastic gradient descent method for decentralized stochastic optimization, motivated by settings in which communication and synchronization are the major bottlenecks. We establish $\mathcal{O}(1/\epsilon)$ (resp., $\mathcal{O}(1/\sqrt{\epsilon})$) communication complexity and $\mathcal{O}(1/\epsilon^2)$ (resp., $\mathcal{O}(1/\epsilon)$) sampling complexity for solving general convex (resp., strongly convex) problems.
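The abstract gives no pseudocode, but as a rough illustration of the decentralized setting it studies, the sketch below implements plain synchronous decentralized SGD with gossip averaging over a ring of agents on a least-squares objective. Everything here is an illustrative assumption (the objective, the agent count, the mixing matrix `W`); the paper's actual method additionally incorporates acceleration and asynchronous communication, which this sketch does not.

```python
# Minimal sketch: synchronous decentralized SGD with gossip averaging.
# Illustrative only; the paper's method adds acceleration and asynchrony.
import numpy as np

rng = np.random.default_rng(0)

n_agents, dim, n_iters, step = 4, 5, 2000, 0.01

# Each agent i holds private data (A[i], b[i]) and a local iterate x[i].
A = [rng.standard_normal((20, dim)) for _ in range(n_agents)]
x_true = rng.standard_normal(dim)
b = [Ai @ x_true + 0.1 * rng.standard_normal(20) for Ai in A]
x = [np.zeros(dim) for _ in range(n_agents)]

# Ring topology: doubly stochastic mixing matrix W (self + two neighbors).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

for t in range(n_iters):
    # Sampling step: each agent draws one local data point and forms
    # a stochastic gradient of its private least-squares loss.
    grads = []
    for i in range(n_agents):
        j = rng.integers(len(b[i]))
        a_ij = A[i][j]
        grads.append((a_ij @ x[i] - b[i][j]) * a_ij)
    # Communication step: mix iterates with neighbors via W, then descend.
    x = [sum(W[i, k] * x[k] for k in range(n_agents)) - step * grads[i]
         for i in range(n_agents)]

print("consensus error:", max(np.linalg.norm(xi - x[0]) for xi in x))
print("distance to x_true:", np.linalg.norm(x[0] - x_true))
```

The two complexity measures in the abstract map onto the two steps of the loop: communication complexity counts the gossip rounds (applications of `W`), while sampling complexity counts the stochastic gradient evaluations.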



from cs updates on arXiv.org https://ift.tt/2Dvoxdc