Saturday 9 February 2019

Negative eigenvalues of the Hessian in deep neural networks. (arXiv:1902.02366v1 [cs.LG])

The loss function of deep networks is known to be non-convex, but the precise nature of this non-convexity is still an active area of research. In this work, we study the loss landscape of deep networks through the eigendecomposition of the Hessian matrix. In particular, we examine how important the negative eigenvalues are and the benefits of handling them appropriately.
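As a rough illustration of the kind of analysis the abstract describes (and not the authors' code), here is a minimal sketch that computes the full Hessian of a tiny network's loss in JAX and inspects its negative eigenvalues. The network, data, and parameter layout are all made up for the example; for real deep networks the full Hessian is intractable, and one would instead approximate the spectrum with Hessian-vector products and Lanczos-style methods.

```python
import jax
import jax.numpy as jnp
import numpy as np

# Toy data and a tiny one-hidden-layer network; all shapes and names
# here are illustrative, not taken from the paper.
key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (32, 4))
y = jax.random.normal(key, (32, 1))

def unpack(theta):
    # theta is a flat parameter vector: W1 (4x8), b1 (8), W2 (8x1), b2 (1)
    W1 = theta[:32].reshape(4, 8)
    b1 = theta[32:40]
    W2 = theta[40:48].reshape(8, 1)
    b2 = theta[48:49]
    return W1, b1, W2, b2

def loss(theta):
    W1, b1, W2, b2 = unpack(theta)
    h = jnp.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    return jnp.mean((pred - y) ** 2)

theta0 = 0.1 * jax.random.normal(key, (49,))

# Full Hessian of the loss at theta0 (feasible only for tiny models;
# large networks require matrix-free methods instead).
H = jax.hessian(loss)(theta0)
eigvals = np.linalg.eigvalsh(np.asarray(H))  # sorted ascending

print("most negative eigenvalue:", eigvals[0])
print("largest eigenvalue:      ", eigvals[-1])
print("fraction negative:       ", float(np.mean(eigvals < 0)))
```

At a random initialization such as this one, a nontrivial fraction of the eigenvalues typically comes out negative, which is the saddle-point structure the paper investigates.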



from cs updates on arXiv.org http://bit.ly/2GekpyV
