Sample Complexity of Stochastic Variance-Reduced Cubic Regularization for Nonconvex Optimization

Authors
Zhe Wang, Yi Zhou, Yingbin Liang, Guanghui Lan

The popular cubic regularization (CR) method converges with first- and second-order optimality guarantees for nonconvex optimization, but suffers from a high sample complexity when solving large-scale problems. Various sub-sampled variants of CR have been proposed to improve the sample complexity. In this paper, we propose a stochastic variance-reduced cubic-regularized (SVRC) Newton's method under both sampling with and without replacement schemes. We characterize the per-iteration sample complexity bounds that guarantee the same rate of convergence as that of CR for nonconvex optimization. Furthermore, our method achieves a total Hessian sample complexity of $\mathcal{O}(N^{8/11} \epsilon^{-3/2})$ and $\mathcal{O}(N^{3/4}\epsilon^{-3/2})$ under sampling without and with replacement, respectively, which improves upon that of CR as well as other sub-sampling variants via the variance reduction scheme. Our result also suggests that sampling without replacement yields lower sample complexity than sampling with replacement. We further compare the practical performance of SVRC with other cubic regularization methods via experiments.
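To make the idea concrete, below is a minimal sketch of an SVRG-style variance-reduced cubic-regularized Newton iteration on a toy finite-sum problem. It is not the authors' implementation: the toy objective, the cubic penalty parameter `M`, the batch size, and helper names such as `svrc` and `cubic_step` are illustrative assumptions. It only shows the general recipe the abstract describes: correct subsampled gradient/Hessian estimates against a snapshot point, then take a cubic-regularized step, with the minibatch drawn either with or without replacement.

```python
# Illustrative sketch (not the paper's exact algorithm) of a variance-reduced
# cubic-regularized Newton method on a toy nonconvex finite-sum objective.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy objective (an assumption for illustration):
#   f(x) = (1/N) sum_i 0.5*(a_i^T x - b_i)^2 + lam * sum_j x_j^2 / (1 + x_j^2)
N, d, lam = 200, 5, 0.1
A = rng.standard_normal((N, d))
b = rng.standard_normal(N)

def grad_i(x, i):
    """Per-sample gradient (data-fit term plus nonconvex regularizer)."""
    r = A[i] @ x - b[i]
    return A[i] * r + lam * 2 * x / (1 + x**2) ** 2

def hess_i(x, i):
    """Per-sample Hessian (regularizer contributes a diagonal term)."""
    reg = lam * (2 - 6 * x**2) / (1 + x**2) ** 3
    return np.outer(A[i], A[i]) + np.diag(reg)

def full_grad(x):
    return np.mean([grad_i(x, i) for i in range(N)], axis=0)

def full_hess(x):
    return np.mean([hess_i(x, i) for i in range(N)], axis=0)

def cubic_step(g, H, M):
    """Approximately solve the CR subproblem: min_s g^T s + 0.5 s^T H s + (M/6)||s||^3."""
    def model(s):
        return g @ s + 0.5 * s @ H @ s + (M / 6) * np.linalg.norm(s) ** 3
    def model_grad(s):
        return g + H @ s + (M / 2) * np.linalg.norm(s) * s
    res = minimize(model, np.zeros_like(g), jac=model_grad, method="L-BFGS-B")
    return res.x

def svrc(x0, n_epochs=5, inner_iters=10, batch=20, M=10.0, replace=False):
    """SVRG-style variance-reduced cubic regularization (illustrative sketch)."""
    x = x0.copy()
    for _ in range(n_epochs):
        x_snap = x.copy()
        g_snap, H_snap = full_grad(x_snap), full_hess(x_snap)  # full pass at the snapshot
        for _ in range(inner_iters):
            S = rng.choice(N, size=batch, replace=replace)     # with or without replacement
            # Variance-reduced gradient and Hessian estimators around the snapshot
            g = g_snap + np.mean([grad_i(x, i) - grad_i(x_snap, i) for i in S], axis=0)
            H = H_snap + np.mean([hess_i(x, i) - hess_i(x_snap, i) for i in S], axis=0)
            x = x + cubic_step(g, H, M)
    return x

x_final = svrc(np.ones(d))
print("final gradient norm:", np.linalg.norm(full_grad(x_final)))
```

The sampling-with-replacement variant is obtained simply by setting `replace=True`; the paper's contribution is the per-iteration batch sizes under each scheme that preserve CR's convergence rate while reducing the total Hessian sample complexity.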
