ML Seminar: Geometry and Regularization in Nonconvex Low-Rank Estimation

Event Status
Scheduled

Recent years have seen a flurry of activity in designing provably efficient nonconvex procedures for solving statistical estimation problems. The premise is that, despite nonconvexity, the loss function may possess benign geometric properties, such as local strong convexity or local restricted convexity, that enable fast global convergence under carefully designed initializations. In many sample-starved problems, this benign geometry only holds over a restricted region of the parameter space with certain structural properties, yet gradient descent seems to follow a trajectory that stays within this nice region without explicit regularization, and is therefore extremely computationally efficient while enjoying strong statistical guarantees. In this talk, we formally establish this “implicit regularization” phenomenon of gradient descent for the fundamental problem of estimating low-rank matrices from noisy incomplete, rank-one, or 1-bit measurements, by exploiting statistical modeling in the analysis of iterative optimization algorithms via a leave-one-out perturbation argument.
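
To make the setting concrete, below is a minimal sketch of the kind of vanilla gradient descent procedure the abstract refers to, instantiated for symmetric low-rank matrix completion: spectral initialization followed by plain gradient steps on the factored nonconvex loss, with no explicit regularization or projection. The function name matrix_completion_gd, the step-size rule, and all parameter values are illustrative assumptions, not taken from the talk.

    import numpy as np

    def matrix_completion_gd(M_obs, mask, r, p, eta=0.2, n_iters=500):
        """Vanilla gradient descent on f(X) = ||P_Omega(X X^T - M)||_F^2 / (2p)
        in the factored parameterization, with spectral initialization and no
        explicit regularization. M_obs = mask * M is the partially observed
        symmetric matrix; p is the observation probability."""
        # Spectral initialization: top-r eigenpairs of the rescaled observations.
        vals, vecs = np.linalg.eigh(M_obs / p)
        top = np.argsort(vals)[::-1][:r]
        X = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
        step = eta / max(vals[top].max(), 1e-12)  # step size scaled by the top eigenvalue
        for _ in range(n_iters):
            residual = mask * (X @ X.T) - M_obs   # P_Omega(X X^T - M)
            X -= step * (residual @ X) / p        # plain gradient step (up to a constant)
        return X

    # Illustrative use on a synthetic rank-r ground truth.
    rng = np.random.default_rng(0)
    n, r, p = 200, 3, 0.3
    X_star = rng.standard_normal((n, r))
    M = X_star @ X_star.T
    mask = rng.random((n, n)) < p
    mask = np.triu(mask) | np.triu(mask, 1).T     # symmetrize the sampling pattern
    X_hat = matrix_completion_gd(mask * M, mask.astype(float), r, p)
    print(np.linalg.norm(X_hat @ X_hat.T - M) / np.linalg.norm(M))

The point of the talk is that, in regimes like this, no extra projection, trimming, or regularization term is needed for the iterates to remain in the benign region; the sketch above reflects that by taking unmodified gradient steps.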

More information

https://users.ece.cmu.edu/~yuejiec/

Date and Time
Feb. 1, 2019, All Day