Monday, December 16, 2019
4:00 PM - 5:00 PM
Annenberg 213

Special CMX Seminar

A New Non-convex Optimization Framework For Low-rank Matrix Recovery with Mathematical Guarantee
Zhenzhen Li, Postdoctoral Student, Department of Mathematics, The Hong Kong University of Science and Technology
Speaker's Bio:
Zhenzhen Li currently works in the Department of Mathematics at The Hong Kong University of Science and Technology, doing research in applied mathematics, especially data science, provable algorithms for real-world problems, matrix analysis, optimization, and large-scale computing.

Recent years have witnessed the growing importance of non-convex methods in many industrial and practical problems. Many low-rank problems can be solved by reformulating them as non-convex optimization problems. Surprisingly, these optimization problems usually have no spurious local minima, and all saddle points are strict under the Euclidean parameterized setting. Although dimension-free polynomial convergence can be guaranteed for many problems, numerical experiments have demonstrated much better performance than the current theory predicts.

In contrast to previous non-convex methods (with weaker theoretical convergence guarantees) and convex relaxations (with heavier computational cost), this talk will discuss a new global non-convex optimization framework for solving a general inverse problem of low-rank matrices under the Riemannian manifold setting. Given some random measurements of a low-rank matrix, how does a least-squares loss behave under Riemannian gradient descent (in a computationally light version) with random initialization? Our analysis gives a rigorous mathematical treatment of both the asymptotic convergence behavior and the fast convergence rate under isometry or weaker-isometry conditions. More specifically, in the isometry case, the manifold of rank-r matrices (r << n) consists of 2^r branches. We will show that a random initialization falls into an intrinsic branch with probability 1, and that O(log n + log(1/epsilon)) iterations then suffice to produce an epsilon-accurate solution. Similar results also hold for low-rank matrix recovery given random information under mild conditions (the weaker-isometry case). Potential applications include, but are not limited to, low-rank matrix problems such as matrix sensing, matrix completion, low-rank Hankel matrix recovery, phase retrieval, robust PCA, and non-convex flow.
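To give a concrete sense of the kind of procedure the abstract describes (this is a generic sketch, not the speaker's specific algorithm), the following code runs Riemannian-style gradient descent for matrix sensing: take a Euclidean gradient step on the least-squares loss, then retract the iterate back onto the rank-r matrix manifold by truncated SVD, starting from a random initialization. All function names, step sizes, and problem dimensions here are illustrative assumptions.

```python
import numpy as np

def rank_r_truncate(M, r):
    """Retraction onto the rank-r manifold via truncated SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def matrix_sensing_rgd(As, y, n, r, step=0.4, iters=500, seed=None):
    """Recover a rank-r matrix from linear measurements y_k = <A_k, M>.

    As : (m, n, n) array of Gaussian sensing matrices
    y  : (m,) measurement vector
    Returns a rank-r iterate after `iters` gradient steps.
    """
    rng = np.random.default_rng(seed)
    m = len(y)
    # random initialization, retracted onto the rank-r manifold
    X = rank_r_truncate(rng.standard_normal((n, n)), r)
    for _ in range(iters):
        residual = np.einsum('kij,ij->k', As, X) - y   # <A_k, X> - y_k
        G = np.einsum('k,kij->ij', residual, As) / m   # Euclidean gradient of the loss
        X = rank_r_truncate(X - step * G, r)           # step, then retract to rank r
    return X
```

With Gaussian sensing matrices and enough measurements, the iterates contract toward the ground truth, consistent with the fast-convergence behavior discussed in the abstract; the step size and iteration count above are conservative choices for a small toy instance, not tuned values from the talk.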

For more information, please contact Jolene Brink by phone at (626) 395-2813, by email at jbrink@caltech.edu, or visit the CMX website.