UoP maths seminar (2013-14)



An Efficient Gauss-Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations.

Speaker: Xin Liu (Chinese Academy of Sciences)

Date and venue: Monday 21st July 2014, 11am, DS2.07 (refreshments provided)

Abstract: We derive and study a Gauss-Newton method for computing a symmetric low-rank product XX', where X ∈ R^{n×k} with k < n, that is closest to a given symmetric matrix A ∈ R^{n×n} in the Frobenius norm. When A = B'B (or BB'), this problem essentially reduces to finding a truncated singular value decomposition of B. Our Gauss-Newton method, which has a particularly simple form, shares the same order of iteration complexity as a gradient method when k ≪ n, but can be significantly faster on a wide range of problems. In this paper, we prove global convergence and a Q-linear convergence rate for this algorithm, and perform numerical experiments on various test problems, including those from the recently active areas of matrix completion and robust principal component analysis. Numerical results show that the proposed algorithm is capable of providing considerable speed advantages over Krylov subspace methods on suitable application problems. Moreover, the algorithm possesses a higher degree of concurrency than Krylov subspace methods, thus offering better scalability on modern multi/many-core computers.
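
To make the setting concrete, below is a minimal sketch in Python/NumPy of the underlying problem and one Gauss-Newton step, obtained by linearising the residual R(X) = XX' − A and solving the resulting least-squares subproblem in closed form. The function name sym_lowrank_gn, the stopping rule, and this particular solution of the (non-unique) linearised subproblem are illustrative assumptions; the paper's actual algorithm and its globalisation safeguards may differ in detail.

import numpy as np

def sym_lowrank_gn(A, X0, iters=100, tol=1e-10):
    # Sketch of a Gauss-Newton iteration for min_X ||X X' - A||_F,
    # with A symmetric n x n and X of size n x k. Each step solves
    # the linearised subproblem min_S ||X S' + S X' - (A - X X')||_F
    # in closed form; this is one particular solution of that
    # (non-unique) subproblem, without the paper's safeguards.
    X = X0.copy()
    for _ in range(iters):
        G = A @ X                      # n x k product; dominant cost when k << n
        W = np.linalg.inv(X.T @ X)     # k x k Gram inverse (assumes full column rank)
        M = X.T @ G                    # k x k matrix X' A X
        X_new = 0.5 * X + G @ W - 0.5 * (X @ W) @ (M @ W)
        if np.linalg.norm(X_new - X) <= tol * np.linalg.norm(X):
            return X_new
        X = X_new
    return X

# Toy check: with A = B'B, the best XX' corresponds to a truncated SVD of B.
rng = np.random.default_rng(0)
B = rng.standard_normal((8, 200))
A = B.T @ B
X = sym_lowrank_gn(A, rng.standard_normal((200, 3)))
print(np.linalg.norm(X @ X.T - A) / np.linalg.norm(A))   # relative residual

As a sanity check on the update formula: in the scalar case n = k = 1 it reduces to x+ = (x + a/x)/2, the classical Babylonian iteration for √a.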

Short Bio of speaker: Dr. Xin Liu is an Associate Professor at the State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China. He received his BSc in Computational Mathematics from the School of Mathematical Sciences at Peking University in 2004 and his PhD in Nonlinear Optimization from the Chinese Academy of Sciences in 2009. He has been a visiting scholar at institutions in Germany, the United States, Hong Kong, and Singapore. Dr. Liu's current research interest is computational methods for optimization, focusing mainly on nonlinear least squares, sparse optimization, and matrix decompositions.