Seminar: Computational Science, Optimization, Machine Learning

Asymmetry Helps: Eigenvalue and Eigenvector Analyses of Asymmetrically Perturbed Low-Rank Matrices

Speaker: Chen Cheng
arXiv:1811.12804

Abstract: This paper is concerned with a curious phenomenon in spectral estimation. Suppose we are interested in a rank-1 and symmetric matrix M⋆ ∈ R^{n×n}, yet only a randomly perturbed version M is observed. The perturbation/noise matrix M − M⋆ is composed of independent and zero-mean entries and is not symmetric. This might arise, for example, when we have two independent samples for each entry of M⋆ and arrange them into an asymmetric data matrix M. The aim is to estimate the leading eigenvalue and eigenvector of M⋆. Somewhat unexpectedly, our findings reveal that the leading eigenvalue of the data matrix M can be √n times more accurate than its leading singular value in eigenvalue estimation. Further, the perturbation of any linear form of the leading eigenvector of M (e.g., entrywise eigenvector perturbation) is provably well-controlled. We further provide partial theory for the more general rank-r case; this allows us to accommodate the case when M⋆ is rank-1 but asymmetric, by considering eigen-decomposition of the associated rank-2 dilation matrix. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigen-decomposition (as opposed to SVD) could sometimes be quite beneficial.
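The takeaway can be checked numerically. Below is a minimal NumPy sketch of the setup described in the abstract: a rank-1 symmetric M⋆ plus an asymmetric matrix of i.i.d. zero-mean noise, comparing the leading eigenvalue of M against its leading singular value as estimators of λ⋆. The dimension, signal strength, and noise level are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam_star = 2000, 200.0  # illustrative dimension and signal strength

# Rank-1 symmetric ground truth M* = lam* * u u^T with unit-norm u.
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
M_star = lam_star * np.outer(u, u)

# Asymmetric perturbation: i.i.d. zero-mean entries, NOT symmetrized.
M = M_star + rng.standard_normal((n, n))

# Estimator 1: leading eigenvalue of the (asymmetric) data matrix M.
eigvals = np.linalg.eigvals(M)
lam_hat = eigvals[np.argmax(eigvals.real)].real

# Estimator 2: leading singular value of M.
sig_hat = np.linalg.svd(M, compute_uv=False)[0]

print(f"eigenvalue error:     {abs(lam_hat - lam_star):.3f}")
print(f"singular value error: {abs(sig_hat - lam_star):.3f}")
```

For random, non-symmetric noise, the singular value picks up a systematic positive bias (roughly of order nσ²/λ⋆), while the leading eigenvalue's error stays near the much smaller first-order fluctuation, consistent with the √n gap claimed in the abstract.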