
Scaled Relative Graph: A Tool to Understand Many Optimization Methods and Convergence Analysis


Speaker

Wotao Yin, Alibaba US, Damo Academy and University of California, Los Angeles, USA

Time

2020.05.08 11:00-12:00

Venue

Online—ZOOM APP

ZOOM Info

Conference ID: 720-182-188
PIN Code: 892408

Abstract

Many iterative optimization algorithms can be viewed as fixed-point iterations of contractive or nonexpansive operators. Traditionally, such algorithms and operators are analyzed analytically, with inequalities. Since Eckstein and Bertsekas (Math Program 55:293-318, 1992), circles and half-spaces have been used to geometrically illustrate operator-theoretic notions, although the actual analyses, proofs, and computation of optimal stepsizes were still carried out analytically with inequalities.
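As a minimal illustration of the first sentence above (my own sketch, not from the talk): gradient descent on a strongly convex quadratic is the fixed-point iteration of a contractive operator, so its iterates converge to the unique fixed point, which is the minimizer.

```python
import numpy as np

# Illustrative sketch: gradient descent on f(x) = 0.5 x^T A x - b^T x,
# viewed as the fixed-point iteration x <- T(x) with
#     T(x) = x - alpha * (A x - b).
# For 0 < alpha < 2/L (L = largest eigenvalue of A), T is a contraction,
# so the iterates converge to the unique fixed point x* = A^{-1} b.
np.random.seed(0)
M = np.random.randn(5, 5)
A = M @ M.T + np.eye(5)          # symmetric positive definite
b = np.random.randn(5)
L = np.linalg.eigvalsh(A).max()  # Lipschitz constant of the gradient
alpha = 1.0 / L                  # step size inside (0, 2/L)

x = np.zeros(5)
for _ in range(2000):
    x = x - alpha * (A @ x - b)  # one application of the operator T

x_star = np.linalg.solve(A, b)
print(np.linalg.norm(x - x_star))  # small: iterates reached the fixed point
```

The contraction factor here is 1 - mu/L (mu = smallest eigenvalue of A), which is exactly the kind of quantity the inequality-based analyses mentioned above estimate and which the talk re-derives geometrically.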

In this talk, we formalize a correspondence between common operators (such as proximal mappings and subdifferentials of convex functions) and geometric objects on the complex plane. We use elementary Euclidean geometry to rapidly prove many useful results regarding the convergence of fixed-point iterations and their optimal stepsizes. The formalism maps various classes of operators to sets on the complex plane, and also maps algebraic operations such as scaling, inversion, addition, and composition of operators to geometric operations on those sets. Equipped with these tools, we use geometric arguments to review classic results and obtain two novel convergence results. This talk includes joint work with Ernest Ryu and Robert Hannah (arXiv:1902.09788) and with Xinmeng Huang and Ernest Ryu (arXiv:1912.01593).
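The operator-to-complex-plane correspondence can be sketched numerically. In the scaled relative graph of arXiv:1902.09788, each pair of points x != y contributes complex numbers z with Re z = &lt;Tx - Ty, x - y&gt; / ||x - y||^2 and |z| = ||Tx - Ty|| / ||x - y||; for a nonexpansive operator every such z lies in the closed unit disk. The function and operator names below are illustrative assumptions, not code from the talk.

```python
import numpy as np

# Illustrative sketch of the scaled relative graph (SRG): sample pairs
# (x, y), and for each pair record the complex number z whose real part is
# the normalized inner product <Tx - Ty, x - y> / ||x - y||^2 and whose
# modulus is the ratio ||Tx - Ty|| / ||x - y||.  (The SRG also contains the
# conjugate of each z; we keep only the upper-half-plane point.)
def srg_samples(T, dim, n_pairs=2000, seed=1):
    rng = np.random.default_rng(seed)
    zs = []
    for _ in range(n_pairs):
        x, y = rng.standard_normal(dim), rng.standard_normal(dim)
        dx, du = x - y, T(x) - T(y)
        re = (dx @ du) / (dx @ dx)              # normalized inner product
        mod = np.linalg.norm(du) / np.linalg.norm(dx)
        im = np.sqrt(max(mod**2 - re**2, 0.0))  # orthogonal component
        zs.append(complex(re, im))
    return np.array(zs)

# The proximal mapping of f(x) = 0.5||x||^2 is x / 2, a firmly nonexpansive
# operator; its SRG samples all collapse to the single point 0.5, inside the
# disk of radius 1/2 centered at 1/2 that characterizes firm nonexpansiveness.
zs = srg_samples(lambda x: x / 2, dim=4)
print(np.abs(zs - 0.5).max())  # essentially zero for this linear example
```

Algebraic operations on operators then become geometric operations on these sets; for instance, scaling T by t scales every sample z by t.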

Bio

Dr. Wotao Yin received his B.S. in Mathematics and Applied Mathematics from Nanjing University in 2001 and his Ph.D. in Operations Research from Columbia University in 2006. He is a Professor in the Department of Mathematics, University of California, Los Angeles, currently on leave at Alibaba US, Damo Academy. His research interests include computational optimization and its applications in signal processing, machine learning, and other data science problems. He invented fast algorithms for sparse optimization and large-scale distributed optimization problems. During 2006-2013, he was at Rice University. He received the NSF CAREER award in 2008, an Alfred P. Sloan Research Fellowship in 2009, and the Morningside Gold Medal in 2016, and has co-authored five papers that received best-paper-type awards.
