A Stein Variational Newton Method

Speaker

Tiangang Cui, Monash University

Time

2020.01.08 11:00-12:00

Venue

Room 305, No.5 Science Building

Abstract

Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback–Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent on a reproducing kernel Hilbert space. We accelerate and generalize the SVGD algorithm by including second-order information, thereby approximating a Newton-like iteration in function space. We also show how second-order information can lead to more effective choices of the kernel. We observe significant computational gains over the original SVGD algorithm in multiple test cases.
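As background for the talk, the first-order SVGD update moves a set of particles {x_i} along the direction phi(x) = E_{x'~q}[ k(x', x) grad log p(x') + grad_{x'} k(x', x) ], which balances a kernel-weighted drift toward high-density regions of the target against a repulsion that keeps the particles spread out. Below is a minimal NumPy sketch of this baseline update with an RBF kernel and the median bandwidth heuristic; the step size, particle count, and Gaussian toy target are illustrative assumptions, not from the talk, and the Newton-like method in the abstract would further precondition this direction with second-order (Hessian) information.

```python
import numpy as np

def svgd_step(X, grad_log_p, step_size=0.05):
    """One SVGD update on particles X of shape (n, d), using an RBF kernel."""
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]      # diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)     # pairwise squared distances
    # Median heuristic for the kernel bandwidth (a common choice, not from the talk).
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8
    K = np.exp(-sq_dists / h)                  # K[j, i] = k(x_j, x_i), symmetric
    # Drift term: kernel-weighted average of the target's score at each particle.
    drift = K @ grad_log_p(X)
    # Repulsion term: sum_j grad_{x_j} k(x_j, x_i) = -(2/h) sum_j (x_j - x_i) k(x_j, x_i).
    repulsion = -(2.0 / h) * np.einsum('ji,jid->id', K, diffs)
    return X + step_size * (drift + repulsion) / n

# Toy usage: approximate a 2-D standard Gaussian, whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(scale=3.0, size=(200, 2))       # over-dispersed initial particles
for _ in range(1000):
    X = svgd_step(X, lambda x: -x)
print(X.mean(axis=0), X.std(axis=0))           # should approach [0, 0] and [1, 1]
```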