
Regression of functions on low-dimensional manifolds by deep ReLU networks

Speaker

Wenjing Liao, Georgia Institute of Technology

Time

2019.12.17 14:00-15:00

Venue

Room 706, No.6 Science Building

Abstract

Many datasets in real-world applications lie in a high-dimensional space but are concentrated on or near a low-dimensional manifold. Our goal is to estimate functions on the manifold from finite samples of data. This talk focuses on an efficient approximation theory of deep ReLU networks for functions supported on low-dimensional manifolds. We construct a ReLU network for such function approximation where the size of the network grows exponentially with respect to the intrinsic dimension of the manifold. When the function is estimated from finite samples, we prove that the mean squared error of the function approximation converges, as the number of training samples increases, at a rate depending on the intrinsic dimension of the manifold. These results demonstrate that deep neural networks are adaptive to low-dimensional geometric structures of data. This is joint work with Minshuo Chen, Haoming Jiang, and Tuo Zhao at Georgia Institute of Technology.
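To see why a rate governed by the intrinsic dimension matters, consider the standard nonparametric form n^(-2s/(2s+d)), where n is the sample size, s a smoothness index, and d the dimension entering the rate. The sketch below is a hypothetical numeric illustration (the exact exponents and constants from the talk's results are not reproduced here; s = 1 is an assumption): with the same sample budget, a rate driven by a small intrinsic dimension shrinks far faster than one driven by the ambient dimension.

```python
def rate(n, d, s=1.0):
    """Nonparametric-style error rate n^(-2s/(2s+d)).

    Hypothetical illustration only: s (smoothness) and the exact
    exponent form are assumptions, not the talk's precise theorem.
    """
    return n ** (-2.0 * s / (2.0 * s + d))

n = 10**6                      # number of training samples
err_intrinsic = rate(n, d=2)   # rate set by intrinsic dimension 2
err_ambient = rate(n, d=100)   # rate set by ambient dimension 100

print(err_intrinsic)  # 1e6^(-1/2) = 0.001
print(err_ambient)    # 1e6^(-2/102), roughly 0.76
```

The gap of several orders of magnitude is the practical content of "adaptive to low-dimensional geometric structures": a network whose error depends on d = 2 rather than d = 100 needs vastly fewer samples for the same accuracy.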

Bio

Dr. Wenjing Liao is an assistant professor in the School of Mathematics at Georgia Tech. She obtained her Ph.D. in mathematics from the University of California, Davis in 2013, and her B.S. from Fudan University in 2008. She was a visiting assistant professor at Duke University from 2013 to 2016, as well as a postdoctoral fellow at the Statistical and Applied Mathematical Sciences Institute from 2013 to 2015. She worked at Johns Hopkins University as an assistant research scientist from 2016 to 2017. She works on theory and algorithms at the intersection of applied mathematics, machine learning, and signal processing. Her current research interests include deep learning, manifold learning, and super-resolution in imaging and signal processing.