Exploring Redundancy in Deep Neural Networks

Speaker

Chenglong Bao, Tsinghua University

Time

2019.12.30 14:00-15:00

Venue

Room 305, No.5 Science Building

Abstract

Deep neural networks have been widely used in many applications, and classification accuracy tends to increase as the networks grow larger. However, their heavy computation and storage requirements have prevented deployment on resource-limited devices. In this talk, we will first show that redundancy exists in current CNNs under the PAC framework. Second, we will propose a self-distillation technique that compresses deep neural networks while supporting dynamic inference.