The backpropagation (BP) algorithm is one of the most fundamental learning algorithms in deep learning. Although BP has been widely used, it still suffers from easily falling into local minima due to its gradient dynamics. Inspired by the observation that learning in real brains may exploit chaotic dynamics, we propose the chaotic backpropagation (CBP) algorithm, which integrates the intrinsic chaos of real neurons into BP. Through validation on multiple datasets (e.g., CIFAR-10), we show that, for the multilayer perceptron (MLP), CBP significantly outperforms BP and its variants in both optimization and generalization, from computational as well as theoretical viewpoints. In fact, CBP can be regarded as a general form of BP endowed with a global searching ability inspired by the chaotic learning process in the brain. Therefore, CBP not only has the potential to complement or replace BP in deep learning practice, but also provides a new way of understanding the learning process of the real brain.
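To make the core idea concrete, the sketch below shows one way an annealed chaotic perturbation could be added to an otherwise standard gradient update so that the iterate can escape local minima early on and settle into ordinary gradient descent as the chaos is annealed away. The logistic-map noise source, the annealing schedule, the toy loss, and all hyperparameters (`z0`, `beta`, `lr`) are illustrative assumptions, not the exact CBP update rule proposed in the paper.

```python
# Hypothetical sketch: gradient descent on a toy non-convex loss, augmented
# with an annealed chaotic self-feedback term driven by a logistic map.
# This illustrates the general "chaos + BP" idea only; the specific update
# rule and hyperparameters are assumptions, not the authors' formulation.
import numpy as np

rng = np.random.default_rng(0)

# Toy non-convex loss with a local and a global minimum per coordinate.
def loss(w):
    return np.sum(w**4 - 3.0 * w**2 + w)

def grad(w):
    return 4.0 * w**3 - 6.0 * w + 1.0

def chaotic_gd(w0, lr=0.01, z0=1.0, beta=0.003, steps=2000):
    """Gradient descent plus a chaotic perturbation whose amplitude z(t)
    is annealed toward zero, so the dynamics reduce to plain GD."""
    w = w0.copy()
    x = rng.uniform(0.1, 0.9, size=w.shape)   # logistic-map state per weight
    z = z0                                     # chaotic-term amplitude
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)                # fully chaotic logistic map
        w = w - lr * grad(w) + z * (x - 0.5)   # BP-style step + chaotic term
        z *= (1.0 - beta)                      # exponential annealing
    return w

w0 = rng.normal(size=5)
w_gd = w0.copy()
for _ in range(2000):                          # plain gradient descent baseline
    w_gd -= 0.01 * grad(w_gd)
w_cgd = chaotic_gd(w0)
print(f"plain GD loss:   {loss(w_gd):.4f}")
print(f"chaotic GD loss: {loss(w_cgd):.4f}")
```

Under these assumptions, the chaotic term lets the iterate wander across basins while its amplitude is large, and the annealing recovers plain gradient descent in the limit, mirroring the claim above that CBP can be viewed as a general form of BP with an added global-search phase.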