Workshop on Inverse Problems and Uncertainty Quantification

Two-layer networks with the $\mathrm{ReLU}^k$ activation function: Barron spaces and derivative approximation

Speaker

Shuai Lu, Fudan University

Time

18 Nov, 08:40 - 09:10

Abstract

We investigate the use of two-layer networks with the rectified power unit, known as the $\mathrm{ReLU}^k$ activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the $\mathrm{ReLU}^k$ activation function are well suited to simultaneously approximating an unknown function and its derivatives. When the measurements are noisy, we propose a Tikhonov-type regularization method and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples demonstrate the efficiency of the proposed approach. This is joint work with Yuanyuan Li (Fudan), Peter Mathé (WIAS), and Sergei V. Pereverzev (RICAM).
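For illustration only, the following is a minimal sketch of the ingredients named above, not the method analyzed in the talk: a two-layer $\mathrm{ReLU}^k$ network with randomly drawn inner weights, whose outer coefficients are fit by a Tikhonov-type regularized least-squares problem; since $\frac{d}{dx}\,\mathrm{relu}(wx+b)^k = k\,\mathrm{relu}(wx+b)^{k-1}w$, the same coefficients also yield a derivative approximation. The target function, width, power `k`, and regularization parameter `alpha` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
k, width, alpha = 3, 200, 1e-6   # power, number of neurons, regularization parameter (assumed)

def relu_k(z, p):
    """Rectified power unit: max(z, 0)^p."""
    return np.maximum(z, 0.0) ** p

# Noisy samples of an unknown 1-D target function (illustrative choice).
x = np.linspace(-1.0, 1.0, 100)
y = np.sin(np.pi * x) + 0.01 * rng.standard_normal(x.shape)

# Random inner layer; feature matrix Phi[i, j] = relu(w_j * x_i + b_j)^k.
w = rng.standard_normal(width)
b = rng.uniform(-1.0, 1.0, width)
Phi = relu_k(np.outer(x, w) + b, k)

# Tikhonov-type regularization on the outer coefficients a:
# minimize ||Phi a - y||^2 + alpha * ||a||^2.
a = np.linalg.solve(Phi.T @ Phi + alpha * np.eye(width), Phi.T @ y)

# The same coefficients approximate the derivative (valid for k >= 2).
dPhi = k * relu_k(np.outer(x, w) + b, k - 1) * w
f_approx, df_approx = Phi @ a, dPhi @ a
print("max function error:  ", np.max(np.abs(f_approx - np.sin(np.pi * x))))
print("max derivative error:", np.max(np.abs(df_approx - np.pi * np.cos(np.pi * x))))
```

Freezing the inner weights reduces training to a linear problem, which makes the role of the regularization parameter `alpha` explicit; the talk's error bounds concern the appropriate choice of this parameter under noise.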