We investigate the use of two-layer networks with the rectified power unit, also known as the $ReLU^k$ activation function, for the approximation of functions and their derivatives. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the $ReLU^k$ activation function are well suited to simultaneously approximate an unknown function and its derivatives. When the measurements are noisy, we propose a Tikhonov-type regularization method and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples illustrate the efficiency of the proposed approach. This is joint work with Yuanyuan Li (Fudan), Peter Mathe (WIAS) and Sergei V. Pereverzev (RICAM).
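For concreteness, a minimal sketch of the setting described above; the width $n$, parameters $a_i, w_i, b_i$, penalty $\Omega$, and regularization parameter $\lambda$ are illustrative notation, not taken from the abstract:
\[
ReLU^k(t) = \bigl(\max\{0, t\}\bigr)^k, \qquad
f_n(x) = \sum_{i=1}^{n} a_i\, ReLU^k(w_i \cdot x + b_i),
\]
and, given noisy samples $y_j \approx f(x_j)$, a Tikhonov-type estimator of the generic form
\[
\min_{\{a_i, w_i, b_i\}} \ \frac{1}{m}\sum_{j=1}^{m} \bigl(f_n(x_j) - y_j\bigr)^2 \;+\; \lambda\, \Omega(f_n),
\]
where $\Omega$ is a penalty (for instance one related to the Barron norm) and $\lambda > 0$ is the regularization parameter; the specific penalty and error bounds used in the work may differ from this generic form.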