This talk concerns operator learning for multiscale partial differential equations (PDEs), where preserving high-frequency information is critical. We develop neural operators based on hierarchical attention and dilated convolutions that achieve state-of-the-art performance on multiscale operator learning tasks. These neural operators enable efficient forward and inverse solutions of multiscale problems. We evaluate their performance on a range of benchmarks, including the multiscale elliptic equation and its inverse problem, the Navier-Stokes equations, and the Helmholtz equation.