In this talk, we address the metric matrix learning problem and its applications in classification and data retrieval. First, we formulate a distance metric learning model that accounts for both intra-class and inter-class metric information, and we recast the model as a minimization problem whose variable is a symmetric positive definite matrix. Moreover, exploiting the geometric structure of the symmetric positive definite matrix manifold, we derive an intrinsic steepest descent method whose implementation guarantees that the metric matrix remains strictly symmetric positive definite at every iteration. Then, we extend the model from the linear to the nonlinear case. Finally, we evaluate the proposed algorithm on standard benchmark datasets.
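Since the abstract does not spell out the update rule, the following is a minimal sketch of one intrinsic steepest descent step on the SPD manifold, assuming the affine-invariant metric and a geodesic (exponential-map) update; the function name, step size, and gradient input are illustrative assumptions rather than the speakers' exact algorithm.

```python
import numpy as np
from scipy.linalg import expm, sqrtm

def spd_descent_step(M, euclid_grad, step):
    """One intrinsic gradient step on the SPD manifold (a sketch).

    Moves along the geodesic of the (assumed) affine-invariant metric,
    so the iterate stays strictly symmetric positive definite by
    construction instead of requiring a projection back onto SPD.

    M           -- current SPD metric matrix, shape (d, d)
    euclid_grad -- Euclidean gradient of the objective at M (symmetric)
    step        -- positive step size
    """
    M_half = np.real(sqrtm(M))
    # For the affine-invariant metric the Riemannian gradient is
    # M * grad * M; inside the exponential map the update reduces to
    # M_{k+1} = M^{1/2} expm(-t * M^{1/2} grad M^{1/2}) M^{1/2}.
    inner = M_half @ euclid_grad @ M_half
    return M_half @ expm(-step * inner) @ M_half

# Toy usage: one step from the identity metric with a symmetric gradient.
d = 3
M = np.eye(d)
G = np.random.randn(d, d)
G = 0.5 * (G + G.T)  # symmetrize a placeholder gradient
M_next = spd_descent_step(M, G, step=0.1)
print(np.linalg.eigvalsh(M_next))  # all eigenvalues remain positive
```

Because the update is a congruence of a matrix exponential (which is always SPD) by an invertible factor, positive definiteness is preserved exactly, which matches the property claimed for the intrinsic method in the abstract.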