We take another look at the statistical calibration problem in computer models. This viewpoint is motivated by two overarching practical considerations: (i) many computer models are inadequate for perfectly modeling physical systems, even with the best-tuned calibration parameters; (ii) only a finite number of data points are available from the physical experiment used to calibrate a related computer model. Following this line of thinking, we develop a non-asymptotic theory and derive a prediction-oriented calibration method. The proposed method minimizes the predictive mean squared error under a finite sample size, with statistical guarantees. We introduce an algorithm to carry out the proposed calibration and connect it to existing Bayesian calibration methods. Synthetic and real examples corroborate the derived theory and illustrate the advantages of the proposed method.