
Penalized Projected Kernel Calibration for Computer Models

Published 1 Mar 2021 in stat.ME (arXiv:2103.00807v3)

Abstract: Projected kernel calibration is a recently proposed frequentist calibration method that is asymptotically normal and semi-parametric. Its loss function is usually referred to as the PK loss function. In this work, we prove the uniform convergence of the PK loss function and show that (1) when the sample size is large, every local minimum and local maximum of the $L_2$ loss between the true process and the computer model is a local minimum of the PK loss function; (2) all the local minima of the PK loss function converge to the same value. These theoretical results imply that it is extremely hard for projected kernel calibration to identify the global minimum of the $L_2$ loss, i.e., the optimal value of the calibration parameters. To solve this problem, a frequentist method, which we term penalized projected kernel calibration, is proposed and analyzed in detail. We prove that the proposed method is as efficient as projected kernel calibration. Through an extensive set of numerical simulations and a real-world case study, we show that the proposed method can accurately estimate the calibration parameters, and that its performance compares favorably to other calibration methods regardless of the sample size.
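The core difficulty the abstract describes can be illustrated with a toy example (this is not the paper's actual estimator; the surrogate loss, computer model, and penalty weight below are all hypothetical): a loss whose local minima all share the same value cannot single out the $L_2$-optimal parameter, but adding a penalty proportional to the $L_2$ loss breaks the tie.

```python
import numpy as np

# Hypothetical "true process" observed on a grid.
x = np.linspace(0.0, 2.0 * np.pi, 200)
y_true = np.sin(x)

def l2_loss(theta):
    # Squared L2 distance between the true process and a toy
    # computer model sin(theta * x); minimized at theta = 1.
    return np.mean((y_true - np.sin(theta * x)) ** 2)

def surrogate_loss(theta):
    # Stand-in for a PK-style loss: local minima at theta = 0.5, 1.0, 1.5,
    # all attaining the same value -1, so none is identifiable on its own.
    return -np.cos(4.0 * np.pi * theta)

thetas = np.linspace(0.5, 1.5, 1001)
lam = 0.5  # penalty weight (a tuning parameter in this sketch)
penalized = np.array([surrogate_loss(t) + lam * l2_loss(t) for t in thetas])

# The penalized loss is minimized only at the L2-optimal parameter.
theta_hat = thetas[np.argmin(penalized)]
print(round(theta_hat, 2))  # -> 1.0
```

Without the penalty (`lam = 0`), a grid search over the surrogate loss alone cannot distinguish the three equally deep minima; the penalty term steers the estimate toward the global minimizer of the $L_2$ loss.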

