Models Parametric Analysis via Adaptive Kernel Learning
Abstract: Any applied mathematical model contains parameters. This paper proposes using kernel learning for the parametric analysis of such models. The approach consists of placing a distribution on the parameter space, drawing a finite training sample from this distribution, solving the model for each parameter value in the sample, and constructing a kernel approximation of the parametric dependence over the entire set of parameter values. The kernel approximation is obtained by minimizing the approximation error on the training sample and adjusting the kernel parameters (widths) on the same or an independent sample of parameters. This approach to learning complex dependencies is called kernel learning (or kernel SVM). Traditionally, kernel learning is carried out in a Reproducing Kernel Hilbert Space (RKHS) with a fixed kernel. The novelty of our approach is that we consider kernel learning in a broader subspace of square-integrable functions with the corresponding L2-norm regularization. This subspace contains linear combinations of kernel functions whose kernels may have different shapes at different data points. The approach makes essential use of a derived analytical representation of the L2-norm for kernel functions. Thus the approach substantially extends the flexibility of the traditional kernel SVM by increasing the number of adjustable parameters: the training error is minimized not only over the kernel weights but also over the kernel shapes. The important issue of selecting the regularization parameter is resolved by minimizing the test error over this parameter. Numerical illustrations are provided.
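As a concrete illustration of the pipeline in the abstract, the following is a minimal 1-D sketch in Python. Several choices here are assumptions, not the paper's method: Gaussian kernels (whose pairwise L2 inner products admit the closed form used below), a uniform parameter distribution, the toy function solve_model standing in for an actual model solve, and a crude grid search over a single shared kernel width and the regularization parameter, whereas the paper adjusts kernel shapes per data point.

```python
import numpy as np

# Toy "model": the parametric dependence to approximate. In practice each
# evaluation would mean solving the applied model for one parameter value
# (this function is purely illustrative).
def solve_model(theta):
    return np.sin(3.0 * theta) + 0.5 * theta**2

rng = np.random.default_rng(0)

# Step 1: set a distribution on the parameter space, draw training and
# independent test samples, and solve the model at each sampled value.
theta_train = rng.uniform(-2.0, 2.0, size=40)
theta_test = rng.uniform(-2.0, 2.0, size=40)
y_train = solve_model(theta_train)
y_test = solve_model(theta_test)

def gauss_kernel(x, centers, widths):
    # K[n, i] = exp(-(x_n - c_i)^2 / (2 s_i^2)), one width per center.
    d = x[:, None] - centers[None, :]
    return np.exp(-0.5 * (d / widths[None, :]) ** 2)

def l2_gram(centers, widths):
    # Closed-form L2 inner products of 1-D Gaussian kernel functions:
    # <k_i, k_j> = sqrt(2*pi) * s_i s_j / sqrt(s_i^2 + s_j^2)
    #              * exp(-(c_i - c_j)^2 / (2 (s_i^2 + s_j^2))).
    s2 = widths[:, None] ** 2 + widths[None, :] ** 2
    d = centers[:, None] - centers[None, :]
    return (np.sqrt(2.0 * np.pi) * widths[:, None] * widths[None, :]
            / np.sqrt(s2)) * np.exp(-0.5 * d**2 / s2)

def fit_weights(widths, lam):
    # Step 2: minimize ||K w - y||^2 + lam * w^T G w, where G is the L2
    # Gram matrix; this is a linear system in the kernel weights w.
    K = gauss_kernel(theta_train, theta_train, widths)
    G = l2_gram(theta_train, widths)
    return np.linalg.solve(K.T @ K + lam * G, K.T @ y_train)

def test_error(widths, lam):
    w = fit_weights(widths, lam)
    K_test = gauss_kernel(theta_test, theta_train, widths)
    return np.mean((K_test @ w - y_test) ** 2)

# Step 3: adjust the kernel width and the regularization parameter by
# minimizing the error on the independent test sample (a shared width
# and a coarse grid keep the sketch short).
best = min((test_error(np.full(theta_train.size, s), lam), s, lam)
           for s in (0.1, 0.2, 0.4, 0.8)
           for lam in (1e-6, 1e-4, 1e-2))
err, s_opt, lam_opt = best
print(f"width={s_opt}, lambda={lam_opt}, test MSE={err:.2e}")
```

The point of departure from a standard RKHS fit is the regularizer w^T G w, built from the analytical L2 inner products of the kernel functions; unlike the RKHS norm with a fixed kernel, it remains well defined when each center carries its own width, which is what lets the kernel shapes become adjustable parameters.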