Reproducing Kernel Banach Spaces with the l1 Norm II: Error Analysis for Regularized Least Square Regression
Published 24 Jan 2011 in stat.ML, cs.LG, and math.FA | arXiv:1101.4439v2
Abstract: A typical approach to estimating the learning rate of a regularized learning scheme is to bound the approximation error by the sum of the sampling error, the hypothesis error, and the regularization error. Using a reproducing kernel space that satisfies the linear representer theorem brings the advantage of automatically discarding the hypothesis error from the sum. Following this direction, we illustrate how reproducing kernel Banach spaces with the l1 norm can be applied to improve the learning rate estimate of l1-regularization in machine learning.
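The l1-regularized least square regression the abstract refers to can be illustrated with a minimal sketch. The code below is not the paper's method; it is a generic coordinate-descent solver for the lasso objective (1/(2n))·||y − Xw||² + λ·||w||₁, included only to make the regularization scheme under analysis concrete. The function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize (1/(2n))*||y - Xw||^2 + lam*||w||_1 by cyclic coordinate descent."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature terms
    for _ in range(n_iter):
        for j in range(d):
            # Residual with coordinate j's contribution removed
            r_j = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r_j / n
            # Closed-form one-dimensional update via soft thresholding
            w[j] = soft_threshold(rho, lam) / col_sq[j]
    return w

# Illustrative usage: recover a sparse coefficient vector from noiseless data
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
w_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ w_true
w_hat = lasso_coordinate_descent(X, y, lam=0.01)
```

With a small λ and noiseless data, the solver drives the irrelevant coefficients toward zero while approximately recovering the active ones, which is the sparsity behavior that motivates the l1 penalty studied in the paper.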