Concentration Inequalities for Suprema of Empirical Processes with Dependent Data via Generic Chaining with Applications to Statistical Learning
Abstract: This paper develops a general concentration inequality for the suprema of empirical processes with dependent data. The concentration inequality is obtained by combining generic chaining with a coupling-based strategy. Our framework accommodates high-dimensional and heavy-tailed (sub-Weibull) data. We demonstrate the usefulness of our result by deriving non-asymptotic predictive performance guarantees for empirical risk minimization in regression problems with dependent data. In particular, we establish an oracle inequality for a broad class of nonlinear regression models and, as a special case, a single-layer neural network model. Our results show that empirical risk minimization with dependent data attains a prediction accuracy comparable to that in the i.i.d. setting for a wide range of nonlinear regression models.
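The setting the abstract describes can be illustrated with a minimal sketch (not from the paper): empirical risk minimization for a single-hidden-layer network on temporally dependent regression data, here an AR(1) covariate sequence. All model choices (AR coefficient, network width, learning rate) are illustrative assumptions.

```python
# Illustrative sketch, not the paper's method: ERM with a single-layer
# tanh network on dependent (AR(1)) covariates, fit by gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Dependent covariates: AR(1) sequence x_t = 0.5 * x_{t-1} + eps_t.
n, d, width = 500, 3, 16
X = np.zeros((n, d))
for t in range(1, n):
    X[t] = 0.5 * X[t - 1] + rng.normal(size=d)

# Ground truth: a single-hidden-layer network y = v^T tanh(W x) + noise.
W_true = rng.normal(size=(width, d))
v_true = rng.normal(size=width) / width
y = np.tanh(X @ W_true.T) @ v_true + 0.1 * rng.normal(size=n)

def empirical_risk(W, v):
    """Empirical squared-error risk of the network (W, v)."""
    return np.mean((np.tanh(X @ W.T) @ v - y) ** 2)

# ERM: minimize the empirical risk over (W, v) by gradient descent.
W = 0.1 * rng.normal(size=(width, d))
v = np.zeros(width)
lr = 0.1
init_risk = empirical_risk(W, v)
for _ in range(200):
    H = np.tanh(X @ W.T)                    # hidden activations, (n, width)
    r = H @ v - y                           # residuals, (n,)
    grad_v = 2 * H.T @ r / n                # d(risk)/dv
    grad_H = 2 * np.outer(r, v) * (1 - H ** 2)
    grad_W = grad_H.T @ X / n               # d(risk)/dW via chain rule
    v -= lr * grad_v
    W -= lr * grad_W
final_risk = empirical_risk(W, v)
```

The paper's contribution concerns the theory behind such a procedure: bounding the supremum of the (dependent-data) empirical process so that `final_risk` provably tracks the population risk at near-i.i.d. rates.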