When is it worthwhile to jackknife? Breaking the quadratic barrier for Z-estimators
Abstract: Resampling methods are especially well-suited to inference with estimators that provide only "black-box" access. The jackknife is a form of resampling, widely used for bias correction and variance estimation, that is well-understood under classical scaling where the sample size $n$ grows for a fixed problem. We study its behavior when applied to estimating functionals using high-dimensional $Z$-estimators, allowing both the sample size $n$ and the problem dimension $d$ to diverge. We begin by showing that the plug-in estimator based on the $Z$-estimate suffers from a quadratic breakdown: while it is $\sqrt{n}$-consistent and asymptotically normal whenever $n \gtrsim d^2$, it fails for a broad class of problems whenever $n \lesssim d^2$. We then show that under suitable regularity conditions, applying a jackknife correction yields an estimate that is $\sqrt{n}$-consistent and asymptotically normal whenever $n \gtrsim d^{3/2}$. This provides strong motivation for the use of the jackknife in high-dimensional problems where the dimension is moderate relative to the sample size. We illustrate the consequences of our general theory for various specific $Z$-estimators, including non-linear functionals in linear models; generalized linear models; and the inverse propensity score weighting (IPW) estimate for the average treatment effect, among others.
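The sketch below illustrates the kind of delete-one jackknife correction to a plug-in estimator that the abstract refers to. The concrete choices here (ordinary least squares as the $Z$-estimator and the squared norm of the coefficient vector as the functional) are illustrative assumptions for the example, not the paper's specific setup.

```python
# Minimal sketch: jackknife bias correction of a plug-in functional of a Z-estimator.
# Assumptions for illustration: OLS as the Z-estimator, squared norm as the functional.
import numpy as np

def z_estimate(X, y):
    """Least-squares Z-estimator: solves the estimating equation X^T (y - X b) = 0."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def functional(beta):
    """Example nonlinear functional of the parameter (assumed for illustration)."""
    return float(np.dot(beta, beta))

def jackknife_corrected(X, y):
    """Plug-in estimate of the functional with a delete-one jackknife bias correction."""
    n = X.shape[0]
    theta_full = functional(z_estimate(X, y))
    # Leave-one-out plug-in estimates.
    loo = np.array([
        functional(z_estimate(np.delete(X, i, axis=0), np.delete(y, i)))
        for i in range(n)
    ])
    # Jackknife bias estimate: (n - 1) * (mean of leave-one-out values - full-sample value).
    bias = (n - 1) * (loo.mean() - theta_full)
    return theta_full - bias

# Usage: dimension moderate relative to sample size, the regime the paper targets.
rng = np.random.default_rng(0)
n, d = 500, 40
X = rng.standard_normal((n, d))
beta_true = rng.standard_normal(d) / np.sqrt(d)
y = X @ beta_true + rng.standard_normal(n)
print(jackknife_corrected(X, y))
```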