
Selective inference for effect modification via the lasso

Published 22 May 2017 in stat.ME, math.ST, and stat.TH (arXiv:1705.08020v4)

Abstract: Effect modification occurs when the effect of a treatment on an outcome varies with the level of other covariates, and it often has important implications for decision making. When there are tens or hundreds of covariates, it becomes necessary to use the observed data to select a simpler model for effect modification and then make valid statistical inference for the selected model. We propose a two-stage procedure to solve this problem. First, we use Robinson's transformation to decouple the nuisance parameters from the treatment effect of interest and use machine learning algorithms to estimate the nuisance parameters. Next, after plugging in the estimates of the nuisance parameters, we use the lasso to choose a low-complexity model for effect modification. Compared to a full model consisting of all the covariates, the selected model is much more interpretable; compared to univariate subgroup analyses, it greatly reduces the number of false discoveries. We show that conditional selective inference for the selected model is asymptotically valid under the rate assumptions of classical semiparametric regression. Extensive simulation studies verify the asymptotic results, and an epidemiological application demonstrates the method.
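The two-stage procedure described in the abstract can be sketched on simulated data. This is a minimal illustration, not the paper's implementation: the random-forest nuisance estimators, the cross-fitting scheme, the simulated data-generating process, and the cross-validated lasso are all assumptions, and the paper's final conditional selective-inference step (inference conditional on the selected model) is omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n, p = 2000, 10
X = rng.normal(size=(n, p))
e = 1.0 / (1.0 + np.exp(-X[:, 0]))        # true propensity score (depends on X0)
T = rng.binomial(1, e).astype(float)
tau = 1.0 + 2.0 * X[:, 1]                 # true effect, modified by X1 only
Y = X[:, 0] + T * tau + rng.normal(size=n)

# Stage 1: cross-fitted ML estimates of the nuisances
# m(x) = E[Y | X = x] and e(x) = E[T | X = x]
m_hat = cross_val_predict(
    RandomForestRegressor(n_estimators=100, random_state=0), X, Y, cv=5)
e_hat = cross_val_predict(
    RandomForestRegressor(n_estimators=100, random_state=0), X, T, cv=5)

# Robinson's transformation: residualize outcome and treatment
R_y = Y - m_hat
R_t = T - e_hat

# Stage 2: lasso on residualized treatment-covariate interactions.
# Column 0 of Z is R_t itself (the average effect); columns 1..p are
# R_t * X_j, whose coefficients encode effect modification.
Z = R_t[:, None] * np.column_stack([np.ones(n), X])
fit = LassoCV(cv=5).fit(Z, R_y)
selected = np.flatnonzero(fit.coef_[1:])  # indices of selected modifiers
print("selected effect modifiers:", selected)
```

With this data-generating process the lasso should recover X1 as an effect modifier; in the paper the selection event would then be conditioned on to obtain valid post-selection confidence intervals. Note that this sketch penalizes the main treatment-effect term along with the interactions, which one may prefer to leave unpenalized.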
