A Fundamental Accuracy--Robustness Trade-off in Regression and Classification
Abstract: We derive a fundamental trade-off between standard and adversarial risk in a rather general situation that formalizes the following simple intuition: "If no (nearly) optimal predictor is smooth, adversarial robustness comes at the cost of accuracy." As a concrete example, we evaluate the derived trade-off in regression with polynomial ridge functions under mild regularity conditions. Generalizing our analysis of this example, we formulate a necessary condition under which adversarial robustness can be achieved without significant degradation of the accuracy. This necessary condition is expressed in terms of a quantity that resembles the Poincaré constant of the data distribution.
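For orientation, the standard and adversarial risks contrasted in the abstract are commonly formalized as follows; the notation here (loss $\ell$, predictor $f$, perturbation budget $\varepsilon$) is the usual convention in the adversarial-robustness literature and is not necessarily the paper's own.

```latex
% Standard (clean) risk of a predictor f under loss \ell:
R(f) \;=\; \mathbb{E}_{(X,Y)}\bigl[\ell\bigl(f(X),\,Y\bigr)\bigr]

% Adversarial risk: the adversary perturbs each input within an
% \varepsilon-ball before the predictor is evaluated:
R_\varepsilon^{\mathrm{adv}}(f) \;=\;
  \mathbb{E}_{(X,Y)}\Bigl[\sup_{\|\delta\|\le\varepsilon}
  \ell\bigl(f(X+\delta),\,Y\bigr)\Bigr]
```

Since the supremum is taken over a set containing $\delta = 0$, one always has $R_\varepsilon^{\mathrm{adv}}(f) \ge R(f)$; the trade-off the paper studies concerns how large this gap must be when every (nearly) optimal predictor for $R$ is non-smooth.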