A Fine-Grained Perspective on Approximating Subset Sum and Partition
Abstract: Approximating Subset Sum is a classic and fundamental problem in computer science and mathematical optimization. The state-of-the-art approximation scheme for Subset Sum computes a $(1-\varepsilon)$-approximation in time $\tilde{O}(\min\{n/\varepsilon, n+1/\varepsilon^2\})$ [Gens, Levner'78, Kellerer et al.'97]. In particular, a $(1-1/n)$-approximation can be computed in time $O(n^2)$. We establish a connection to Min-Plus-Convolution, a problem that is of particular interest in fine-grained complexity theory and can be solved naively in time $O(n^2)$. Our main result is that computing a $(1-1/n)$-approximation for Subset Sum is subquadratically equivalent to Min-Plus-Convolution. Thus, assuming the Min-Plus-Convolution conjecture from fine-grained complexity theory, there is no approximation scheme for Subset Sum with strongly subquadratic dependence on $n$ and $1/\varepsilon$. In the other direction, our reduction allows us to transfer known lower-order improvements from Min-Plus-Convolution to Subset Sum, which yields a mildly subquadratic randomized approximation scheme. This adds the first approximation problem to the list of problems that are equivalent to Min-Plus-Convolution. For the related Partition problem, an important special case of Subset Sum, the state of the art is a randomized approximation scheme running in time $\tilde{O}(n+1/\varepsilon^{5/3})$ [Mucha et al.'19]. We adapt our reduction from Subset Sum to Min-Plus-Convolution to obtain a related reduction from Partition to Min-Plus-Convolution. This yields an improved approximation scheme for Partition running in time $\tilde{O}(n + 1/\varepsilon^{3/2})$. Our algorithm is the first deterministic approximation scheme for Partition that breaks the quadratic barrier.
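For reference, the naive $O(n^2)$ algorithm for Min-Plus-Convolution mentioned in the abstract can be sketched as follows. This is a generic illustration of the problem itself, not of the paper's reductions; the function name and list-based representation are our own choices.

```python
def min_plus_convolution(a, b):
    """Naive O(n*m) Min-Plus-Convolution of sequences a and b:
    returns c with c[k] = min over i + j = k of a[i] + b[j]."""
    n, m = len(a), len(b)
    c = [float("inf")] * (n + m - 1)
    for i in range(n):
        for j in range(m):
            # Each pair (i, j) contributes a candidate value to index i + j.
            c[i + j] = min(c[i + j], a[i] + b[j])
    return c


# Example: min_plus_convolution([0, 1], [0, 2]) == [0, 1, 3]
```

The Min-Plus-Convolution conjecture asserts that no algorithm improves on this quadratic running time by a polynomial factor, which is what the paper's equivalence leverages to rule out strongly subquadratic approximation schemes for Subset Sum.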