A Quantum Annealing Approach for Solving Optimal Feature Selection and Next Release Problems

Published 17 Jun 2025 in cs.SE and quant-ph (arXiv:2506.14129v2)

Abstract: Search-based software engineering (SBSE) addresses critical optimization challenges in software engineering, including the next release problem (NRP) and the feature selection problem (FSP). While traditional heuristic approaches and integer linear programming (ILP) methods have demonstrated efficacy for small- to medium-scale problems, their scalability to large-scale instances remains unknown. Here, we introduce quantum annealing (QA) as a subroutine for tackling multi-objective SBSE problems, leveraging the computational potential of quantum systems. We propose two QA-based algorithms tailored to different problem scales. For small-scale problems, we reformulate multi-objective optimization (MOO) as single-objective optimization (SOO) using penalty-based mappings for quantum processing. For large-scale problems, we employ a decomposition strategy guided by maximum energy impact (MEI), integrating QA with a steepest descent method to enhance local search efficiency. Applied to NRP and FSP, our approaches are benchmarked against the heuristic NSGA-II and the ILP-based $\epsilon$-constraint method. Experimental results reveal that while our methods produce fewer non-dominated solutions than the $\epsilon$-constraint method, they achieve significant reductions in execution time. Moreover, compared to NSGA-II, our methods deliver more non-dominated solutions with superior computational efficiency. These findings underscore the potential of QA in advancing scalable and efficient solutions for SBSE challenges.
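To make the MOO-to-SOO reformulation concrete, here is a minimal sketch of a weighted-sum scalarization of a toy NRP instance into a QUBO-style energy function. The requirement values, costs, and weight settings are all illustrative assumptions (not taken from the paper), and exhaustive enumeration stands in for the quantum annealer on this small instance:

```python
import itertools

# Hypothetical toy NRP instance (illustrative numbers, not from the paper):
# each requirement i has a customer value and an implementation cost.
values = [10, 6, 8, 4]   # value gained if requirement i is included in the release
costs = [5, 3, 7, 2]     # cost of implementing requirement i

def qubo_energy(x, w_value, w_cost):
    """Scalarized energy over a bitstring x: maximize value, minimize cost.
    Lower energy is better, so value enters with a negative sign."""
    value = sum(v * xi for v, xi in zip(values, x))
    cost = sum(c * xi for c, xi in zip(costs, x))
    return -w_value * value + w_cost * cost

def solve_by_enumeration(w_value, w_cost):
    """Stand-in for the annealer: exhaustively search the 2^n bitstrings
    of this small instance for the lowest-energy selection."""
    best = min(itertools.product([0, 1], repeat=len(values)),
               key=lambda x: qubo_energy(x, w_value, w_cost))
    return best, qubo_energy(best, w_value, w_cost)

# Sweeping the weights traces out candidate non-dominated solutions.
for w in (0.2, 0.5, 0.8):
    x, e = solve_by_enumeration(w, 1 - w)
    print(w, x, e)
```

On real hardware, the energy function would be encoded as a QUBO matrix and sampled by the annealer rather than enumerated; constraint handling (e.g. a release budget) would add penalty terms to the same energy, in the spirit of the penalty-based mappings the abstract describes.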
