BYOS: Knowledge-driven Large Language Models Bring Your Own Operating System More Excellent
Abstract: Operating System (OS) kernel tuning involves systematically adjusting kernel configurations to optimize system performance. Despite recent advancements in LLMs, kernel tuning remains a critical challenge due to: (1) the semantic gap between abstract tuning objectives and concrete config options, (2) LLM hallucinations induced by insufficient environmental interaction, and (3) the rapid evolution of kernel versions. To address these challenges, we propose BYOS, an LLM-powered framework that automates kernel tuning through three key innovations: structured knowledge construction and mapping, knowledge-driven configuration generation, and continuous knowledge maintenance. Extensive experiments show that BYOS achieves 7.1%-155.4% performance improvements over default configurations across standard OS benchmarks and real-world applications, demonstrating that structured knowledge representation can overcome key limitations of pure LLM solutions for system optimization. Our code is available at https://github.com/LHY-24/BYOS.
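To make the semantic gap concrete, the sketch below shows the general idea of knowledge-driven configuration generation: a structured knowledge base maps an abstract tuning objective to concrete Kconfig options. This is a hypothetical illustration, not BYOS's actual implementation; the `KNOWLEDGE_BASE` contents and `generate_config` helper are invented for this example, though the Kconfig option names themselves are real kernel options.

```python
# Hypothetical sketch: mapping abstract tuning objectives to concrete
# kernel config options via a structured knowledge base. The mapping
# below is illustrative only, not taken from BYOS.

KNOWLEDGE_BASE = {
    "low-latency networking": {
        "CONFIG_PREEMPT": "y",    # full kernel preemption for lower latency
        "CONFIG_HZ_1000": "y",    # 1000 Hz timer tick
    },
    "high-throughput io": {
        "CONFIG_BLK_WBT": "y",                # block writeback throttling
        "CONFIG_TRANSPARENT_HUGEPAGE": "y",   # fewer TLB misses for big buffers
    },
}

def generate_config(objective: str) -> dict:
    """Resolve an abstract tuning objective to Kconfig option assignments."""
    return KNOWLEDGE_BASE.get(objective.lower(), {})

if __name__ == "__main__":
    # Emit the options in .config fragment form, e.g. CONFIG_PREEMPT=y
    for opt, val in generate_config("low-latency networking").items():
        print(f"{opt}={val}")
```

In a full pipeline, the LLM would consult such structured knowledge rather than free-associate option names, which is how a knowledge base can curb hallucinated or version-stale config options.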