
GUIDE: LLM-Driven GUI Generation Decomposition for Automated Prototyping

Published 28 Feb 2025 in cs.SE | (2502.21068v1)

Abstract: GUI prototyping serves as one of the most valuable techniques for enhancing the elicitation of requirements and facilitating the visualization and refinement of customer needs. While GUI prototyping has a positive impact on the software development process, it simultaneously demands significant effort and resources. The emergence of LLMs with their impressive code generation capabilities offers a promising approach for automating GUI prototyping. Despite this potential, there is a gap between current LLM-based prototyping solutions and traditional user-based GUI prototyping approaches, which provide visual representations of GUI prototypes and direct editing functionality. In contrast, LLMs and related generative approaches merely produce text sequences or non-editable image output, which lack both of these aspects and therefore impede support for GUI prototyping. Moreover, minor changes requested by the user typically lead to inefficient regeneration of the entire GUI prototype when using LLMs directly. In this work, we propose GUIDE, a novel LLM-driven GUI generation decomposition approach seamlessly integrated into the popular prototyping framework Figma. Our approach initially decomposes high-level GUI descriptions into fine-granular GUI requirements, which are subsequently translated into Material Design GUI prototypes, enabling higher controllability and more efficient adaptation of changes. To efficiently conduct prompting-based generation of Material Design GUI prototypes, we propose a retrieval-augmented generation approach to integrate the component library. Our preliminary evaluation demonstrates the effectiveness of GUIDE in bridging the gap between LLM generation capabilities and traditional GUI prototyping workflows, offering a more effective and controlled user-based approach to LLM-driven GUI prototyping. Video: https://youtu.be/C9RbhMxqpTU

Summary

  • The paper demonstrates how decomposing textual requirements into granular GUI elements using LLMs and RAG significantly improves prototyping speed and quality.
  • It integrates a Figma plugin with a backend LLM pipeline to map high-level descriptions to Material Design components in an editable workflow.
  • Empirical evaluation shows that GUIDE outperforms manual prototype generation in user satisfaction and efficiency metrics in controlled studies.

GUIDE: LLM-Driven Decomposition for Automated GUI Prototyping

Motivation and Problem Formulation

The paper addresses the limitations of current LLM-based GUI prototyping by targeting the disconnect between textual code generation and the visual, editable prototyping workflows prevailing in industry-standard tools like Figma. State-of-the-art approaches typically output GUI specifications in textual DSLs or static images, lacking the visual interactivity and fine-grained editability needed by designers. Furthermore, direct LLM approaches require inefficient full regeneration of prototypes upon minor user modifications, impeding agile iterative design cycles. The work posits that bridging this gap necessitates a decomposition-based generation strategy integrated natively with established GUI prototyping environments.

Decomposition-Based Architecture

GUIDE's architecture operationalizes a modular, LLM-driven pipeline within Figma, built on two central components: a user-facing Figma plugin and a backend orchestration pipeline. The workflow commences with the input of high-level textual requirements, which are decomposed into granular, atomic GUI features using zero-shot prompting with state-of-the-art LLMs (GPT-4o). This structured set of features is surfaced to users for further curation, supporting edit, add, and delete operations—mirroring iterative design flows.
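The decompose-then-curate step above can be sketched as follows. This is a minimal illustration, not GUIDE's actual implementation: the JSON output shape and the `FeatureSet` class are assumptions standing in for the plugin's real data model, and the LLM call itself is elided (only its parsed output is shown).

```python
import json

def parse_features(llm_output: str) -> list[str]:
    """Parse the LLM's zero-shot decomposition output, assumed here to be a
    JSON array of atomic GUI feature strings."""
    features = json.loads(llm_output)
    if not isinstance(features, list) or not all(isinstance(f, str) for f in features):
        raise ValueError("expected a JSON array of feature strings")
    return features

class FeatureSet:
    """User-curated set of atomic GUI features, supporting the edit, add,
    and delete operations surfaced in the plugin."""
    def __init__(self, features):
        self.features = list(features)
    def add(self, feature):
        self.features.append(feature)
    def edit(self, index, new_text):
        self.features[index] = new_text
    def delete(self, index):
        del self.features[index]

# Illustrative decomposition of "a login screen"
raw = '["email text field", "password text field", "login button", "forgot-password link"]'
fs = FeatureSet(parse_features(raw))
fs.edit(3, "sign-up link")   # user revises one feature
fs.delete(0)                 # user drops another
print(fs.features)
```

Because each feature is an independent unit, a later change touches only the affected feature rather than forcing regeneration of the whole prototype.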

Each atomic feature is mapped to appropriate GUI component types drawn from a curated Material Design library. To mitigate token and specification bloat while maximizing coverage, GUIDE employs Retrieval-Augmented Generation (RAG): a two-stage prompting protocol where the model is first primed with a simplified component taxonomy to select relevant primitives, and only the minimal required full specifications are retrieved for final implementation synthesis. All representation and interaction between GUIDE and Figma are mediated through a custom JSON schema, validated against schema constraints to ensure robustness.
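The two-stage protocol can be illustrated with a toy component library. Everything below is a hedged stand-in: the taxonomy entries, spec fields, and the keyword-based `select_components` (which replaces the actual stage-1 LLM call) are invented for illustration and are not the paper's data or prompts.

```python
# Stage 1 context: compact one-line summaries, cheap in tokens.
TAXONOMY = {
    "Button": "clickable action trigger",
    "TextField": "single-line text input",
    "Checkbox": "binary on/off toggle",
}

# Stage 2 store: full specifications, retrieved only when selected.
FULL_SPECS = {
    "Button": {"props": ["label", "variant", "onClick"], "variants": ["filled", "outlined", "text"]},
    "TextField": {"props": ["label", "value", "helperText"], "variants": ["filled", "outlined"]},
    "Checkbox": {"props": ["label", "checked"], "variants": []},
}

def select_components(feature: str) -> list[str]:
    """Keyword stand-in for the stage-1 LLM call that picks relevant
    primitives from the simplified taxonomy."""
    normalized = feature.lower().replace(" ", "")
    matches = [name for name in TAXONOMY if name.lower() in normalized]
    return matches or ["Button"]  # fall back to a default primitive

def retrieve_specs(selected: list[str]) -> dict:
    """Stage 2: hand the generation prompt only the specs it needs."""
    return {name: FULL_SPECS[name] for name in selected}

context = retrieve_specs(select_components("a login button and an email text field"))
print(sorted(context))
```

The point of the two stages is the token budget: the generation prompt never carries the full library, only the handful of specifications the selected feature actually requires.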

Integration with Figma and Component Curation

Seamlessly embedding GUIDE within Figma is a central contribution, as it allows editable, auto-generated prototypes to be instantiated natively as composable Figma entities. The system leverages the Material Design 3 library, exposing over 59 component archetypes. By mapping fine-grained decomposed requirements to concrete GUI controls using RAG, GUIDE ensures style and interactional consistency, expanding on prior DSL-bound or low-fidelity generation schemes. The decoupling of decomposition, component selection, and instantiation provides enhanced controllability and targeted updates—addressing the scalability and maintainability requirements of agile prototyping.
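A check of the kind the custom JSON schema mediation might perform before a generated fragment is instantiated as Figma nodes can be sketched like this. The field names (`component`, `props`, `children`) are assumptions for illustration, not GUIDE's actual schema.

```python
# Assumed minimal node shape; GUIDE's real schema is richer than this.
REQUIRED_FIELDS = {"component": str, "props": dict, "children": list}

def validate_node(node: dict) -> list[str]:
    """Return a list of schema violations (empty list means valid)."""
    errors = []
    for field, typ in REQUIRED_FIELDS.items():
        if field not in node:
            errors.append(f"missing field: {field}")
        elif not isinstance(node[field], typ):
            errors.append(f"wrong type for {field}: expected {typ.__name__}")
    # Recurse into children so whole subtrees are checked before instantiation.
    for i, child in enumerate(node.get("children", [])):
        errors += [f"children[{i}].{e}" for e in validate_node(child)]
    return errors

card = {
    "component": "Card",
    "props": {"elevation": 1},
    "children": [{"component": "Button", "props": {"label": "Log in"}, "children": []}],
}
print(validate_node(card))  # empty list: well-formed, safe to instantiate
```

Rejecting malformed LLM output at this boundary keeps the plugin side robust: only schema-valid trees ever reach Figma's node-creation calls.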

Evaluation and Empirical Findings

The authors present a controlled lab-based between-subjects study employing GUIDE versus a Figma-only baseline. With 11 participants generating prototypes from standard app descriptions, GUIDE users produced more prototypes per unit time (3.2 vs 2.33 per participant) despite identical 45-minute constraints. Qualitative assessments by 28 UI/UX-expert crowd-workers (via Prolific) yielded statistically significant improvements (Wilcoxon rank-sum test, p < 0.05 for all metrics) on multi-dimensional Likert scales: requirement satisfaction, component appropriateness, textual fit, consistency, visual appeal, information organization, interaction intuitiveness, reduced errors, and overall satisfaction.

All mean and median scores for GUIDE-generated prototypes were markedly higher than for manual Figma efforts, with median requirement fit, textual correspondence, and consistency reaching values of 8, 7, and 7 (vs. 5, 6, and 5 for control). This empirical evidence supports the claim that LLM-driven decomposition integrated with RAG and visual editors can substantially and repeatably enhance prototyping quality and throughput.
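For readers unfamiliar with the test used above, the Wilcoxon rank-sum statistic can be computed in a few lines of pure Python. The scores below are synthetic, for illustration only; they are not the study's data.

```python
def rank_sum_statistic(a, b):
    """Wilcoxon rank-sum W for sample `a` vs `b`, using average ranks for ties."""
    combined = sorted((v, src) for src, vals in ((0, a), (1, b)) for v in vals)
    rank_of = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        for k in range(i, j):
            rank_of[k] = (i + 1 + j) / 2  # average of the tied ranks i+1 .. j
        i = j
    # W = sum of ranks belonging to the first sample
    return sum(r for r, (_, src) in zip(rank_of, combined) if src == 0)

guide_scores = [8, 7, 7, 8, 6, 7]   # synthetic Likert ratings, illustration only
manual_scores = [5, 6, 5, 6, 5, 4]
print(rank_sum_statistic(guide_scores, manual_scores))
```

A W far above its null expectation indicates that one group's ratings systematically rank higher; the significance threshold itself would come from the W distribution (or a normal approximation), which library implementations such as SciPy's handle in practice.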

Positioning and Implications

GUIDE extends the automated prototyping literature beyond prior work such as Instigator [brie2023evaluating], GUI generation DSLs [feng2023designing, yuan2024maxprototyper], and wireframe search [kolthoff2023data], by enabling high-fidelity, natively editable visual prototypes compatible with production design environments. The modular decomposition approach, inspired by recent advances in chain-of-thought prompting and decomposed reasoning [khot2022decomposed], provides greater transparency and editability over monolithic, end-to-end LLM code generation. The application of RAG within GUI synthesis is technically notable, efficiently integrating large-scale component libraries without exceeding LLM context limitations.

Practically, GUIDE has immediate implications for shortening requirements-to-prototype cycles in industrial UI/UX workflows, improving stakeholder-customer alignment via tangible artifacts, and lowering manual engineering overhead. Theoretically, GUIDE evidences the utility of decomposed prompting and retrieval-augmented modeling for multi-modal, structured generation tasks, opening avenues for richer hybrid human-AI workflows in design and development.

Future Directions

The paper anticipates extending GUIDE by introducing multi-variant generation for GUI features, enabling design space exploration and comparative prototyping within the same pipeline. There are clear extensions into automated usability evaluation, advanced multimodal prototyping (e.g., integrating diffusion models for assets), and generalized support for additional component paradigms. Integration with real-time collaborative revision and downstream handover to implementation pipelines are promising axes for future AI-augmented design systems.

Conclusion

GUIDE articulates and implements a structured, LLM-driven approach to GUI prototyping that is natively compatible with established design workflows. By formalizing a granular decomposition of requirements, integrating retrieval for scalable component curation, and supporting fine-grained, editable prototype synthesis, it demonstrates substantial improvements in both quality and workflow fit over conventional and prior automated methods. The system exemplifies how modern LLM and retrieval strategies can be operationalized in practical, high-impact, multi-modal software engineering tools, and suggests a roadmap for further intelligent automation in human-centered design workflows (2502.21068).
