
Generative and Malleable User Interfaces with Generative AI

Cao Y. et al.
2025-03

About

CHI 2025 paper that proposes "generative and malleable user interfaces" built with AI from user tasks. The authors designed a task-driven data model in which LLMs parse user prompts into UI specifications, which are then mapped to concrete interfaces. Experiments demonstrate that the method can dynamically generate interface elements such as forms and visualizations, and that users can modify the resulting interfaces via natural language.

Summary

Task-Driven UI Generation Approach:

1. Core Concept: "Malleable" interfaces that users can reshape through natural language
2. Data Model: Task-driven architecture where LLMs interpret user intent and generate UI specs
3. Dynamic Elements: Supports generation of forms, visualizations, and interactive components
4. Natural Language Modification: Users can iteratively refine interfaces using conversational commands
5. Specification Mapping: Abstract UI specs are mapped to concrete interface implementations

Key Innovation: Bridges the gap between user intent and interface realization through task-centric prompting
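The paper's actual data model is not reproduced here, so the following is a hypothetical sketch of the pipeline described above: an LLM emits an abstract UI specification (e.g. as JSON), and a renderer maps that spec to a concrete interface. The schema (`type`, `fields`, `chart`) and the `render` function are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of specification mapping: an abstract UI spec
# (as an LLM might emit for a user task) is mapped to concrete markup.
# The spec schema and renderer are assumptions for illustration only.

def render(spec: dict) -> str:
    """Map an abstract UI spec to a concrete widget string."""
    kind = spec["type"]
    if kind == "form":
        # Each declared field becomes an input element.
        fields = "".join(f'<input name="{f}">' for f in spec["fields"])
        return f"<form>{fields}</form>"
    if kind == "visualization":
        # A chart spec becomes a canvas placeholder for a charting layer.
        return f'<canvas data-chart="{spec["chart"]}"></canvas>'
    raise ValueError(f"unknown component type: {kind}")

# A task like "track my reading list" might yield a form spec:
spec = {"type": "form", "fields": ["title", "author"]}
print(render(spec))  # <form><input name="title"><input name="author"></form>
```

Iterative refinement via natural language would then amount to the LLM editing this spec (adding a field, swapping a form for a visualization) and re-rendering, rather than regenerating the interface from scratch.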

Tags

chi-2025, malleable-ui, task-driven, llm
