Envisioning Audio Augmented Reality in Everyday Life
Abstract: While visual augmentation dominates the augmented reality landscape, devices like Meta's Ray-Ban audio smart glasses signal growing industry movement toward audio augmented reality (AAR). Hearing is a primary channel for sensing context, anticipating change, and navigating social space, yet AAR's everyday potential remains underexplored. We address this gap through a collaborative autoethnography (N=5, the authors) and an online survey (N=74). We identify ten roles for AAR, grouped into three categories: task- and utility-oriented, emotional and social, and perceptual collaborator. We further layer these roles with a rhythmic and embodied collaborator framing, mapping them onto the micro-, meso-, and macro-rhythms of everyday life. Our analysis surfaces nuanced tensions, such as blocking distractions without erasing social presence, highlighting the need for context-aware design. This paper contributes a foundational and forward-looking framework for AAR in everyday life, providing design groundwork for systems attuned to daily routines, sensory engagement, and social expectations.