
Robotic Blended Sonification: Consequential Robot Sound as Creative Material for Human-Robot Interaction

Published 22 Apr 2024 in cs.HC, cs.RO, cs.SD, and eess.AS (arXiv:2404.13821v1)

Abstract: Current research in robotic sound generally focuses either on masking the consequential sound produced by the robot or on sonifying data about the robot to create a synthetic robot sound. We propose to capture, modify, and utilise, rather than mask, the sounds that robots are already producing. In short, this approach relies on capturing a robot's sounds, processing them according to contextual information (e.g., collaborators' proximity or particular work sequences), and playing back the modified sound. Previous research indicates the usefulness of non-semantic, and even mechanical, sounds as a communication tool for conveying robotic affect and function. Building on this, the paper presents a novel approach that makes two key contributions: (1) a technique for real-time capture and processing of consequential robot sounds, and (2) an approach to exploring these sounds through direct human-robot interaction. Drawing on methodologies from design, human-robot interaction, and creative practice, the resulting 'Robotic Blended Sonification' is a concept which transforms consequential robot sounds into a creative material that can be explored artistically and within application-based studies.
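The capture-process-playback loop sketched in the abstract can be illustrated with a minimal example. The function name, the proximity-to-gain mapping, and the attenuation factor below are hypothetical choices for illustration only; the paper does not prescribe a specific processing chain:

```python
import numpy as np

def process_robot_sound(captured: np.ndarray, proximity: float) -> np.ndarray:
    """Modify a captured buffer of consequential robot sound based on a
    collaborator's normalised proximity (0.0 = far away, 1.0 = very close).

    As one plausible contextual mapping, the sound is attenuated as the
    collaborator approaches, up to 80% at minimum distance. The modified
    buffer would then be played back in place of (or layered over) the
    raw mechanical sound.
    """
    proximity = float(np.clip(proximity, 0.0, 1.0))
    gain = 1.0 - 0.8 * proximity  # closer collaborator -> quieter playback
    return captured * gain

# Example: a one-second sine tone standing in for captured servo noise
sample_rate = 16_000
t = np.arange(sample_rate) / sample_rate
servo_like = np.sin(2 * np.pi * 440.0 * t)

far_version = process_robot_sound(servo_like, proximity=0.0)
close_version = process_robot_sound(servo_like, proximity=1.0)
```

In a real deployment this function would sit inside a streaming audio callback, with `proximity` updated from the robot's sensing; here it is applied to a whole buffer for clarity.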

