A Novel Mapping for Visual-to-Auditory Sensory Substitution
Abstract: Sensory substitution devices can convert visual information into an audio stream, giving visually impaired people the chance to perceive their surroundings easily while performing everyday tasks. In this study, visual environmental features, namely the coordinates, type, and size of objects, are assigned to audio features related to musical tones, such as frequency, time duration, and note permutations. Results demonstrate that this new method is more efficient in training time than our previous method, VBTones, in which sinusoidal tones were applied. Moreover, blind object recognition for real objects achieved 88.05 on average.
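The mapping described above assigns visual features (object coordinates, type, size) to musical-tone features (frequency, duration, note permutation). A minimal sketch of such an encoding is shown below; all parameter choices (row count, pitch range, duration range, the `note_pattern` class labels) are illustrative assumptions, not the paper's actual values.

```python
# Hypothetical visual-to-auditory mapping in the spirit of the abstract.
# Assumption: vertical position -> pitch, object size -> tone duration,
# object type -> a note permutation identifying its class.

def row_to_frequency(row, n_rows=64, low_midi=48, high_midi=84):
    """Map an image row (0 = top) to a tone frequency in Hz.

    Objects nearer the top of the frame get higher pitches,
    using twelve-tone equal temperament (A4 = 440 Hz, MIDI 69).
    """
    midi = high_midi - (high_midi - low_midi) * row / (n_rows - 1)
    return 440.0 * 2 ** ((midi - 69) / 12)

def size_to_duration(size, min_ms=100, max_ms=800):
    """Map a normalized object size (0..1) to tone duration in ms."""
    size = max(0.0, min(size, 1.0))
    return min_ms + (max_ms - min_ms) * size

def encode_object(row, size, note_pattern):
    """Bundle the audio features for one detected object."""
    return {
        "frequency_hz": round(row_to_frequency(row), 2),
        "duration_ms": round(size_to_duration(size), 1),
        "notes": note_pattern,  # permutation identifying the object type
    }

# Example: a mid-sized object near the top of a 64-row frame,
# tagged with the note permutation assigned to its (hypothetical) class.
tone = encode_object(row=8, size=0.5, note_pattern=("C", "E", "G"))
print(tone)
```

In a real device, each encoded object would be rendered as a short musical phrase; sweeping the scene left to right then turns horizontal position into playback order.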