Advisor: Prof. Palle Dahlstedt (Interaction Design, Department of Computer Science and Engineering)
Through my research, I believe I have identified a new form of sound augmentation, which I call "Sonic Augmented Reality." Sonic Augmented Reality is the technology of combining "sonic interactions" (sounds that convey information and meaning through interactive context) from our technology-driven environment with computer-generated sound, layered on top of the music we are listening to.
In this project, I examine the impact of the growing use of headphones and earbuds, through which people isolate themselves from the urban sonic environment. I investigate whether machine learning (ML) applied to urban sound classification is a useful tool for minimizing accidents involving moving urban objects such as cars, trams, and bicycles, while simultaneously preserving the enjoyment of listening to music.
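The classification idea above can be sketched in a toy form. The following is a minimal illustration, not the project's actual pipeline: it synthesizes two artificial "urban sounds" (a siren-like frequency sweep and an engine-like hum, both invented here for the example), extracts a single power-weighted spectral-centroid feature with NumPy, and classifies by nearest class prototype. A real system would train on a labeled dataset of recorded city sounds and use richer features (e.g., MFCCs) with a proper classifier.

```python
import numpy as np

SR = 22050  # sample rate in Hz (an assumed value for this sketch)

def synth(kind, seconds=1.0, seed=0):
    """Generate a toy 'urban sound': a siren-like sweep or an engine-like hum.

    The seed only perturbs the engine's background noise.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, seconds, int(SR * seconds), endpoint=False)
    if kind == "siren":
        # Instantaneous frequency sweeping 600 -> 1000 -> 600 Hz.
        freq = 600 + 400 * np.sin(2 * np.pi * 0.5 * t)
        return np.sin(2 * np.pi * np.cumsum(freq) / SR)
    # "engine": a low 80 Hz hum with a little broadband noise.
    return 0.8 * np.sin(2 * np.pi * 80 * t) + 0.01 * rng.standard_normal(t.size)

def spectral_centroid(x):
    """One simple feature: the power-weighted mean frequency of the signal."""
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / SR)
    return float(np.sum(freqs * power) / np.sum(power))

# "Training": average the feature over a few examples per class.
protos = {
    kind: np.mean([spectral_centroid(synth(kind, seed=s)) for s in range(3)])
    for kind in ("siren", "engine")
}

def classify(x):
    """Assign the class whose prototype feature is nearest."""
    c = spectral_centroid(x)
    return min(protos, key=lambda k: abs(protos[k] - c))

print(classify(synth("siren", seed=9)))   # → siren
print(classify(synth("engine", seed=9)))  # → engine
```

In an accident-avoidance scenario, the classifier's output (e.g., "siren" or "engine" detected nearby) would be what triggers the second sound layer over the listener's music.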
I began my research by building an understanding of what sound is and how it works; to that end, I researched topics such as:
Second, I surveyed projects in the field of hearing that are relevant to the problem I want to solve, looking at existing devices and art projects such as:
Since I wasn't sure what my solution to the problem could look like, I conducted four different experiments that might lead me to a concept I could develop further. Detailed information about each experiment is included in the research paper.
I created a video prototype showing the learnings from my fourth experiment while continuing research on a high-fidelity prototype, which ultimately failed. Based on this, the final test produced a list of learnings for this research paper. Below is a prototype of the final product.
The following experience prototype illustrates various scenarios in which Sonic Augmented Reality might be used.