Category: Interaction Design

  • Echo Location: Navigating Sonic Interaction Design with Professor Myounghoon Jeon

    Myounghoon “Philart” Jeon, a professor at Virginia Tech, recently delivered an engaging online guest lecture on sonic information design, exploring the intersection of auditory perception, cognitive science, and interactive sound design. His research spans auditory displays, human-computer interaction, and affective computing, with applications in assistive technologies, automotive interfaces, and interactive performance. Throughout the lecture, he shared detailed insights into the process of designing and evaluating auditory cues, explaining how specific sound design choices affect usability, accessibility, and engagement.

    Myounghoon "Philart" Jeon

    The Evolution of Sonic Information Design

    Professor Jeon introduced sonic information design as a field that integrates sonification, auditory displays, auditory user interfaces, and sonic interaction design. While sound design has historically been guided by artistic intuition, his work highlights a shift towards scientific, data-driven approaches. This transition ensures that auditory interfaces are both intuitive and efficient, optimising interaction in hands-free, visually demanding, or multi-tasking environments.

    One example of this approach is his development of “Spindex” (Speech Index), an auditory menu navigation system that enhances efficiency by using compressed speech cues instead of full words. Instead of users listening to long, spoken menu options, Spindex provides shortened speech cues, allowing them to scan options quickly. Through user testing, he found that people could navigate menus more effectively when exposed to a combination of compressed speech and indexed categories, rather than traditional text-to-speech output. The decision to use speech compression without pitch alteration ensured that the information remained intelligible while increasing the speed of interaction.
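
    To make the idea concrete, the minimal Python sketch below compresses a spoken menu cue in time without altering its pitch, which is the property the Spindex approach relies on. The file names, compression rate, and the use of librosa's phase-vocoder time stretch are illustrative assumptions, not details from the lecture.

    ```python
    # Sketch: time-compress a spoken menu item without changing its pitch,
    # in the spirit of Spindex-style abbreviated speech cues.
    # The WAV file path and the compression rate are illustrative.
    import librosa
    import soundfile as sf

    def make_spindex_cue(path, rate=2.5):
        """Return a pitch-preserving, time-compressed version of a spoken cue."""
        y, sr = librosa.load(path, sr=None)                    # keep the original sample rate
        y_fast = librosa.effects.time_stretch(y, rate=rate)    # phase-vocoder stretch: faster, same pitch
        return y_fast, sr

    if __name__ == "__main__":
        cue, sr = make_spindex_cue("menu_item_artists.wav", rate=2.5)
        sf.write("menu_item_artists_spindex.wav", cue, sr)
    ```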

    Applications of Auditory Displays

    Professor Jeon discussed a range of applications where sound enhances usability and accessibility, particularly in assistive technology, automotive sound design, and interactive exhibitions. One of his most practical and tested projects focused on indoor navigation for visually impaired users. His team developed a wearable navigation system that incorporates ultrasonic belts providing both tactile and auditory feedback. The sound design choices involved creating gradual frequency shifts to indicate proximity to obstacles. Low-pitched tones signalled distant objects, while higher-pitched tones and increasing intensity indicated closer obstructions, ensuring users could interpret spatial information efficiently.
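
    A minimal sketch of that kind of proximity mapping is shown below; the frequency range, distance range, and tone length are assumed values for illustration rather than parameters from the actual system.

    ```python
    # Sketch: map obstacle distance to a tone whose pitch and loudness rise
    # as the obstacle gets closer, echoing the proximity cue described above.
    import numpy as np

    SR = 44100  # sample rate in Hz

    def proximity_tone(distance_m, max_range_m=4.0, duration_s=0.2):
        """Closer obstacles -> higher pitch and louder tone."""
        closeness = 1.0 - np.clip(distance_m / max_range_m, 0.0, 1.0)  # 0 = far, 1 = touching
        freq = 220.0 + closeness * (1760.0 - 220.0)   # low pitch when far, high when close
        amp = 0.2 + 0.8 * closeness                   # quiet when far, loud when close
        t = np.linspace(0.0, duration_s, int(SR * duration_s), endpoint=False)
        return amp * np.sin(2 * np.pi * freq * t)

    # Example: tones for an obstacle approaching from 3 m down to 0.5 m
    samples = [proximity_tone(d) for d in (3.0, 2.0, 1.0, 0.5)]
    ```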

    His work in automotive auditory interfaces examined how sound can improve situational awareness for drivers. One project involved designing warning systems for railway level crossings, where drivers might overlook visual alerts due to distraction. His team conducted experiments using different auditory cues, testing whether short, rhythmic pulses or long, sweeping alerts were more effective at conveying urgency without overwhelming the driver. Findings showed that spatialised auditory warnings, where sounds were positioned to indicate the direction of an approaching train, helped drivers respond more accurately than traditional beeping tones.
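
    The sketch below illustrates the general idea of placing a warning in the stereo field according to the direction of an approaching hazard; the constant-power pan law and the tone itself are assumptions, since the lecture did not specify the implementation.

    ```python
    # Sketch: position a warning tone in the stereo field according to the
    # direction of an approaching train (constant-power panning).
    import numpy as np

    SR = 44100

    def spatialised_warning(direction=0.0, duration_s=0.5, freq=880.0):
        """direction in [-1, 1]: -1 = fully left, 0 = centre, +1 = fully right."""
        t = np.linspace(0.0, duration_s, int(SR * duration_s), endpoint=False)
        tone = np.sin(2 * np.pi * freq * t)
        theta = (direction + 1.0) * np.pi / 4.0       # map [-1, 1] -> [0, pi/2]
        left, right = np.cos(theta) * tone, np.sin(theta) * tone
        return np.stack([left, right], axis=1)        # (samples, 2) stereo buffer

    stereo = spatialised_warning(direction=-0.8)      # train approaching from the left
    ```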

    Professor Jeon also highlighted his work on interactive sonification in public exhibitions, including the Accessible Aquarium project, which used computer vision to track fish movements and convert them into sound and music. The sound design process for this project involved defining sonic mappings that correlated with fish speed, size, and position. Large fish were assigned deep, resonant tones, while smaller fish produced higher-pitched sounds. The system was further refined by introducing dynamic panning, so the audio reflected each fish’s position within the tank, allowing visually impaired visitors to perceive the movement in real time.
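
    A simple way to picture such a mapping is as a function from fish attributes to sound parameters, as in the hypothetical sketch below; the ranges and formulas are illustrative, not those used in the Accessible Aquarium system.

    ```python
    # Sketch: a possible parameter mapping in the spirit of the description above —
    # fish size controls pitch register, speed controls loudness, and horizontal
    # position controls stereo pan. All ranges are assumptions for illustration.
    def fish_to_sound(size_cm, speed_cm_s, x_norm):
        """x_norm: horizontal position in the tank, 0.0 (left) to 1.0 (right)."""
        pitch_hz = 880.0 / (1.0 + size_cm / 10.0)     # bigger fish -> deeper tone
        loudness = min(1.0, 0.3 + speed_cm_s / 50.0)  # faster fish -> louder
        pan = 2.0 * x_norm - 1.0                      # -1 left ... +1 right
        return {"pitch_hz": pitch_hz, "loudness": loudness, "pan": pan}

    print(fish_to_sound(size_cm=40, speed_cm_s=12, x_norm=0.25))
    ```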

    The project was later expanded by introducing audience interaction through motion-tracking technology. Visitors could use arm movements to mimic fish, triggering musical patterns that followed their gestures. The decision to incorporate layered harmonic structures ensured that overlapping user-generated sounds remained cohesive rather than chaotic, maintaining an aesthetically pleasing experience while preserving informational clarity.
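
    One common way to keep overlapping, user-generated notes cohesive is to quantise every pitch onto a shared scale. The sketch below assumes a pentatonic scale and MIDI note numbers purely for illustration; it is not the project's actual harmonic engine.

    ```python
    # Sketch: snap every generated pitch onto a shared scale so that notes
    # triggered by several visitors still blend rather than clash.
    PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees within an octave (semitones)

    def quantise_to_scale(midi_note, scale=PENTATONIC):
        """Snap an arbitrary MIDI note onto the nearest note of the scale."""
        octave, degree = divmod(midi_note, 12)
        nearest = min(scale, key=lambda d: abs(d - degree))
        return octave * 12 + nearest

    # Gestures from several visitors produce arbitrary pitches ...
    raw_notes = [61, 63, 66, 70]
    # ... but all of them land on the same pentatonic scale:
    print([quantise_to_scale(n) for n in raw_notes])   # [60, 62, 67, 69]
    ```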

    Designing Effective Auditory Cues

    Throughout the lecture, Professor Jeon provided detailed insights into sound design decision-making, particularly in branding, interaction design, and auditory icons. In his work with LG Electronics and Samsung, he developed sound profiles for home appliances, ensuring that product sounds were both functional and emotionally resonant. His research explored how users interpret different tonal qualities and how sound frequency influences perceived urgency and pleasantness. In one experiment, he tested whether major-key melodic notifications were perceived as more friendly and reassuring than atonal, percussive alerts.

    Another innovative area of his research involved the development of lyricons (lyrics-based earcons), a novel approach where melodic speech reinforces functional commands. Instead of using generic tones, this system integrated spoken words into short musical motifs, making auditory cues more memorable. For example, turning a device on or off could be represented by a short, ascending or descending melodic phrase, rather than a simple beep. His studies demonstrated that users recalled lyricon-based auditory cues more accurately than traditional earcons, highlighting the potential of music as a tool for reinforcing interaction memory.
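
    The pairing of opposite functions with mirrored melodic contours can be sketched very simply; the note choices and durations below are illustrative and not taken from the lyricon studies.

    ```python
    # Sketch: a lyricon-like pairing of opposite functions with mirrored motifs —
    # an ascending three-note phrase for "on" and its descent for "off".
    import numpy as np

    SR = 44100

    def motif(freqs, note_s=0.15):
        """Concatenate short sine-tone notes into one phrase."""
        t = np.linspace(0.0, note_s, int(SR * note_s), endpoint=False)
        return np.concatenate([0.5 * np.sin(2 * np.pi * f * t) for f in freqs])

    power_on  = motif([392.0, 494.0, 587.0])   # G4 -> B4 -> D5, rising
    power_off = motif([587.0, 494.0, 392.0])   # D5 -> B4 -> G4, falling
    ```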

    In his dance-based sonification research, Professor Jeon explored how motion-capture technology can translate body movements into real-time music generation. His team designed a system where dancers wore infra-red motion sensors, allowing spatial position and gesture dynamics to control auditory parameters. The sound mappings were carefully structured so that slow, fluid movements produced soft, sustained tones, while sharp, rapid gestures triggered percussive elements. By fine-tuning these interactions, the system ensured that each performance remained expressive yet predictable, allowing dancers to intentionally shape the evolving musical landscape.
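
    A toy version of such a velocity-based mapping might look like the sketch below, where slow motion yields a soft sustained tone and fast motion a short percussive hit; the threshold and envelopes are assumptions for illustration.

    ```python
    # Sketch: slow, fluid movement drives a soft sustained tone, while a fast,
    # sharp gesture triggers a brief percussive hit. Values are illustrative.
    import numpy as np

    SR = 44100

    def gesture_to_audio(velocity_m_s, freq=330.0):
        if velocity_m_s < 0.5:                               # slow, fluid movement
            t = np.linspace(0.0, 1.0, SR, endpoint=False)
            env = np.minimum(t / 0.3, 1.0) * np.exp(-t)      # gentle attack, long decay
            return 0.3 * env * np.sin(2 * np.pi * freq * t)
        t = np.linspace(0.0, 0.12, int(SR * 0.12), endpoint=False)
        env = np.exp(-t * 60.0)                              # sharp percussive decay
        return 0.8 * env * np.sin(2 * np.pi * 2 * freq * t)  # brighter, louder hit

    samples_slow = gesture_to_audio(0.2)   # sustained tone
    samples_fast = gesture_to_audio(1.8)   # percussive element
    ```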

    The Future of Sonic Interaction

    Looking forward, Professor Jeon discussed how artificial intelligence, machine learning, and real-time sound generation are shaping next-generation auditory interfaces. One of his projects in this area involves music-based social robots for children with autism, where robotic agents use music to enhance social communication. The system was designed with emotion-sensitive audio cues, allowing the robot to modulate its voice and musical output based on the child’s mood. His team experimented with different musical scales and rhythmic patterns, determining that gentle, repetitive melodic structures were the most effective at capturing attention without overwhelming the child.
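
    Conceptually, the emotion-sensitive behaviour can be thought of as a lookup from an estimated mood to musical parameters, as in the hypothetical sketch below; the mood categories and values are assumptions, not the robot's actual model.

    ```python
    # Sketch: pick gentler, more repetitive material when the child seems
    # overstimulated. Categories and parameter values are illustrative only.
    MOOD_TO_MUSIC = {
        "calm":           {"tempo_bpm": 90,  "scale": "major",      "repetition": "medium"},
        "engaged":        {"tempo_bpm": 110, "scale": "major",      "repetition": "low"},
        "overstimulated": {"tempo_bpm": 60,  "scale": "pentatonic", "repetition": "high"},
    }

    def choose_music(detected_mood):
        # Fall back to the gentlest setting when the mood estimate is uncertain.
        return MOOD_TO_MUSIC.get(detected_mood, MOOD_TO_MUSIC["overstimulated"])

    print(choose_music("engaged"))
    ```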

    His lecture provided a comprehensive and technically rich exploration of sonic information design, demonstrating how scientific principles, auditory perception, and interactive sound technologies continue to shape human-computer interaction. By combining rigorous research with creative experimentation, his work highlights the growing impact of auditory interfaces in accessibility, engagement, and multisensory experiences across multiple fields.

     

  • Andrew Spitz: Crafting Soundscapes, Interactivity, and Innovations

    In the evolving world of design and technology, Andrew Spitz’s career is an inspiring example of how creativity and experimentation can lead to unique and impactful innovations. From sound design to interactive media and the art of prototyping, his journey offers insights into building meaningful user experiences through multidisciplinary approaches. In a recent online guest lecture, Andrew shared the experiences and expertise behind that journey.

    Andrew Spitz, Frolic Studio

    The Journey: From Linear Sound Design to Interactive Media

    Andrew Spitz started his career in the world of sound design, where his primary focus was creating immersive audio experiences for films. This phase of his work was marked by linear storytelling—designing soundscapes that enhanced the narrative of visual media. For example, Andrew recorded the sounds of African wildlife to bring animated characters to life, showcasing the meticulous effort involved in capturing authentic audio.

    However, this linear approach left him yearning for more dynamic ways to engage audiences. His desire to explore interactivity led him to Edinburgh, where he delved into interactive sound design during his Master’s programme. Here, tools like Max/MSP opened new doors, allowing Andrew to experiment with dynamic soundscapes that responded to user interactions.

    This transition marked a pivotal shift in his career—from designing sounds that followed a fixed storyline to creating experiences where users could shape the narrative. It was a move from being a storyteller to an enabler, allowing audiences to co-create their journey.

    Interactive Media: Bridging Empathy and Technology

    One of Andrew’s key insights into interactive media is the importance of empathy. As an interaction designer, he emphasises the ability to step into the user’s shoes. Whether it’s designing physical installations or digital interfaces, understanding the emotional and functional needs of users drives successful designs.

    In his work with prototypes and concepts, Andrew explores how technology can evoke emotions and foster connections. For instance, a project for BMW involved recreating the exhilarating experience of walking into a packed rugby stadium, complete with crowd noise and synchronised visuals. This installation not only showcased technological prowess but also highlighted how sensory design can forge powerful emotional connections.

    Andrew also stresses that great interaction design isn’t just about logic and utility; it’s about creating delight and emotional resonance. Products that succeed are those that strike a chord with users, making them feel connected and understood.

    The Art and Impact of Prototyping

    Andrew believes that “doing is the new thinking.” Prototyping is at the heart of his creative process, enabling him to turn abstract ideas into tangible experiences. He advocates for quick, iterative prototyping as a means to test concepts, gather feedback, and refine designs efficiently.

    One of his standout projects, Paper Note, involved turning sound into physical sculptures. What began as playful experimentation with materials like cornstarch and sand evolved into a compelling visualisation of sound frequencies. This process underscores how unstructured exploration can lead to innovative applications.
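
    One plausible way to turn a recording into a printable form is to treat a time-frequency magnitude grid as a height map, as in the sketch below; Paper Note's actual pipeline is not documented here, and the frame size and scaling are assumptions.

    ```python
    # Sketch: slice a signal into frames, take each frame's spectrum, and treat
    # the resulting magnitude grid as pillar heights for a physical sculpture.
    import numpy as np

    def sound_to_heightmap(samples, frame=1024, hop=512, max_height_mm=30.0):
        frames = [samples[i:i + frame] for i in range(0, len(samples) - frame, hop)]
        spectra = [np.abs(np.fft.rfft(f * np.hanning(frame))) for f in frames]
        grid = np.log1p(np.array(spectra))                     # time x frequency magnitudes
        return grid / np.maximum(grid.max(), 1e-12) * max_height_mm  # scale to printable heights

    # e.g. heights = sound_to_heightmap(audio_samples); each cell is a pillar height
    ```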

    Andrew also highlights the importance of embracing imperfection during prototyping. By failing fast and cheap, designers can refine their intuition and adapt to users’ real needs. Whether building a functional prototype like *Ice Cube*, a tangible music player, or creating tools for interactive sound, the goal remains to make ideas accessible, testable, and impactful.

    Lessons from Andrew Spitz’s Journey

    Andrew Spitz’s work offers several takeaways for anyone interested in sound design, interaction design, or creative innovation:

    1. Experiment Freely: Many of Andrew’s breakthroughs came from playful experimentation with new tools and ideas. Don’t be afraid to explore without a clear goal.
    2. Embrace Empathy: Understanding the user’s perspective is key to designing experiences that resonate emotionally and functionally.
    3. Prototype Iteratively: Start small, test often, and refine based on feedback. Prototyping is as much about learning as it is about building.
    4. Merge Creativity and Technology: Use technology as a tool to tell stories, evoke emotions, and create connections, rather than as an end in itself.

    Andrew Spitz’s career illustrates the power of curiosity and creativity in pushing the boundaries of what’s possible in design and technology. His work continues to inspire by showing how sound, interaction, and prototyping can come together to craft experiences that truly engage and delight.