Tag: Foley

  • Reimagining Sound in Live Theatre II

    Part 2: Engineering Actor-Controlled Sound Effects with IoS Devices

    This post builds on insights from Part 1 of our series on interactive theatre sound design. If you haven’t read it yet, check out Part 1: Collaborative Sound Design in Theatre.


    Rethinking Sound Control in Theatre

    Traditional theatre sound design relies heavily on off-stage operators using software like QLab to trigger pre-recorded cues. While reliable, this model limits spontaneity and performer agency. This research investigates a shift: giving actors direct control over sound effects using networked Internet of Sound (IoS) devices embedded in props or costumes.


    What Is the Internet of Sound?

    The Internet of Sound (IoS) is a subdomain of the Internet of Things (IoT), focused on transmitting and manipulating sound-related data over wireless networks. It includes:

    • IoMusT (Internet of Musical Things): Smart instruments with embedded electronics.
    • IoAuT (Internet of Audio Things): Distributed audio systems for production, reception, and analysis.

    This project leans toward the IoMusT domain, emphasizing performer interaction with sound-generating devices.


    Technical Architecture

    The workshop deployed 9 IoS devices built with Arduino MKR 1010 microcontrollers, chosen for their built-in Wi-Fi and affordability. Each device communicated via Open Sound Control (OSC) over UDP, sending sensor data to Pure Data patches running on local laptops.
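
    In practice, the firmware on each device only has to do three things: join the local Wi-Fi network, read its sensor, and send the value to the laptop as an OSC message over UDP. The sketch below is a minimal illustration of that pattern, assuming the WiFiNINA and CNMAT OSCMessage Arduino libraries; the network details, pin, and OSC address are placeholders rather than the workshop's actual configuration.

        // Minimal sketch: send one analogue sensor reading to Pure Data as an
        // OSC message over UDP. Assumes the WiFiNINA and OSCMessage (CNMAT)
        // Arduino libraries; network details and addresses are placeholders.
        #include <WiFiNINA.h>
        #include <WiFiUdp.h>
        #include <OSCMessage.h>

        const char* ssid = "theatre-router";        // hypothetical per-space network
        const char* pass = "********";
        const IPAddress laptopIp(192, 168, 1, 10);  // laptop running the Pd patch
        const unsigned int laptopPort = 9000;       // port the Pd patch listens on

        WiFiUDP udp;

        void setup() {
          while (WiFi.begin(ssid, pass) != WL_CONNECTED) {
            delay(500);                             // retry until the router accepts us
          }
          udp.begin(8888);                          // local port for outgoing packets
        }

        void loop() {
          int reading = analogRead(A0);             // e.g. an FSR or potentiometer on A0

          OSCMessage msg("/device1/sensor");        // hypothetical address scheme
          msg.add(reading / 1023.0f);               // scale the 10-bit ADC value to 0..1

          udp.beginPacket(laptopIp, laptopPort);
          msg.send(udp);                            // serialise the message into the packet
          udp.endPacket();
          msg.empty();                              // clear the message for reuse

          delay(20);                                // roughly 50 updates per second
        }

    On the laptop side, a Pure Data patch listening on the same UDP port (for example with netreceive and oscparse in recent Pd versions) unpacks the message and routes the value to the relevant playback or synthesis logic.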

    Sensors Used:

    • Accelerometers – for dynamic control (e.g., storm intensity)
    • Force Sensitive Resistors (FSRs) – for pressure-based triggers
    • Circular and Rotary Potentiometers – for pitch and volume control
    • Photoresistors – for light-triggered samples
    • Buttons – for simple cue activation

    Each performance space had its own router, enabling modular and fault-tolerant deployment.

    Hardware setup in the main theatre space.

    Hardware setup in the rehearsal space.

    Interaction Design

    Participants interacted with both pre-recorded samples and procedural audio models:

    Pre-recorded Samples:

    • Triggered via buttons, light sensors, or rotary knobs
    • Used for audience reactions, chorus sounds, and character cues

    Procedural Audio Models:

    • Spark – Triggered by button (gain envelope)
    • Squeaky Duck – Controlled by FSR (pitch modulation)
    • Theremin – Controlled by circular potentiometer (oscillator frequency)
    • Stormstick – Controlled by accelerometer (rain and thunder intensity)

    These models allowed for expressive, real-time manipulation of sound, enhancing immersion and authenticity.

    Circular potentiometer used to control a Theremin-type sound effect.

    An accelerometer within the ‘Stormstick’ controls the gain of a rain synthesis model and the trigger rate of a thunder one.
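
    That Stormstick mapping can be expressed in just a few lines of firmware: the magnitude of the accelerometer reading is normalised into a 0–1 intensity, which sets the rain gain directly and, scaled up, the thunder trigger rate. The sketch below is a rough illustration assuming a three-axis analogue accelerometer on pins A1–A3 and the same libraries and placeholder network settings as earlier; the project's actual smoothing, scaling, and OSC addresses may differ.

        // Stormstick-style mapping: one accelerometer drives both the gain of a
        // rain model and the trigger rate of a thunder model in Pure Data.
        // Wiring, scaling, and OSC addresses are illustrative assumptions.
        #include <WiFiNINA.h>
        #include <WiFiUdp.h>
        #include <OSCMessage.h>

        WiFiUDP udp;
        const IPAddress laptopIp(192, 168, 1, 10);
        const unsigned int laptopPort = 9000;

        // Helper: send a single float to the given OSC address.
        void sendFloat(const char* address, float value) {
          OSCMessage msg(address);
          msg.add(value);
          udp.beginPacket(laptopIp, laptopPort);
          msg.send(udp);
          udp.endPacket();
          msg.empty();
        }

        void setup() {
          while (WiFi.begin("theatre-router", "********") != WL_CONNECTED) {
            delay(500);
          }
          udp.begin(8888);
        }

        void loop() {
          // Centre the three 10-bit axis readings around zero (mid-point ~512).
          float x = analogRead(A1) - 512.0f;
          float y = analogRead(A2) - 512.0f;
          float z = analogRead(A3) - 512.0f;

          // Overall shake intensity, roughly normalised into 0..1.
          float magnitude = sqrtf(x * x + y * y + z * z) / 512.0f;
          float intensity = constrain(magnitude, 0.0f, 1.0f);

          // Gentle movement gives quiet rain; vigorous shaking raises the rain
          // gain and makes thunder triggers more frequent in the Pd patch.
          sendFloat("/stormstick/rain/gain", intensity);
          sendFloat("/stormstick/thunder/rate", intensity * 4.0f); // triggers per second

          delay(50);
        }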

    Participant Feedback & Findings

    Benefits:

    • Enhanced Timing – Actor-triggered cues improved synchronisation
    • Creative Freedom – Enabled improvisation and dynamic adaptation
    • Authenticity – Increased believability and audience engagement
    • Actor Agency – Encouraged deeper integration into the production process

    Challenges:

    • Reliability – Wi-Fi dropouts and device failures were noted
    • Cognitive Load – Actors expressed concern over added responsibilities
    • Integration – Costume and prop design must accommodate sensors
    • Audience Distraction – Poorly integrated devices could break immersion

    Engineering Considerations

    To ensure successful deployment in live theatre:

    • Robust Wi-Fi – Site-specific testing and fallback systems (e.g., QLab) are essential; a simple reconnection guard is sketched after this list
    • Thermal Management – Embedded devices must remain cool and accessible
    • Modular Design – Quick-release enclosures and reusable components improve sustainability
    • Cross-Department Collaboration – Early involvement of costume, prop, and production teams is critical
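
    On the Wi-Fi point above, one low-cost mitigation is a reconnection guard in the device's main loop, so a dropped link is retried automatically while the operator falls back to QLab. The sketch below illustrates the idea with the WiFiNINA library and placeholder credentials; it is a simplification, not the project's actual recovery strategy.

        // Simple reconnection guard: if the Wi-Fi link drops mid-performance,
        // keep retrying rather than locking up, so the off-stage operator can
        // fall back to QLab in the meantime. Credentials are placeholders.
        #include <WiFiNINA.h>

        void ensureWifi(const char* ssid, const char* pass) {
          if (WiFi.status() == WL_CONNECTED) {
            return;                 // link is healthy, nothing to do
          }
          WiFi.disconnect();        // clear any half-open association
          WiFi.begin(ssid, pass);   // blocks briefly while it tries to re-associate
        }

        void setup() {
          ensureWifi("theatre-router", "********");
        }

        void loop() {
          ensureWifi("theatre-router", "********");
          // ... read sensors and send OSC messages here ...
          delay(20);
        }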

    Sound Design Strategy

    Sound designers must consider:

    • Spot vs. Atmosphere – One-off effects may suit samples; dynamic ambiences benefit from procedural audio
    • Sensor Mapping – Choose intuitive controls (e.g., FSR for pressure-based sounds); a worked example follows this list
    • Actor Suitability – Confident performers are better candidates for device control
    • Rehearsal Integration – Early adoption helps reduce cognitive load and improve fluency
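
    To make the sensor-mapping point concrete, here is a small worked example of mapping FSR pressure onto pitch for a Squeaky Duck-style effect. An exponential curve is used because pitch perception is roughly logarithmic, so equal increases in squeeze feel like equal jumps in pitch; the frequency range and function name are illustrative assumptions rather than the workshop's actual values.

        // Illustrative pressure-to-pitch mapping for an FSR-driven squeak model.
        // The exponential curve makes the response feel perceptually even;
        // the 300-1200 Hz range is an assumption, not the workshop's setting.
        #include <cmath>
        #include <cstdio>

        // Map a normalised FSR reading (0 = no pressure, 1 = full squeeze)
        // onto a frequency between lowHz and highHz.
        float pressureToPitch(float pressure, float lowHz = 300.0f, float highHz = 1200.0f) {
          return lowHz * std::pow(highHz / lowHz, pressure);
        }

        int main() {
          for (float p = 0.0f; p <= 1.0f; p += 0.25f) {
            std::printf("pressure %.2f -> %.1f Hz\n", p, pressureToPitch(p));
          }
          return 0;
        }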

    Future Directions

    The next phase involves deploying IoS devices in a live pantomime performance in December 2025. Beyond this, distributed performances across locations (e.g., London and New York) could leverage IoS for synchronised, remote interaction.

    Exploration of alternative microcontrollers (e.g., Teensy) and operating systems (e.g., Elk Audio OS) may improve scalability and reliability.


    Conclusion

    Actor-controlled IoS devices represent a promising evolution in theatre sound design—merging technical innovation with artistic expression. While challenges remain, the potential for more immersive, responsive, and collaborative performances is clear.

  • Reimagining Sound in Live Theatre

    Part 1: Collaborative Sound Design in Theatre – A Workshop Approach

    In an age where immersive experiences are reshaping the boundaries of performance, sound design in theatre is undergoing a quiet revolution. A recent workshop held at The Dibble Tree Theatre in Carnoustie explored this transformation, bringing together actors, sound designers, and experimental technologies to co-create a new kind of theatrical soundscape.

    The Dame and the Baron, ready to collaborate with our sound designers!

    Why Sound Design Needs a Shake-Up

    Despite its central role in storytelling, sound design in theatre has lagged behind lighting and projection in terms of innovation. Traditional tools like QLab remain industry staples, but they often limit sound to pre-programmed cues triggered by operators. This workshop challenged that model by asking: What if actors could control their own sound effects live on stage?


    Collaboration at the Core

    The workshop was designed as a playful, hands-on experience. Participants—ranging from amateur theatre enthusiasts to experienced backstage crew—worked in small groups to rehearse and perform short pantomime scenes. They used Foley props (slide whistles, rain sticks, thunder tubes), pre-recorded samples, and procedural audio models to sketch out their sound designs.

    Importantly, actors and sound designers collaborated from the outset, rehearsing together and experimenting with timing, mood, and interaction. This flattened hierarchy fostered creativity and mutual learning.

    Long John Silver performing his actions along with a sound designer on a slide whistle.

    Enter the Internet of Sounds

    A standout feature of the workshop was the use of networked sound devices—custom-built tools powered by Arduino MKR 1010 boards and Pure Data software. These devices allowed actors to trigger sounds via sensors embedded in props or wearable tech. For example:

    • A motion sensor in a prop triggered audience reactions.
    • A rotary knob controlled volume and playback of samples.
    • An accelerometer and force-sensitive resistor enabled real-time manipulation of procedural audio.

    These embodied interfaces blurred the line between performer and sound operator, creating a more organic and responsive soundscape.
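
    For a rough idea of what sat inside these props, the sketch below shows the kind of Arduino firmware that could drive interactions like those above: a push button triggers a sample while a rotary knob sets its playback volume, both sent to the Pure Data patch as OSC messages. Library choices, pins, addresses, and network details are assumptions for illustration; Part 2 of this series covers the actual architecture in more detail.

        // Illustrative prop firmware: a button triggers a sample and a rotary
        // potentiometer sets its playback volume, both sent to Pure Data as OSC
        // messages over UDP. Pins, addresses, and credentials are placeholders.
        #include <WiFiNINA.h>
        #include <WiFiUdp.h>
        #include <OSCMessage.h>

        const int buttonPin = 2;   // momentary push button to ground
        const int knobPin = A0;    // rotary potentiometer wiper

        WiFiUDP udp;
        const IPAddress laptopIp(192, 168, 1, 10);
        const unsigned int laptopPort = 9000;

        void sendFloat(const char* address, float value) {
          OSCMessage msg(address);
          msg.add(value);
          udp.beginPacket(laptopIp, laptopPort);
          msg.send(udp);
          udp.endPacket();
          msg.empty();
        }

        void setup() {
          pinMode(buttonPin, INPUT_PULLUP);
          while (WiFi.begin("theatre-router", "********") != WL_CONNECTED) {
            delay(500);
          }
          udp.begin(8888);
        }

        void loop() {
          static bool wasPressed = false;
          bool pressed = (digitalRead(buttonPin) == LOW);

          // Trigger the sample once per new press, not continuously while held.
          if (pressed && !wasPressed) {
            sendFloat("/prop/sample/trigger", 1.0f);
          }
          wasPressed = pressed;

          // Continuously report the knob position as a 0..1 volume.
          sendFloat("/prop/sample/volume", analogRead(knobPin) / 1023.0f);

          delay(20);
        }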

    Sound designer studying the script with the Internet of Sound devices beside him.

    Sound designer performing the sounds on the Internet of Sound devices, with the script in the other hand, watching the stage to get her timing right.

    What Participants Learned

    Feedback was overwhelmingly positive. Participants reported:

    • Greater appreciation for the complexity of sound design.
    • Enjoyment of the collaborative and playful structure.
    • Insights into how sound design principles transfer to other media like film and radio.

    Challenges included cognitive load—especially for actors managing props, cues, and performance simultaneously—and occasional technical glitches with Wi-Fi connectivity.


    Key Takeaways

    • Actor-led sound triggering offers better timing and authenticity.
    • Early integration of sound design into rehearsals is crucial.
    • Embodied interaction (e.g., using props or wearables) enhances engagement.
    • Collaboration between departments—sound, props, costumes—is essential for success.

    Final Thought

    This workshop offered a fresh perspective on how sound can be more deeply integrated into live theatre. By inviting collaboration between actors and sound designers and experimenting with interactive technologies, it opened up new possibilities for creative expression. While challenges like reliability and cognitive load remain, the enthusiasm and insights from participants suggest that actor-led sound design is a promising direction worth exploring further.


    In Part 2, we explore the technical implementation of actor-controlled sound effects using Internet of Sound (IoS) devices. Stay tuned for a deeper dive into the engineering behind the performance.

  • Theatre Sound Design Workshop

    Date: Saturday 3rd May 2025
    Location: Dibble Tree Theatre, 99A High St, Carnoustie DD7 7EA
    Morning workshop: 09:30 – 12:30
    Afternoon workshop: 13:30 – 16:30
    Workshop Duration: 3 hours
    Tea, coffee and biscuits provided
    Suitable for individuals aged 16 years and over

    Step into the enchanting world of theatre with our exciting project! We’re opening the doors to everyone in our community, giving you a backstage pass to the magic of live performances. Dive into the fascinating realm of sound effects (SFX) and team up with actors to create a one-of-a-kind performance.

    This workshop is a unique adventure, perfect for anyone curious about the hidden roles in theatre or looking to boost their confidence. You’ll see how SFX can transform and elevate the emotions in a performance right before your eyes.

    Unlike typical arts-based workshops, where you collaborate with an artist, this experience lets you interact with sound-producing objects and sensors to craft unique sonic experiences. Work alongside actors to bring a unique performance to life, exploring everything from classic devices to cutting-edge synthesis models. Plus, you’ll discover how STEM skills can be applied to various media. It’s a fun, hands-on journey into the heart of theatre!

    Booking link for morning session (09:30 – 12:30): https://calendly.com/r-selfridge-napier/theatre-sounds-workshop-morning

    Booking link for afternoon session (13:30 – 16:30): https://calendly.com/r-selfridge-napier/theatre-sounds-workshop-afternoon

    Any issues or questions, please contact Rod Selfridge (r.selfridge@napier.ac.uk).