Generative Approaches and Feedback Systems in Sound Design for Macbeth

Based on The Tragedy of Macbeth by William Shakespeare
Premiere: May 28, 2021 | Große Bühne, Tickets: Volkstheater München

Official Trailer


The Tragedy of Macbeth, the story of a murderous, relentless ruler trying to seize power over the future, premiered in 1606 and is unfortunately still part of the world we live in. New versions of a man feeding on power emerge constantly; they only have to be interpreted, shaped or understood accordingly, much like the initial prophecy of the Three Weird Sisters, which leads our protagonist to put faith in the path he chose, even though its bloody end is already revealed. This is a metaphor for the presidents and dictators of our time. To translate this message into the sound design for the play, I was inspired by Christoph Cox's idea of Sonic Flux, which understands sound as a continuous material flow to which human expressions contribute but which precedes and exceeds those expressions.
By using signals with no representational meaning (at least in the context of theater) as input for four different processes, I wanted to shape those sounds into meaningful entities that interconnect with the visual layer of the play. On this page I present the acoustic systems and processes I used to compose music for the piece and discuss the experiences I had applying those techniques in a theatrical context.
The uploaded audio files are single stems and simple recordings of these processes. To hear them in context, I highly recommend visiting the show (hit me up if you need a ticket) or requesting a guided tour through the Ableton Live set.

First notes and research

Cymbal Feedback Loops

The Tragedy of Macbeth immediately reminded me of feedback loops: the amplification of a signal by sending its output back into the input until it eventually becomes an unbearably harsh noise. To use this concept of feedback in a musical context, it is necessary to control the amplification inside the feedback loop. The paper on The Bistable Resonator Cymbal by Andrew Piepenbrink and Matthew Wright gave me the initial idea: send audio through a transducer (vibration speaker) into a cymbal, pick up the sound of the vibrating metal with a microphone, and feed it back into the transducer again. Sending audio into a physical object made it possible to interact with the sound in various ways, for example by touching and beating the cymbal or simply changing the microphone position. Cymbals of different sizes had different resonance frequencies, but the results were mostly drones with a rich harmonic spectrum.
It was also possible to play the cymbal rhythmically using slow and complex waveforms.
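The loop can be sketched digitally, with a tanh saturator standing in for the limiting that the physical cymbal (and the performer) provide. This is a simplified NumPy model, not the actual transducer/microphone chain, and all parameters are illustrative:

```python
import numpy as np

def feedback_loop(excitation, delay_samples=441, gain=1.2, n_samples=44100):
    """Simulate a controlled feedback loop: the delayed output is
    amplified and fed back into the input. The tanh saturator keeps
    the loop from running away into unbounded amplification."""
    y = np.zeros(n_samples)
    for n in range(n_samples):
        x = excitation[n] if n < len(excitation) else 0.0
        fb = y[n - delay_samples] if n >= delay_samples else 0.0
        y[n] = np.tanh(x + gain * fb)  # saturating feedback keeps |y| < 1
    return y

# A short noise burst stands in for the initial transducer signal.
rng = np.random.default_rng(0)
burst = rng.uniform(-0.5, 0.5, 2000)
out = feedback_loop(burst)
```

With a loop gain above 1, the burst sustains as a drone instead of dying out; interacting with the real cymbal corresponds roughly to changing `gain` or `delay_samples` mid-run.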


Signal flow chart #1

Oil Tank Echo Chamber

For this process I utilized a 2 m³ oil tank in the basement of the house I live in.
During construction work on the ground floor I had access to the basement for two evenings before the tank was removed by the construction workers. By sending audio through a transducer placed on top of the tank, the metal construction was transformed into a resonating echo chamber.
I mainly used simple sequences of waveforms as well as noise to explore the shaping capabilities of this physical signal path. The microphone hung inside the tank and fed the signal into an audio interface, which sent the audio back into the transducer again. Due to the tank's resonance properties it was possible to create dub-like chords and eerie soundscapes by slapping or hammering the metal.
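One way to picture what the tank does to a signal is a feedback comb filter with damping in the loop, since the metal absorbs more high frequencies on every round trip. This is an illustrative NumPy sketch under that assumption, not a model of the real tank:

```python
import numpy as np

def comb_echo(x, delay=4410, feedback=0.7, damping=0.3):
    """Feedback comb filter with a one-pole lowpass inside the loop:
    each repeat comes back quieter and darker, like an echo chamber."""
    y = np.zeros(len(x))
    lp = 0.0
    for n in range(len(x)):
        d = y[n - delay] if n >= delay else 0.0
        lp = (1.0 - damping) * d + damping * lp  # darken each repeat
        y[n] = x[n] + feedback * lp
    return y

# Impulse response: a single click produces a decaying train of echoes.
impulse = np.zeros(44100)
impulse[0] = 1.0
echoes = comb_echo(impulse)
```

The delay time corresponds to the round-trip time inside the chamber; the dub-like character comes from the repeats decaying while losing brightness.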


Signal flow chart #2

Feedback Loops for Signal Generators

Preset sheets I-III

Simple waveforms serve as the starting point for a feedback path with optional return points for the processed or unprocessed signal. To shape and process the signal I used various guitar pedals, filters and amplifiers. The output of this feedback setup is quite drastic and intense; adding a noise gate and a limiter inside the DAW (Digital Audio Workstation) prevented the loop from becoming a total overload. Those recordings were used in a scene where Macbeth's brutal downward spiral begins.
To be able to recall those feedback loops I created various preset sheets, but due to the large number of parameters it is very difficult to recall exactly the same settings again.
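The taming stage can be sketched as a deliberately simplified per-sample gate plus hard limiter. Real DAW processors use envelope followers with attack and release times, and the threshold values here are hypothetical; the point is only to show why the combination keeps a hot feedback loop usable:

```python
import numpy as np

def gate_and_limit(x, gate_threshold=0.05, ceiling=0.9):
    """Mute everything below the gate threshold (kills low-level hiss
    that would otherwise re-enter the loop), then hard-clip everything
    above the ceiling (stops runaway amplification)."""
    gated = np.where(np.abs(x) < gate_threshold, 0.0, x)
    return np.clip(gated, -ceiling, ceiling)

signal = np.array([0.01, -0.02, 0.5, 1.8, -2.5])
tamed = gate_and_limit(signal)  # -> [0.0, 0.0, 0.5, 0.9, -0.9]
```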


Signal flow chart #3

Four Band Resonance

Feeding simple white noise into four independent equalizers with very narrow bandwidths made it possible to amplify certain frequency ranges of the constantly changing input signal.
By changing the resonance frequency of each band, it was possible to create choir-like atmospheres. Further processing was added with a delay and a reverb.
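This process can be approximated offline with SciPy's resonant peak filters. The center frequencies and Q below are illustrative assumptions, not the settings used in the production:

```python
import numpy as np
from scipy.signal import iirpeak, lfilter

fs = 44100
rng = np.random.default_rng(1)
noise = rng.standard_normal(fs)  # one second of white noise

# Four narrow resonant bands tuned to a chord (illustrative frequencies,
# roughly A3, C#4, E4, A4). High Q -> very narrow bandwidth -> the noise
# rings at the band's pitch instead of sounding like hiss.
freqs = [220.0, 277.2, 329.6, 440.0]
chord = np.zeros(fs)
for f in freqs:
    b, a = iirpeak(f, Q=80.0, fs=fs)
    chord += lfilter(b, a, noise)

chord /= np.max(np.abs(chord))  # normalize before playback or export
```

Sweeping the four center frequencies over time is what produces the choir-like movement; the random input keeps the texture alive even with static settings.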


Signal flow chart #4


Final Ableton live set

Using generative approaches and feedback loops for theater was an insightful experience and helped a lot in shaping a unique acoustic experience for the play. Since the signal was processed by various physical variables, like material and space, frequencies outside the European standard twelve-tone tuning became more dominant. This made it exciting to combine those recordings with just-intonation or atonal recordings.
One of the biggest challenges in working with continuous sound was keeping track of the huge amount of recorded audio and finding the right parts for the specific energy levels of the play. Especially in theater, moods and energies can shift from rehearsal to rehearsal, even after the premiere, when a finished project file has to be handed over to the theater's technicians. It might be different in an improvisational context or when playing those processes live on stage, but it was not possible to rely entirely on the output of those systems. It was also difficult to recreate those signal chains in different locations (studio, rehearsal stage, theater accommodation), since setting up those systems is quite tedious and depends on physical variables that change with every new location.
I ended up using a hybrid setup, preparing as much material as possible in advance or in the early stages of rehearsals, just by thinking myself into the play. Many recordings ended up not being used at all, since they simply did not fit. All other recordings were tested at the final stage of the rehearsals, and missing sections were produced on site with various portable synthesizers. The director and I wanted to create a cinema-like experience with a constant soundscape.
Summing up, I really enjoyed approaching sound creation for a specific task, and it opened up new theoretical directions toward a deeper, multidimensional understanding of sound and its perception. This approach might not work with every form of text and play, but for Macbeth in particular it was great to communicate the headspace and brutality sonically, instead of flooding the stage with blood or using props. The stage designer's decision to use only two metal scaffolds as a backdrop made it possible to let sound become the actual setting for the evening.

Work in progress…


Director: Phillip Arnold
Playwright: Rose Reiter
Cast: Jakob Immervoll, Anne Stein, Henriette Nagel, Jan Meeno Jürgens, Jonathan Müller, Max Poerting
Stage design: Viktor Reim
Costume design: Julia Dietrich
Music: Adel Akram Alameddine


The Bistable Resonator Cymbal by Andrew Piepenbrink and Matthew Wright

Sonic Flux by Christoph Cox