Brave new world
As DOMNIQ prepares to give the world premiere of Daniel Wohl’s Uncanny Valley with Philzuid in Eindhoven on 18 April, he explains how he collaborated with the composer as well as Artificial Intelligence, and why he thinks computers can serve musicians in the creative process
How did Daniel Wohl’s new percussion concerto Uncanny Valley come about?
I already knew Daniel’s music: a few years ago I had played a solo piece of his for percussion and electronics, which I really loved, as well as his quartet for the same forces.
He has a totally different sound world to most other composers. He creates very unusual textures for percussion, and the way he uses electronics is very interesting. I asked him if he wanted to collaborate on a project for percussion, electronics and orchestra, and we started talking about what we wanted to do.
I liked the idea of the orchestra as the subject. There is something so human about the concept of an orchestra: 60 or 80 musicians working together, connected through their brains, but also through their feelings and emotions. That’s very special and reflects how we are as humans. On the other hand, we also have all the electronics that go into computers and machines. These are two opposites and I find this juxtaposition very interesting.
We started speaking about the role of Artificial Intelligence: whether it could help us in the creative process and live on stage, and whether we actually wanted that. We talked about it a lot and these discussions resulted in this piece, Uncanny Valley, which is about the idea of humans versus machines.
We’re very grateful to Philzuid for giving us this opportunity. It’s very rare that we get so much trust to create something completely new, so it’s great that they’re so open. Eindhoven is one of the world’s important technological hubs so it’s exciting to bring to the city – and to that audience – a project that explores electronics, and to see what AI can do in the creative process.
How did you use Artificial Intelligence to create the piece?
In the early stages we asked AI text generators such as ChatGPT to come up with a form for the piece. We gave it our ideas and asked it, ‘What if we want to make a piece about this topic where we mix percussion with electronics and a symphony orchestra, and we have a structure where we can narrate this story?’ It came up with something: ‘First this, then that and that, and maybe you can do this, but you can also do this.’ We have no idea where it got this structure, but it actually made a lot of sense and gave us something to discuss. We didn’t include it in the composition in the end, but we took it as a starting point and it influenced our whole creative process.
We decided that there would be no AI per se in the live performance. There might be some effects, but those are not necessarily machine learning. We are not very interested in the tricks that AI can do, but it’s stimulating to be able to ask questions and broaden your scope.
What was the composition process like?
It was a very interesting collaboration – mainly online. We made a percussion setup with vibraphone, gong and a lot of junk percussion, such as tin cans and pieces of wood. I would do improvisation sessions online with Daniel and record all these sounds for him separately, with different sticks, mallets and ways of playing, so he could put them into his computer. He created lots of samples from my recordings, which he used in his composition process. For example, he sent me one long synthesizer note, over which I improvised so he could see what was interesting to use.
What does the piece sound like?
There are four movements. Sometimes it is a soundscape based on textures. For example, there is a moment when I play gong with effects, so you hear a cloud of sound and then the harp starts and a cello comes in. The third movement is more beat oriented – very rhythmic, based on my improvisations on tin cans. The fourth movement feels happier, with virtuosic vibraphone notes and rapid music, with the orchestra playing very fast.
The style is a mix of electro-acoustic music and Minimalism, and you can also hear some film music influences. We’ve touched on something that hasn’t been explored much yet and is very interesting – a completely new sound.
The piece is not based on virtuosity, unlike a lot of percussion or violin concertos, which are about showing off how virtuosic the performer is. I’ve found that many percussion concertos make me run around and play a lot of difficult notes. Virtuosity can be expressive, of course, but is that something you want to listen to at home, that you would put on to enjoy? Or is it more of a performance? There’s no right or wrong, it’s just a different style of music. This piece leans more towards listening music. I told Daniel that I don’t need a piece where I’m virtuosic. I want one where I can be musical, where the sounds I produce are interesting to listen to and there is something unique that triggers special moments.
We’re trying to see how we can expand the sound world of the orchestra and percussion with electronics, so there are microphones within the orchestra and I have a solo microphone, and the sounds are processed live. For example, the flute has a microphone and at one point plays a note with massive reverb, creating an enormous flute sound – something an audience will never have heard before from a symphony orchestra.
How do you see musicians working with AI in the future?
I think AI can be seen as a sparring partner or as a collaborator, but not necessarily as a decision maker. The composer or creator becomes a curator. AI may well take over some of the work, but on the other hand, if you learn how to collaborate with it, it can also create new opportunities.
There are always two sides: one is afraid and the other brings new things into the creative process. I’m very excited. I can give an AI engine a word that creates a sound, and then feed that sound into a synthesizer, and the synthesizer starts to unfold new patterns. Instead of me having to create this all by myself, I can have my ‘assistant’ make them on the go and I can start composing with them immediately. It’s very interesting to build your source material together with another thing, a machine.
I recently found a new AI synthesizer with virtual singers, and they sound ridiculously good. You can put in any lyrics and it ‘sings’ them – it’s very impressive. I’ve also been working on a driving bass rhythm that is very hard to play, whereas AI plug-ins can create it on the spot and instantly make eight variations of it.
For all that, AI cannot replace humans. The results are far from the emotional aspect of how humans perform and what happens in a live situation. If you use it as a creative collaborator, though, it can be a good partner. It broadens your starting point, but in the end, it’s still humans who are the creators.
DOMNIQ performs Uncanny Valley in Eindhoven on 18 April, Antwerp on 19 April and Schouwburg on 20 April.