BY STEPHANIE ESLAKE
Think artificial intelligence will never take over your creative work?
Here’s a project set to challenge you.
Composer Alistair McLean was interested in what this meant for him, so he initiated No New Noise – an improvisation event in which three musicians contemplate the impact of AI on their work. And you’ll get to watch.
Alistair, founder of the Australian Creative Music Ensemble, has produced game music directed by a computer brain. It doesn’t rely on human input, and it acts as the creator of the music – instructing performers in real time. It’s called Electric Sheep.
The show also features pianist-composer Joseph O’Connor, who contrasts amplification with acoustic sounds in Partial Disclosure, while trumpeter-composer Reuben Lewis conducted research for his piece, which considers whether human work can be distinguished from that produced by chatbots.
No New Noise is ACME’s debut event and it’ll take place at the Melbourne Festival. We chat with Alistair about why his event is worth contemplating.
Tell us about how this project was born.
No New Noise was born out of my fascination with artificial intelligence. There’s currently a lot of discussion about AI and automation replacing a wide range of jobs, but it’s often thought that more ‘creative’ roles are safe. When I reflected on the history of technology in music – initially recorded music replacing live performance, then drum machines and synthesisers replacing human musicians on recordings, and currently sampled instruments replacing whole orchestras on film scores – it suddenly felt that being a musician wasn’t a very safe place to be.
It’s a fascinating concept – having composers musically weigh in on the question of AI replacing their work. How important is it for composers and musicians to be thinking about the impact of technology, these days?
It’s become essential for composers and musicians to be aware of technology, and more and more important for them to engage with it and integrate it into their practice. Technology is the creation of tools and skills, so I like to think of AI as being part of a chain of discoveries we’ve made in the advancement of music, from bone flutes to pianos to self-generating computer-based music. We’ve long debated the merit of equal temperament and just intonation, but I think most would agree they both have their uses and a great modern musician is adept at both. In the same way, I hope that familiarity with technology becomes an essential part of the practice of a contemporary musician or composer.
Getting the project underway, what’s the general feeling that you’ve gauged from fellow composers about this? Is there a fear of losing your work, an excitement for new sonic possibilities, or something else entirely?
For a long time, there has been excitement about the new possibilities that are on offer, but I think that is starting to be tempered with caution. It’s somewhat ironic, as the use of technology has been a huge boon for composers precisely because it’s made musicians a little more redundant! I don’t know many composers who still write with paper and pencil; and once you’re composing with a computer notation system, you can easily mock up a great-sounding iteration of a piece with sampled instruments, saving huge amounts of time and money. But now there is a risk that the composers themselves will be made redundant through a continuation of the same process.
In your view, how real is the risk that artists will one day be made redundant by technology?
I think it’s a certainty that a lot of music and art will be replaced by technology. We’re already seeing music in video games being created by generative programs, and I believe this will increase. I see the artist’s role as commenting on society and staying ahead of the trends – something that will always be relevant. However, the more I explore the area, the more I believe that true creativity from AI is not as far off as we would like to think.
Tell us about your Electric Sheep. How’d you come up with the idea for this composition project?
Electric Sheep is a piece of game music, stemming from the tradition of John Zorn’s Cobra and similar works. In Cobra, a director leads the ensemble via a set of visual cues, which force the musicians into unfamiliar situations and often lead to unexpected outcomes.
Whilst I always enjoyed playing the game, I thought it was limited by the number of actions the director could give at once, and had dreamed of a version where each player received individualised actions at the same time. I realised this could be achieved much more easily with a computer program than with a human director, and then began the much trickier task of finding out whether it was possible to create a computer program with an aesthetic sense.
How does this work in a physical sense?
We have a large visual score, which both the ensemble and audience can see during performance. Here, individual musical instructions are given out to the players, asking them to perform a number of tasks including accompanying or mimicking each other, playing contrasting ideas or following a given tempo. A number of modifiers can also be given, instructing the performers to play in a certain tonality, timbre or mood. As the audience can see the score, they can follow along with the gameplay and see how the commands are interpreted by the musicians. A lot of the interest – and often, humour – stems from being asked to perform tasks that are close to impossible. If you’re playing cello, how do you mimic a drumkit whilst also playing in a mournful style in Eb major?
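For readers curious about the mechanics, the logic of such a system can be sketched in a few lines of code. The example below is purely illustrative – a minimal, hypothetical generator that deals out simultaneous, individualised instructions with optional tonality, timbre and mood modifiers, in the spirit of what Alistair describes; it is not ACME’s actual program, whose internals haven’t been published.

```python
# Hypothetical sketch of a per-player cue generator, in the spirit of
# Electric Sheep. All names and values here are illustrative assumptions.
import random

ACTIONS = ["accompany", "mimic", "contrast", "follow tempo"]
TONALITIES = ["Eb major", "C minor", "whole tone"]
TIMBRES = ["breathy", "percussive", "muted"]
MOODS = ["mournful", "playful", "frantic"]

def deal_cues(players, rng=random):
    """Return one instruction per player, issued simultaneously."""
    cues = {}
    for player in players:
        action = rng.choice(ACTIONS)
        cue = {"action": action}
        # Actions that reference another musician pick a target at random.
        if action in ("accompany", "mimic"):
            others = [p for p in players if p != player]
            cue["target"] = rng.choice(others)
        elif action == "follow tempo":
            cue["tempo"] = rng.choice([60, 90, 120, 160])
        # Optional modifiers: tonality, timbre, mood.
        if rng.random() < 0.5:
            cue["tonality"] = rng.choice(TONALITIES)
        if rng.random() < 0.5:
            cue["timbre"] = rng.choice(TIMBRES)
        if rng.random() < 0.5:
            cue["mood"] = rng.choice(MOODS)
        cues[player] = cue
    return cues

if __name__ == "__main__":
    # e.g. a cellist might be told to mimic the drumkit, mournfully, in Eb major
    for name, cue in deal_cues(["cello", "trumpet", "piano"]).items():
        print(name, cue)
```

A real version would also need the ‘aesthetic sense’ Alistair mentions – weighting choices over time rather than drawing them uniformly at random – and a display layer so that both players and audience can read the cues as they appear.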
So, if you’ve created a project that produces a computer ‘brain’, and the brain is directing the ensemble, do you feel that you are still in control of this work? How does this idea feed into your music?
I’m definitely not in control of the performances. I have the opportunity to suggest things to the computer between performances – perhaps ‘a little bit more of this’, or ‘less of this’ – but once a performance begins, the computer brain is in charge. It’s a little scary not knowing what the outcome will be, but also very liberating, and it often feels like performing someone else’s work. It definitely made me question whether I can lay claim to composing the piece, or merely to facilitating its creation; and whether that’s a role that composers will have to become more comfortable with as we realise the creative possibilities of AI.
What do you hope audiences will take away and muse upon during and after your show?
There are a lot of themes in the show: we explore current uses of technology in the production and alteration of recorded music and the AI generation of compositions in real time, and we question how we perceive intelligence in both humans and machines. I hope that audiences will leave receptive to the idea of machines developing their own creativity, and aware that this may not be as far away as we think.
Any parting words?
Perhaps it’s best not to watch Terminator, Blade Runner or The Matrix immediately before or after the show.
Watch the Australian Creative Music Ensemble present No New Noise at the Melbourne Festival, 6 October.
Images supplied.