"AI has the potential to actually make us more human rather than less." - ilā Kamalagharan (ILĀ)
In this film, ilā Kamalagharan (ILĀ), artist, producer, composer and member of the Icelandic band Hrím, shows us how musicians can use AI not just as a tool but as a collaborative partner.
EMMA-LOUISE: ilā Kamalagharan is an artist, producer and composer finding new ways to use AI, not just as a tool but as a collaborative partner. We're here at the Barbican to meet ILĀ and take part in their latest exhibition.
ILĀ: We are at the Barbican Centre in an installation piece called Unbound, and we worked with Trans Voices Choir.
EMMA-LOUISE: Can you tell me more about how you used AI in this piece?
ILĀ: We actually created a data set of everyone's voices in the choir, so it only has that data in it. And that data got used to train six different models that we have used in these microphones that you see here. And when you sing into those microphones, your voice will be transformed into the voices of the choir. So it's much less generative AI and more transformative AI.
EMMA-LOUISE: So the sounds I'm hearing when I speak into the microphone—is that my voice?
ILĀ: So it's actually a mixture of your actual voice and your voice transformed into the choir's voices. So you're becoming many voices at once.
EMMA-LOUISE: What makes you excited about using AI in your music?
ILĀ: I like AI from the perspective of being able to extend what I can be and what I can do as a human. So, I think it can actually help you engage with creativity in a new way.
So we're at a really, really interesting point in history where large language models are trained on all sorts of things scraped from the internet, but without the permission of the creators. What that means is that we're at a time where human creativity is, in some sense, at risk.
The music I create has its roots in all sorts of different cultures and traditions, and AI itself doesn't necessarily understand that. But I think if it's trained on my work, then it will be drawing not just on me, but a whole kind of ancestral line of our shared history.
AI has the potential to actually make us more human rather than less. It's going to shape not only our lives but also our culture. I think it's a really great time to be getting into it.
ilā's journey
ilā is co-founder and director of the London Contemporary Voices choir, which has performed at fashion shows for Burberry and Tommy Hilfiger. ilā has also worked with artists such as Alt-J and U2, as well as with the BBC Proms and the Harry Potter stage play. The Bitesize Guide to AI team spoke to ilā to find out more about their journey to using AI tools in music production.
What inspired you to use AI?
As a kid, I used to take instruments apart just to see how they worked, then put them back together again - sometimes differently (sometimes accidentally breaking them!). That curiosity is still with me. My interest in AI stems from that same spirit of experimentation, but also from a deeper desire to understand consciousness itself. AI can feel like a mirror, reflecting our own patterns of thought - or lack thereof. I'm also creatively drawn to things that challenge or unsettle me, especially when they push me to question what it means to be human.
How did you learn about AI?
I started using computers to make music when I was about 9 or 10, and my interest in AI grew from there. As I got older, I became fascinated by questions like:
- Could a computer ever be conscious?
- What even is consciousness?
That curiosity led me to study Physics and Philosophy at the University of Bristol, which really deepened my understanding and broadened the way I think about these ideas today.
How has AI impacted your job?
Consent is everything. A big issue with many Large Language Models (LLMs) is that they’re trained on vast amounts of data scraped from the internet - often without artists’ permission. That’s something I’m strongly against.
One reason I’m drawn to Quantum Reservoir Computing is because it requires far less data to create meaningful results, making it more sustainable and more collaborative by nature. These smaller, lightweight models are also far less demanding in terms of electricity and water use, which matters to me.
In my own work, I rarely use large generative AI models. When I do, I train them on my own material. In UN/BOUND, for example, we collaborated with Neutone to train voice models using fully consensual recordings from the singers involved. Everyone collectively owns the resulting model, and together we decide how - or if - it gets used, and how contributors are compensated.
Where do you see AI going in the future?
I think AI and quantum computing have a really connected future. AI will help us build and improve quantum systems, and in return, quantum computing could give us the huge processing power we’ll need to take AI to the next level - like solving complex problems in science, medicine, and beyond.
But as exciting as all that is, we can’t ignore the ethical side. AI is already starting to change the way we live and work, so it's important we develop and use it responsibly, protect human creativity and jobs, and make sure the right safeguards are in place for everyone’s benefit.
What was your route to composing?
It all started at home. I was in my first "band" aged two with my older siblings - we called it Playskool. They’d play and shout “scream!” and my job was… well, to scream! Music has always been a part of my life. I grew up constantly singing, improvising, recording on dictaphones, phones and making up songs. It wasn’t something I formally studied at first - it just was. A natural, everyday part of being.
How the AI tool works
- Collaborating with the Trans Voices collective, part of the London Contemporary Voices choir, ilā has created an installation piece called UN/BOUND at the Barbican Centre in London.
- They created a dataset of everyone’s voices in the choir and used it to train six different AI models. Visitors could listen to the AI choir and join in with their own voice, creating an ever-evolving experience.
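The installation's actual audio pipeline isn't published, but ilā describes the output as "a mixture of your actual voice and your voice transformed into the choir's voices". As a minimal sketch of that final mixing step only, here is a hypothetical crossfade in Python; `blend_voices`, the `mix` parameter and the toy signals are invented for illustration, and the real model-transformation stage is stood in for by a ready-made array.

```python
import numpy as np

def blend_voices(original, transformed, mix=0.5):
    """Crossfade a visitor's own voice with the model-transformed
    choir version of it. mix=0.0 is all-original, 1.0 all-choir."""
    original = np.asarray(original, dtype=float)
    transformed = np.asarray(transformed, dtype=float)
    return (1.0 - mix) * original + mix * transformed

# Toy stand-ins for short audio buffers (real buffers would be
# thousands of samples per second of sound).
visitor = np.array([0.2, 0.4, -0.1, 0.3])
choir = np.array([0.1, -0.2, 0.5, 0.0])

out = blend_voices(visitor, choir, mix=0.5)  # equal blend of both
```

In a real installation this blend would run continuously on live microphone input, with the transformed signal coming from one of the six trained voice models.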
Did you know?
Quantum Reservoir Computing is a new field of computing that combines the principles of quantum physics with machine learning. Although still at a relatively early stage, it could be as accurate as conventional machine learning models while using less memory.
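Quantum reservoir computing needs quantum hardware, but its classical relative, the echo state network, illustrates the core idea ilā points to: a fixed, untrained "reservoir" transforms the input, and only a small linear readout is trained, which is why so little data (and compute) is needed. The sketch below is a classical toy, not quantum, and every name in it is invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: these weights are never trained.
n_res = 50
W_in = rng.normal(size=(n_res, 1)) * 0.5
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep dynamics stable

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict a sine wave one step ahead.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)  # train the readout only
pred = X @ W_out
```

Because training touches only the small readout vector `W_out` (here 50 numbers), rather than the whole network, this family of models can learn from far less data than a large generative model, the property ilā highlights above.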