DAACI Main

AI innovators DAACI on their groundbreaking new composer-aiding technology

At the crest of a wave of increasingly astounding AI-led compositional software, DAACI has vast potential for generating melody, sound and texture. Now, the company quests towards a Metaverse-leaning future…

 


As a 2020s music-maker, debating the pros and cons of AI-generated tracks has become one of our regular pastimes. From platforms that reliably construct ready-to-go soundtracks on the fly, to those that nudge composers into certain niches, the question of whether the growing surge in computer-driven creativity is a good or a bad thing keeps many of us up at night. One thing that’s undeniable is that the quality of these algorithmically designed works is getting better.

Enter DAACI: a fully formed artificial intelligence, capable of composing, arranging, orchestrating and producing completely original music in real time. An acronym for Definable Aleatoric Artificial Composition Intelligence, the DAACI software doesn’t rely on human-crafted samples or existing frameworks, instead forming its own musical architecture and often going above and beyond what composers are capable of.

We had a conversation with DAACI’s CEO Rachel Lyske to learn more about this intriguing software, and how DAACI might end up benefiting modern composers…

AMI: Firstly, can you give us an overview of DAACI, and how its AI-led tech is able to construct musical elements in real-time?

Rachel Lyske: The best way to answer that is to start thinking like a composer. In the compositional process, composers have options and choices for what they can do to achieve their end result. They have their defined options depending on what it is they’re trying to say. They’re not going to choose certain musical options that don’t meld well together (sad music during a car chase, for example). So there are always many intelligent constraints over the options they choose.

So what we do at DAACI is encode that series of options based on an input, and then the computer can present those options in real time to compose for whatever brief we need to fulfil. Hence we can create this dynamic and limitless music, because we’re not static. That’s how DAACI works: it’s a composition brain that acts in the same way a composer would.
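In software terms, one way to picture that idea of “intelligent constraints over options” is a rule-based filter over a palette of musical choices. The sketch below is purely illustrative – the option names, briefs and matching logic are our own invention, not DAACI’s actual design:

```python
# Illustrative sketch only: selecting musical options under brief-driven
# constraints. All names and rules here are hypothetical, not DAACI's design.

BRIEF_CONSTRAINTS = {
    # brief -> musical properties an option must satisfy
    "car_chase": {"tempo": "fast", "mood": "tense"},
    "love_scene": {"tempo": "slow", "mood": "warm"},
}

OPTIONS = [
    {"name": "driving ostinato", "tempo": "fast", "mood": "tense"},
    {"name": "sad lament", "tempo": "slow", "mood": "sad"},
    {"name": "lush strings", "tempo": "slow", "mood": "warm"},
]

def options_for(brief: str) -> list[str]:
    """Return only the options whose properties match the brief's constraints."""
    wanted = BRIEF_CONSTRAINTS[brief]
    return [o["name"] for o in OPTIONS
            if all(o.get(key) == value for key, value in wanted.items())]

print(options_for("car_chase"))  # prints ['driving ostinato']
```

The point of the sketch is the filtering step: as Lyske describes, sad music is never even offered as an option for a car chase, so anything the system assembles in real time stays inside the composer’s constraints.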

AMI: How does DAACI interpret a composer’s input?

RL: If we truly understand what a brief is – especially in music – it doesn’t really communicate a deeper meaning on its own. What music does is give you an emotional connotation. We can annotate the emotional connotations within specific musical choices, and what the emotional connotations will be if we use certain cues. Consequently, when we go through a brief and someone tells us they want it to be ‘happy’ or ‘scary’ or ‘tense’, we can look at this and see how combining certain options can lead to the end result that they need.

One half of how it works is an analysis process, and the other half is this meta-compositional process. So we’re helping people determine what they want to say emotionally – and how, musically, they can say it. Then we’re aiding them to combine elements from different places, to create bespoke reactions to that brief.

So the question kind of answers itself. It’s an emotional language. Straight away as a composer, you know what you need to fulfil certain briefs, and you bring certain elements together to make music that hits a certain emotional target.
 

Rachel Lyske, DAACI CEO

 

AMI: Do you see DAACI’s unique approach to AI-based music composition as more of a system that works in tandem with the composer, as opposed to a replacement?

RL: It is very much working in tandem with the composer. DAACI isn’t a replacement for a composer, it’s an enhancement of their process. They might choose it to replace their process but it’s certainly not replacing *the composer*. In reality, most commercial composers already have heaps of options to play with, and we’re just providing a similar mechanic for everyone else. We respect that composers do that, and we’re enhancing that approach.

AMI: You’ve stressed that video game composition is a particular area where DAACI might prove to have a strong impact – why do you think this is?

RL: Well the gaming market is massive and it’s only growing. As we get closer to the Metaverse and Web 3.0 it’s only going to swell. There’s no way that smaller composers can fulfil a lot of the huge demands that writing for interactive mediums entails. With this tool you can express your intent, and DAACI will do the rest.

As an early experiment, even with just three inputs we worked out that it could generate around 5×10¹¹ variations. It exceeded that, actually. To put that in context, Spotify holds around 82 million tracks, and that’s roughly 6,400 times smaller than the number of options available via just the three inputs we entered for our brief. So we’ll invite any gaming company who wants to explore all that with us.
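For what it’s worth, those two figures line up – a quick sanity check, using only the rough numbers quoted above:

```python
# Sanity-check the scale claim: ~82 million Spotify tracks, and a catalogue
# said to be ~6,400 times smaller than the number of generated variations.
spotify_tracks = 82_000_000
ratio = 6_400

variations = spotify_tracks * ratio
print(f"{variations:.2e}")  # prints 5.25e+11, i.e. just over 5 x 10^11
```

So 82 million tracks multiplied by 6,400 gives roughly 5.2×10¹¹, consistent with the “around 5×10¹¹, and it exceeded that” claim in the interview.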

AMI: Is there any other software out there similarly innovating in the field of AI-based music composition, and what marks DAACI out as different from the likes of AIVA and Amper Music?

RL: What we are excited about is that the world is opening up, and AI and composition isn’t a terrifying prospect anymore. As individuals we’ve got around thirty years of experience at DAACI, so there’s a real maturity to our approach. Getting back to that idea that DAACI is the core of the system, what we’re not doing is feeding it a load of scores and saying ‘alright, make me something that sounds kind of like that’. We’re not trying to extract some kind of truth about the music from a deep neural network. What we’re doing is saying ‘we’ve got the intent’, and we can craft a meta-composition.

It’s not a trivial thing. The majority of us are professional musicians and artists, even amongst the coders, and that’s been one of our unique aspects. We really think we are unique in our approach. The other approaches are only going to get you so far. I absolutely think we’re the only ones doing this.

AMI: How will DAACI be rolled out, then – will it be a web platform or an app?

RL: Ultimately the system is designed to benefit the composer and there’ll be a composer tool for them to use. But that’s only the start of a productivity tree that the benefits of DAACI will flow through. The commercial applications of this brain can go into many different products, just like the end result of a piece of music written by a composer. It’s the same with DAACI. It’s about creating a new framework.

AMI: The Innovate UK investment was a high-profile advocacy of DAACI – how competitive was the process of winning funding, and can you talk about how the investment will enable you to build on the company’s objectives?

RL: We are extremely thankful to Innovate UK for that. It’s absolutely brilliant. The recognition and the support have been fantastic, and it’s great that they are recognising what we’re trying to do. It was a really rigorous process – like doing due diligence on an investment, a real deep dive into everything. I think there were 1,072 applications and only 71 were funded.

The main aim was to recognise game-changing, innovative and ambitious ideas that they think will significantly impact the UK economy for good. For us, having that investment has been extremely powerful. It’s allowing us to enhance our R&D side and pursue more research. We’re massively thankful to them for that and their ongoing support.

AMI: Do you think that AI-led content and art generation is going to be a massive part of our lives across the next few decades, and are we just starting to see a tidal wave of AI-led applications? Particularly as innovative ideas like the Metaverse become more widespread?

RL: I do, and I think it’s an incredibly exciting time right now, and an incredibly empowering one. DAACI is riding the crest of that wave, anticipating what the needs of the future will be. We’re making music intelligent in its environment, and that’s essential for the future.

AMI: What would you say to those fearful of the perceived encroachment of AI into the music composer’s marketplace?

RL: If you were asking my brother that question, he might not be as polite as me. My brother is the inventor of DAACI, and it’s been built on a lifetime of research. Essentially he’s a composer, I’m a composer, and we didn’t just wake up one day and decide to do this. It’s something that’s been a lifelong obsession. We genuinely believe that the landscape of how music is created and experienced is changing. If we can enhance and empower, then why wouldn’t you try to do that?

AMI: How do you see DAACI evolving further in the future, and what’s next for the company?

RL: I think it will be an integral part of composers’ workflows. Our CCO Ken often refers to a great quote from Chris Cooke (CMU Insights and Midem, 2008): “The history of the music industry is basically a story about how a sequence of new technologies respectively transformed the way music is made, performed, recorded, distributed and consumed.” Essentially, there’s been a series of leaps over the last 100 years, and DAACI is the next phase in that evolution, particularly as the digital world starts opening up. Composers become meta-composers. The users can become composers. It can be democratised. Things don’t stop, they evolve.

For more information on DAACI, visit daaci.com