Interactive media, from video games to art installations, have an increasing influence on our daily lives. We now spend much of our time inhabiting virtual worlds with their own rules of interaction, which allow dynamic and personalised experiences. Music that supports visual or narrative content has been shown to make these experiences more immersive, more emotionally engaging and more memorable. But designing music for interactive media is challenging, as it is not possible to pre-compose suitable music for every situation a person might find themselves in.
One way to address this is to model the situation the person is in and adapt the music accordingly. This PhD project developed a new framework for creating adaptive music for interactive media, drawing on cognitive science, media studies, musicology and artificial intelligence. The framework combines a new approach to modelling a person's emotions while using interactive media with cognitive models of how they make sense of the virtual world. It also uses methods for music composition inspired by improvisation techniques and by the ways in which living things adapt to different environments.