Adaptive music for immersive experiences

Using machine learning and cognitive science to help composers adaptively score interactive media

Interactive media, from video games to art installations, have an increasing influence on our daily lives. We now spend much of our time inhabiting virtual worlds with their own rules of interaction, which allow dynamic and personalised experiences. Music that supports visual or narrative content has been shown to make these experiences more immersive, more emotionally engaging and more memorable. But scoring interactive media is challenging, because it is not possible to pre-compose suitable music for every situation a person might find themselves in.

One way to address this is to model the situation the person finds themselves in and adapt the music accordingly. This PhD project developed a new framework for creating adaptive music for interactive media that draws on cognitive science, media studies, musicology and artificial intelligence. The framework combines a new approach to modelling a person's emotional state while they use interactive media with cognitive models of how they make sense of the virtual world. It also uses composition methods inspired by musical improvisation techniques and by the way organisms adapt to different environments.
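The actual emotion model is described in the publications listed below; purely as an illustrative sketch (the event names, weights and decay behaviour here are invented for this page, not taken from the project), a player's affective state can be pictured as a point in a two-dimensional valence–arousal space that recent game events nudge around:

```python
from dataclasses import dataclass

# Hypothetical event weights: each in-game event nudges the modelled
# state along the valence (unpleasant/pleasant) and arousal
# (calm/excited) axes. Values are illustrative only.
EVENT_WEIGHTS = {
    "enemy_spotted": (-0.3, +0.5),
    "item_collected": (+0.4, +0.1),
    "player_damaged": (-0.5, +0.4),
    "quiet_exploration": (+0.1, -0.3),
}

@dataclass
class EmotionModel:
    valence: float = 0.0   # -1 (negative) .. +1 (positive)
    arousal: float = 0.0   # -1 (calm) .. +1 (excited)
    decay: float = 0.95    # state relaxes toward neutral over time

    def update(self, event: str) -> None:
        dv, da = EVENT_WEIGHTS.get(event, (0.0, 0.0))
        self.valence = max(-1.0, min(1.0, self.valence * self.decay + dv))
        self.arousal = max(-1.0, min(1.0, self.arousal * self.decay + da))
```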

Patrick's Adaptive Music System in action

The framework was used to build an adaptive music system for games that generates scores in real time, driven by events happening in the game world. Compared with the games' official soundtracks, the system was found to match music to in-game events more closely and to increase feelings of immersion.
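The IEEE Transactions on Games paper below describes the actual system design; the following hedged sketch (the mapping, ranges and parameter names are all hypothetical) only illustrates the general idea of turning a modelled affective state into the broad musical controls a real-time generator could follow:

```python
def music_parameters(valence: float, arousal: float) -> dict:
    """Map an affective state to broad musical controls.

    Illustrative mapping only: higher arousal suggests a faster tempo
    and denser texture; positive valence suggests a major mode.
    """
    return {
        "tempo_bpm": int(70 + 70 * (arousal + 1) / 2),  # 70-140 BPM
        "mode": "major" if valence >= 0 else "minor",
        "intensity": (arousal + 1) / 2,                 # 0-1 layer density
    }

# Example: a tense combat moment (negative valence, high arousal).
print(music_parameters(valence=-0.4, arousal=0.7))
# -> {'tempo_bpm': 129, 'mode': 'minor', 'intensity': 0.85}
```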

These results support the framework both as a practical tool for scoring interactive media and as a contribution to theoretical knowledge in this area.

Project members

Patrick Hutchings
Jon McCormack (PhD Supervisor)
Vince Dziekan (PhD Supervisor)


Outcomes

Hutchings, P. (2017). Talking Drums: Generating drum grooves with neural networks. International Workshop on Deep Learning in Music (IJCNN 2017). arXiv preprint arXiv:1706.09558.

Hutchings, P., & McCormack, J. (2017). Using Autonomous Agents to Improvise Music Compositions in Real-Time. In International Conference on Evolutionary and Biologically Inspired Music and Art (pp. 114–127). Springer, Cham.

Hutchings, P., & McCormack, J. (2019). Adaptive Music Composition for Games. IEEE Transactions on Games. DOI: 10.1109/TG.2019.2921979.