Adaptive Music Composition for Games

@article{Hutchings2019AdaptiveMC,
  title={Adaptive Music Composition for Games},
  author={Patrick Hutchings and Jon McCormack},
  journal={IEEE Transactions on Games},
  year={2019},
  volume={12},
  pages={270-280}
}
The generation of music that adapts dynamically to content and actions has an important role in building more immersive, memorable, and emotive game experiences. To date, the development of adaptive music systems (AMSs) for video games is limited both by the nature of algorithms used for real-time music generation and by the limited modeling of player action, game-world context, and emotion in current games. We propose that these issues must be addressed in tandem for the quality and flexibility…

Citations

An adaptive music generation architecture for games based on the deep learning Transformer model

This paper presents an architecture for generating music for video games based on the Transformer deep learning model. Our motivation is to be able to customize the generation according to the taste …

An Analysis of Repetition in Video Game Music

Video game music, unlike other forms and genres of music, is comparatively young in its development, and undoubtedly heavily influenced by the hardware it was originally played on. This study …

Neo-Riemannian Theory for Generative Film and Videogame Music

Music is an essential element of films and videogames, which strongly contributes towards an immersive experience, by establishing a setting and mood, enhancing the storyline, and developing …

Feasibility of Music Composition Using Deep Learning-Based Quality Classification Models

Shuang Zhang · Wireless Communications and Mobile Computing · 2022
A deep learning-based quality classification model is used to assess music composition feasibility, and the experimental results show that the algorithm offers fast detection speed and high quality.

Procedural Content Generation through Quality Diversity

In the last few years, a handful of applications of QD to procedural content generation and game playing have been proposed; this work discusses these and proposes challenges for future work.

Minimizing the Total Tardiness of a Game Project Considering the Overlap Effect

This study proposes two scheduling algorithms to reduce the total tardiness of a game project; if the problem size is small, a branch-and-bound algorithm is employed to provide the optimal schedules; otherwise a genetic algorithm is used to generate near-optimal schedules.

References

Showing 1-10 of 60 references

The soundtrack of your mind: mind music - adaptive audio for game characters

An experimental application of individualized adaptive music for games is described, using an affective model that can be integrated into player characters to increase believability and to support research into believable agents.

Towards an Emotion-Driven Adaptive System for Video Game Music

This research aims to create an audio system for video games that improves the experience by adapting environmental music to emotions associated with the ongoing narrative, using player behavior and emerging feelings as main cues.

AudioInSpace: Exploring the Creative Fusion of Generative Audio, Visuals and Gameplay

A multifaceted procedural content generation (PCG) approach is introduced, based on the interactive evolution of multiple artificial neural networks that orchestrate the generation of visuals, audio, and gameplay.

Game Sound Technology and Player Interaction: Concepts and Developments

This chapter treats computer game playing as an affective activity, largely guided by the audio-visual aesthetics of game content (of which, here, we concentrate on the role of sound) and the …

Audioverdrive: Exploring Bidirectional Communication Between Music and Gameplay

An arcade-style 2D side-scrolling game, Audioverdrive, is discussed, demonstrating an integrated music/game composition approach and how game-design decisions feed back into the music-composition process.

Affective evolutionary music composition with MetaCompose

The results of these studies demonstrate that each part of the generation system improves the perceived quality of the music produced, and that valence expression via dissonance influences the perceived affective state.

Evaluating musical foreshadowing of videogame narrative experiences

We experiment with mood-expressing, procedurally generated music for narrative foreshadowing in videogames, investigating the relationship between music and the player's experience of narrative …

Using Autonomous Agents to Improvise Music Compositions in Real-Time

This paper outlines an approach to real-time music generation using melody- and harmony-focused agents in a process inspired by jazz improvisation, and finds that implementing embedded spaces in the LSTM encoding process results in significant improvements to chord sequence learning.

Procedural Content Generation in Games

This book presents the most up-to-date coverage of procedural content generation (PCG) for games, specifically the procedural generation of levels, landscapes, items, rules, quests, or other types of …

Time perception, immersion and music in videogames

It is found that the addition of music does alter time perception, but only in one paradigm, suggesting that music could be an important factor in the perception of time whilst playing videogames.
...