Submitted for ECS7022P Computational Creativity at Queen Mary University of London
The main.ipynb file contains the majority of this project's implementation, including an interface developed as a playground for experimenting with the system, as well as the code used to train it. Three other files contain class and function definitions, split out to keep the main notebook tidy. These are packaged as a Python project (see the pyproject.toml file) and include:
- `tokeniser.py`: implementations of a number of tokenisation methods which were experimented with (only `DrumSequenceEncoder` is used in the final system).
- `modules.py`: implementation of the custom transformer layers, which include a relative position encoding and the associated attention blocks.
- `utils.py`: helper methods for processing, displaying and playing MIDI data.
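The relative position encoding used in `modules.py` is not detailed here; as a rough illustration, a minimal single-head attention with a learned relative position bias (in the general style of Shaw et al.) could look like the sketch below. All names, shapes, and the clipping scheme are assumptions for illustration, not the project's actual code:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_attention(q, k, v, rel_bias, max_dist):
    """Single-head attention with a relative position bias (illustrative sketch).

    q, k, v: (seq_len, d) arrays.
    rel_bias: (2 * max_dist + 1,) learned biases, one scalar per clipped
              relative distance in [-max_dist, max_dist].
    """
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)  # (n, n) content-based scores
    # Relative distance j - i between each query i and key j, clipped and
    # shifted so it indexes into rel_bias.
    idx = np.arange(n)
    rel = np.clip(idx[None, :] - idx[:, None], -max_dist, max_dist) + max_dist
    scores = scores + rel_bias[rel]  # add position-dependent bias
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
n, d = 8, 4
out = relative_attention(rng.normal(size=(n, d)),
                         rng.normal(size=(n, d)),
                         rng.normal(size=(n, d)),
                         rng.normal(size=(9,)),  # 2 * max_dist + 1 biases
                         max_dist=4)
print(out.shape)  # (8, 4)
```

Because the bias depends only on the (clipped) distance between positions, the same attention pattern can be reused anywhere in the sequence, which suits repeating rhythmic material.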
Below are some example outputs of the system, paired with the input patterns which were used as prompts.
- Original: backbeat_original.wav
- Subtle Variation: backbeat_minimal.wav
- Pocket Beat: backbeat_tight.wav
- Lots of shifts: backbeat_shifts.wav
- Dense sampling: backbeat_dense.wav

- Original: amen_original
- Shifting: amen_shifts
- Breakbeats: amen_dense

- Original: be_my_baby_original
- Syncopated: be_my_baby_syncopated
- Improvised: be_my_baby_improvised