The music-hacker collective Dadabots used a recurrent neural network (RNN) to create an endless bass solo that is streamed live to YouTube.
It was trained on performances by the well-known bass player Adam Neely. Dadabots fed the recordings through SampleRNN, a neural sound-generation model that analyzes fragments of a piece and, based on them, predicts which samples should come next.
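The "predict what comes next" idea is autoregression: the model repeatedly predicts the next audio sample from everything generated so far and appends it, so the stream can run indefinitely. A minimal sketch of that loop, where the trained network is replaced by a hypothetical `predict_next` stand-in:

```python
def generate(predict_next, seed, n_steps):
    """Autoregressive generation: repeatedly predict the next sample
    from the context so far and append it. `predict_next` stands in
    for a trained model such as SampleRNN."""
    out = list(seed)
    for _ in range(n_steps):
        out.append(predict_next(out))
    return out

# Toy "model": continue with the average of the last three samples
toy = lambda ctx: sum(ctx[-3:]) / 3
stream = generate(toy, [0.0, 1.0, 2.0], 3)
print(stream)  # 6 samples: the 3-sample seed plus 3 generated ones
```

Because each step only needs the context built so far, the same loop that produces six samples can just as well run forever, which is what makes a 24/7 stream possible.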
The first attempt turned out to be too noisy, so the team edited the original dataset and lowered the sampling rate. This reduced the noise level and helped the neural network work with longer patterns. About two hours of Neely's bass improvisations were used to train the AI.
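Why lowering the sampling rate helps with longer patterns is easy to see in code: the same second of audio spans fewer timesteps, so a model with a fixed context window "hears" a longer stretch of music. A sketch under simplified assumptions (block averaging is a crude stand-in for proper low-pass resampling, not Dadabots' actual preprocessing):

```python
import numpy as np

def downsample(wave: np.ndarray, factor: int) -> np.ndarray:
    """Reduce the sampling rate by an integer factor via block averaging.
    (Crude anti-aliasing; real pipelines use a proper low-pass filter.)"""
    n = len(wave) - len(wave) % factor      # drop the ragged tail
    return wave[:n].reshape(-1, factor).mean(axis=1)

# One second of a 440 Hz tone at 44.1 kHz
sr = 44_100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)

low = downsample(tone, 4)                   # 44.1 kHz -> 11.025 kHz
print(len(tone), len(low))                  # → 44100 11025
```

After downsampling by 4, a network that can attend to, say, 50,000 timesteps covers roughly four seconds of music instead of one, at the cost of high-frequency detail.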
The Dadabots project demonstrates how AI in music is evolving from an entertaining gimmick into a powerful creative tool, with the prospect of an endless yet original stream of music. The team has already generated a fusion of funk and progressive rock, enough material to fill 13 albums.