đŸĒ˜ Swipe Drum

Swiping Rolls with the help of ML Models

The goal for this project was to make a drum controller that allowed you to play without lifting your hand much. I also wanted to provide access to alternate techniques for hi-hat rolls and double kick hits, which would be hard to perform with just touch and your fingers without resorting to a beat repeat effect. A beat repeat effect would not be as flexible, since it is usually locked to a specific tempo and note division. My implementation allows for tempo variation because it responds to real-time sensor readings and gestures.

I first created a model training template using an ml.svm object. A radio button group and a gate object made it easy to switch between training classes and to switch into map mode. I then connected a mira.multitouch object so my iPad could send XY touch position data. Once the model template was tested and ready for use, I duplicated it a few times to set up the different controllers. I trained each model with gestures that I thought would make my alternate roll techniques flow smoothly into a snare rim hit, and I retrained each model a few times until the feel was just right. Lastly, I sent all audio from the drum sounds and piano chords to an FX plugin. The FX model was trained to blend smoothly between different effect types, much like drawing a curve. Then I created a few signifiers for the UI of the mira.frame.
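The train/map workflow above can be sketched outside of Max. The actual patch uses ml.svm (a support vector machine); this dependency-free Python sketch swaps in a simpler nearest-centroid classifier to illustrate the same idea, and all touch data and class labels here are hypothetical stand-ins for the recorded XY positions from mira.multitouch.

```python
import math

def train(examples):
    """Training phase: examples maps each class label to a list of (x, y)
    touch positions; returns one centroid per class."""
    centroids = {}
    for label, points in examples.items():
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        centroids[label] = (sum(xs) / len(xs), sum(ys) / len(ys))
    return centroids

def classify(centroids, point):
    """Map mode: return the class whose centroid is nearest the live touch."""
    return min(centroids, key=lambda label: math.dist(centroids[label], point))

# Hypothetical XY touch data, normalized to 0..1, one list per gesture class.
touches = {
    "hi-hat roll": [(0.10, 0.20), (0.15, 0.25), (0.12, 0.18)],
    "double kick": [(0.50, 0.50), (0.55, 0.60), (0.60, 0.55)],
    "snare rim":   [(0.90, 0.85), (0.85, 0.90), (0.92, 0.88)],
}
model = train(touches)
print(classify(model, (0.57, 0.55)))  # → double kick
```

Retraining in the patch corresponds to re-running `train` with new examples until the gesture regions feel right; a real SVM learns boundaries rather than centroids, which is what makes the transitions between gestures feel smoother.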
