Day 3 - Mapping gestures to sounds from demonstrations

Photograph from Corpus Nil by the artist Marco Donnarumma (IT/DE), who uses muscle-contraction sensors to generate sound. Link to the video: Corpus Nil

Table of contents

  1. Introduction
  2. Content of the repository
  3. Assignment
  4. Installation required
    1. Plugdata
    2. ml-lib
    3. Streaming data from your phone

Introduction

The third day of this class will show you:

  • A novel application of machine learning in culture and art: mapping gestures to sounds from demonstrations.
  • A novel graphical programming environment for developing interactive prototypes: Pure Data.

Your goal will be to train a machine learning model to generate sounds from gestures. You will collect the training data yourself, on the spot, using your smartphone's sensors.
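To make the idea concrete before you open Plugdata: "mapping by demonstration" means recording pairs of (sensor reading, sound parameter) and letting a model interpolate or look up the sound for new readings. The sketch below is a deliberately minimal 1-nearest-neighbour version in Python; the class name, feature values, and pitch numbers are all invented for illustration, and ml-lib provides far more capable models (SVMs, neural networks, DTW) behind the same add/train/map idea.

```python
import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class GestureMapper:
    """Toy 'mapping by demonstration': store demonstrations, then map
    new sensor readings to the sound of the closest demonstration."""

    def __init__(self):
        self.examples = []  # list of (features, sound_param) pairs

    def add(self, features, sound_param):
        """Record one demonstration: a gesture paired with a sound parameter."""
        self.examples.append((features, sound_param))

    def map(self, features):
        """Return the sound parameter of the nearest recorded demonstration."""
        return min(self.examples, key=lambda ex: distance(ex[0], features))[1]

mapper = GestureMapper()
# Two demonstrated gestures (hypothetical accelerometer triples):
mapper.add([0.0, 0.1, 9.8], 220.0)   # resting phone -> low tone (Hz)
mapper.add([5.2, 4.8, 12.0], 880.0)  # shaking phone -> high tone (Hz)

print(mapper.map([0.1, 0.0, 9.7]))  # near the "rest" demo -> 220.0
```

In ml-lib the same workflow appears as messages sent to an object: add examples, train the model, then map incoming sensor data.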

Please download the code and data from the GitHub repository and follow the instructions in A3_gesture_to_sound_mapping.

GitHub repository of the course

Content of the repository

The repository contains the following files and folders:

  • puredata_cheatsheet.pd: A patch with the main objects seen during the course and their keyboard shortcuts.
  • dub_siren_box: A practical and fun exercise to start applying the concepts seen before. The patch reimplements this hardware.
  • mapping_by_demonstration: Demonstration of how to map gestures to sound using the ml-lib library.

Assignment

Implement a musical instrument using Plugdata and ml-lib. The instrument can map any signal to any sound or sound effect you’d like. Then upload your files along with a short video of your instrument in action!

Installation required

Plugdata

Plugdata is a free/open-source visual programming environment based on Pure Data. It is available for a wide range of operating systems. Please install Plugdata for your operating system from the Plugdata website.

ml-lib

ml-lib is a library of machine learning externals for Max and Pure Data designed and developed by Ali Momeni and Jamie Bullock.

The goal of ml-lib is to provide a simple, consistent interface to a wide range of machine learning techniques in Max and Pure Data. The canonical NIME 2015 paper on ml-lib can be found here.

Full class documentation can be found here.

Please download the latest release of ml-lib for your operating system from the ml-lib website, then copy the files into Plugdata's Extra folder:

  • On Windows, copy the files into C:\Program Files\plugdata\Extra\ml.lib
  • On macOS, copy the files into Documents/plugdata/Extra/ml.lib

Streaming data from your phone

To stream sensor data from your phone to your computer via the OSC (Open Sound Control) protocol, you can use the following open-source applications:

  • On Android: Sensors2OSC can be installed via F-Droid
  • On iPhone: Data OSC can be installed via the App Store

Both applications are free, open source, and developed by independent developers; as a result, they may trigger security warnings on your phone during installation.
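If you want to see what these apps actually send over the network, the sketch below encodes and decodes a minimal OSC message using only the Python standard library: an address pattern, a type-tag string starting with a comma, and big-endian float32 arguments, with each text section null-padded to a multiple of four bytes. The address "/accelerometer/x" is an illustrative example, not a path guaranteed by any particular app, and this parser handles only float arguments.

```python
import struct

def _pad(b):
    """Null-terminate and pad OSC string data to a multiple of 4 bytes."""
    return b + b"\x00" * (4 - len(b) % 4)

def encode_osc(address, *floats):
    """Encode an OSC message carrying float32 arguments."""
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)  # OSC floats are big-endian float32
    return msg

def decode_osc(data):
    """Decode an OSC message with float32 arguments into (address, values)."""
    end = data.index(b"\x00")
    address = data[:end].decode("ascii")
    offset = (end // 4 + 1) * 4               # skip padding after the address
    tag_end = data.index(b"\x00", offset)
    tags = data[offset + 1:tag_end].decode("ascii")  # drop the leading ','
    offset = (tag_end // 4 + 1) * 4           # skip padding after the tags
    values = [struct.unpack_from(">f", data, offset + 4 * i)[0]
              for i, t in enumerate(tags) if t == "f"]
    return address, values

packet = encode_osc("/accelerometer/x", 9.81)
print(decode_osc(packet))  # address and a float close to 9.81
```

In practice you will not parse OSC by hand: Pure Data receives these packets directly (for example via its networking objects), and the apps above handle the sending side.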

