Controlling sample-based synthesis with expressive gestural input

In short, using Conductor Follower amounts to conducting a virtual orchestra. The system uses a Kinect (or any other OpenNI-compatible device) to capture hand movement. This data is analyzed, and the resulting information is used to process score data from a MIDI file; a synthesizer is then controlled with the modified MIDI data.
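
As a rough sketch of the gesture-analysis step (hypothetical types and logic, not the project's actual algorithm), one could detect a beat whenever the tracked hand reverses its vertical direction, then map the inter-beat interval to a tempo scale for re-timing the MIDI score:

```cpp
#include <deque>

// Hypothetical hand-tracking sample; the real project's data types differ.
struct HandSample { double x, y, z; double timeSeconds; };

// Detects a "beat" at the bottom of a conducting stroke, i.e. where
// the hand's height reaches a local minimum. A crude stand-in for
// real conducting-gesture analysis.
class BeatDetector {
public:
    bool feed(const HandSample& s) {
        history_.push_back(s);
        if (history_.size() < 3) return false;
        if (history_.size() > 64) history_.pop_front();
        const HandSample& prev2 = history_[history_.size() - 3];
        const HandSample& prev1 = history_[history_.size() - 2];
        // Local minimum in height: lower than both neighbours.
        return prev1.y < prev2.y && prev1.y < s.y;
    }
private:
    std::deque<HandSample> history_;
};

// Ratio for re-timing score events: how fast the conductor beats
// relative to the tempo written into the MIDI score.
double tempoScale(double conductedSecondsPerBeat, double scoreSecondsPerBeat) {
    return scoreSecondsPerBeat / conductedSecondsPerBeat;
}
```

The real system does considerably more than this (the thesis covers the actual tracking and score-following details), but the shape of the pipeline is the same: gesture samples in, timing and dynamics information out.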

The final outcome of the project is a VST plugin that produces a stream of MIDI events. The project was written in C++, using OpenNI for hand tracking and JUCE for the VST- and MIDI-related functionality. The Boost C++ libraries were also used heavily. The source code and documentation are available on GitHub.
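
On the MIDI side, the plugin's output can be pictured as a transform over JUCE MidiBuffer objects. The minimal sketch below scales note-on velocities by a gesture-derived dynamics value, which is one plausible way hand movement could shape the score playback; the function and its parameter are hypothetical, and it uses the current JUCE MidiBuffer iteration API rather than the 2012-era one:

```cpp
#include <juce_audio_basics/juce_audio_basics.h>

// Scale note-on velocities by "dynamics" (a hypothetical value in [0, 1]
// derived from hand movement) and copy all events to an output buffer.
juce::MidiBuffer applyDynamics(const juce::MidiBuffer& in, float dynamics)
{
    juce::MidiBuffer out;

    for (const auto metadata : in)
    {
        juce::MidiMessage msg = metadata.getMessage();

        if (msg.isNoteOn())
        {
            // Rebuild the note-on with a scaled velocity (0..1 range).
            msg = juce::MidiMessage::noteOn(msg.getChannel(),
                                            msg.getNoteNumber(),
                                            msg.getFloatVelocity() * dynamics);
        }

        out.addEvent(msg, metadata.samplePosition);
    }

    return out;
}
```

A transform like this would run inside the plugin's audio/MIDI processing callback, with the dynamics value updated continuously from the hand-tracking analysis.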

Demo video

The demo video below is a screen capture that includes the plugin's visualization. It shows the basic functionality as I conduct part of Mozart's Trio for piano, clarinet and viola in E-flat major, KV 498.

Acknowledgment

The project was implemented as part of my master's thesis and received funding from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007-2013) / ERC grant agreement no. [203636].

References

Sakari Bergen, "Conductor Follower: Controlling sample-based synthesis with expressive gestural input ", Master's Thesis, Aalto University, 2012

Sakari Bergen, thesis presentation, made with Prezi.
