This DEMO shows how music (or any type of sound) can be controlled with gestures on a standard web page. It is built with MediaPipe (https://google.github.io/mediapipe/), a library for tracking body movements from a video signal.
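
As a rough illustration of the tracking step, here is a minimal sketch of how MediaPipe's Pose solution can read body landmarks from a webcam in the browser. The element id "input_video" and the CDN path are assumptions for the example, not part of this DEMO's actual code.

```typescript
// Sketch: track body landmarks from the webcam with MediaPipe Pose.
// Assumes the @mediapipe/pose and @mediapipe/camera_utils packages and
// a <video id="input_video"> element on the page (the id is a placeholder).
import { Pose, Results } from "@mediapipe/pose";
import { Camera } from "@mediapipe/camera_utils";

const videoElement = document.getElementById("input_video") as HTMLVideoElement;

const pose = new Pose({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/pose/${file}`,
});
pose.setOptions({ modelComplexity: 1, smoothLandmarks: true });

// Each result contains up to 33 landmarks with normalized x, y, z coordinates.
pose.onResults((results: Results) => {
  if (results.poseLandmarks) {
    console.log(results.poseLandmarks[0]); // e.g. the nose: { x, y, z, visibility }
  }
});

// Feed webcam frames to the tracker.
const camera = new Camera(videoElement, {
  onFrame: async () => {
    await pose.send({ image: videoElement });
  },
  width: 640,
  height: 480,
});
camera.start();
```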

The X, Y, and Z locations of all tracked points are then fed into an adaptive mixer built with WebAudioXML (https://github.com/hanslindetorp/WebAudioXML). Both technologies are freely available for use on any web page. The inspiration for this DEMO comes from Handmate-MIDI.
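
To illustrate the mapping step, the sketch below forwards one tracked point's coordinates to the audio engine. The variable names and the waxml.setVariable() call are assumptions about how WebAudioXML is typically addressed from JavaScript, not the actual code of this DEMO.

```typescript
// Sketch: map tracked coordinates to mixer parameters in WebAudioXML.
// Assumes WebAudioXML is loaded on the page and exposes a global `waxml`
// object with a setVariable() method; variable names are placeholders.
import { Results } from "@mediapipe/pose";

declare const waxml: { setVariable(name: string, value: number): void };

// MediaPipe Pose landmark index 16 is the right wrist.
const RIGHT_WRIST = 16;

function onResults(results: Results): void {
  const landmarks = results.poseLandmarks;
  if (!landmarks) return;

  const wrist = landmarks[RIGHT_WRIST];
  // Normalized coordinates (0..1 for x/y) are handed to the mixer, where the
  // audio configuration can map them to gain, filtering, crossfades, etc.
  waxml.setVariable("rightHandX", wrist.x);
  waxml.setVariable("rightHandY", 1 - wrist.y); // invert so "up" means a higher value
  waxml.setVariable("rightHandZ", wrist.z);
}
```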

This application is part of a research project by Hans Lindetorp at the Royal College of Music and the Royal Institute of Technology in Stockholm. Please follow my blog at hans.arapoviclindetorp.se

Gesture Controlled Music - hans.lindetorp@kmh.se