New interfaces for musical expression

I have had a long-standing interest in developing new interfaces for musical expression since 1994, when I was first immersed in the world of software synthesis during an internship at CCRMA. My goal has always been to find ways to capture the power of sound synthesis on the computer in a form that is accessible yet powerful.

Interfacing with sensors

To build an instrument with a computer, you first need a way to get data from the human into the computer. This inevitably involves sensors, whether they are HIDs such as mice, joysticks, and gamepads; raw sensors such as accelerometers, FSRs, and range finders; or even video cameras and microphones.
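Whatever the sensor, the raw readings usually arrive as integers from an analog-to-digital converter and need scaling before they are useful as control data. A minimal sketch, assuming a 10-bit ADC (0-1023) as found on many hobbyist sensor boards; the range values are illustrative:

```python
def normalize(raw, lo=0, hi=1023):
    """Map a raw ADC reading into the 0.0-1.0 range, clipping out-of-range values."""
    x = (raw - lo) / (hi - lo)
    return min(1.0, max(0.0, x))

# An FSR reading of 512 on a 10-bit ADC becomes roughly half pressure.
pressure = normalize(512)
```

In practice the `lo` and `hi` bounds come from calibrating the actual sensor, since few sensors use their converter's full range.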


Once you have some input data and a synthesis method you want to control (be it audio, video, smell, touch, taste, whatever), you need to map that data onto your controls. Feedback always makes interaction with an instrument richer, and it needs to be mapped as well. I am working on encapsulating mapping ideas from the current literature into a self-contained library of mapping primitives. The aim is to turn mapping into an exercise in playing, trial and error, and logic, instead of the heavy math that is currently required.
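One way such mapping primitives might compose is as small functions chained together; the names and ranges below are illustrative sketches, not taken from any existing library:

```python
def scale(in_lo, in_hi, out_lo, out_hi):
    """Linearly map [in_lo, in_hi] onto [out_lo, out_hi]."""
    def f(x):
        t = (x - in_lo) / (in_hi - in_lo)
        return out_lo + t * (out_hi - out_lo)
    return f

def clip(lo, hi):
    """Constrain a value to [lo, hi]."""
    return lambda x: min(hi, max(lo, x))

def chain(*fns):
    """Compose primitives left to right into one mapping."""
    def f(x):
        for fn in fns:
            x = fn(x)
        return x
    return f

# Map a 0-127 slider onto a 220-880 Hz pitch range, clipped to safe bounds.
to_pitch = chain(scale(0, 127, 220.0, 880.0), clip(220.0, 880.0))
```

With primitives like these, changing a mapping becomes a matter of swapping or reordering small pieces and listening to the result, rather than rederiving the whole transfer function.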


Haptics for human-computer interfaces

I am using "haptic" joysticks and mice to experiment with haptic feedback in musical instruments. The joysticks can render forces, damping, and vibration; the mice can produce pulses and vibrations.
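Forces and damping of the kind these joysticks render are commonly modeled as a spring-damper: a restoring force proportional to displacement plus a damping force opposing velocity. A minimal sketch, assuming position and velocity have already been read from the device; the gain values are illustrative:

```python
def spring_damper_force(pos, vel, k=2.0, b=0.5):
    """Force pulling the stick toward center (pos = 0), with damping
    opposing the current velocity. k is spring stiffness, b is the
    damping coefficient."""
    return -k * pos - b * vel

# Displaced right and moving right: force pushes back toward center.
force = spring_damper_force(pos=0.3, vel=1.0)
```

Vibration effects are typically handled separately by the device firmware, so code like this only covers the continuous force and damping components.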

Related links
