Started in 1995, JoyStickMusicMachine was my college thesis: a NeXTSTEP MusicKit app for controlling sound in realtime with joysticks, all programmable via a GUI. Many versions of this same idea have been implemented since. I have recently picked up the idea again, this time building it from scratch in Pd and adding a mouse. Both the mouse and the joystick provide haptic feedback, enabling finer control of the music.
The source code for the old NeXTSTEP StickMusic is currently up on SourceForge.
In using computers to synthesize, edit, and process sound, I realized that computers have given us the power to create any sound we can imagine; in the process, one might even stumble upon a sound the like of which has never been heard before. This has opened a new realm of music. But that power comes at a price: working with sound on computers is usually not interactive in realtime, and is generally far from intuitive.

My aim was to create an instrument that harnesses the power of computer synthesis yet is controlled in a manner familiar to a wide variety of people. I decided joysticks would be a good interface because they are widely understood, simple in concept, and control more than one dimension with one hand.

To link the joysticks to the sound-producing algorithms within the computer, this program takes the output from the joystick devices (buttons, dimensions of the stick) and links it to Parameter objects, which in turn are linked to sound-producing Instrument objects. Each Parameter object takes its input from a single joystick dimension and sets up a MusicKit Parameter. The instruments also take input from WaveTables and Envelopes. My goal was to keep the architecture open enough to let people explore a wide range of possibilities within MusicKit Instruments.
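To make the mapping concrete, here is a minimal sketch of the idea in Python. The class and method names (`Parameter`, `Instrument`, `map`, `update`) are illustrative only, not the actual MusicKit API: each object scales one raw joystick dimension into a named synthesis parameter's range, and an instrument collects the current values of its parameters.

```python
class Parameter:
    """Maps one joystick dimension (raw value 0..raw_max) onto a
    named synthesis parameter range, e.g. a pitch in Hz.
    Illustrative sketch, not MusicKit's real Parameter class."""

    def __init__(self, name, lo, hi, raw_max=255):
        self.name = name
        self.lo = lo        # parameter value at the axis minimum
        self.hi = hi        # parameter value at the axis maximum
        self.raw_max = raw_max

    def map(self, raw):
        # Clamp the device value, then scale linearly into [lo, hi].
        raw = max(0, min(raw, self.raw_max))
        return self.lo + (self.hi - self.lo) * raw / self.raw_max


class Instrument:
    """Holds the current value of each of its parameters, standing in
    for a sound-producing Instrument object."""

    def __init__(self, params):
        self.params = {p.name: p for p in params}
        self.state = {p.name: p.lo for p in params}

    def update(self, axis_name, raw_value):
        # Called whenever the joystick reports a new value on one axis.
        self.state[axis_name] = self.params[axis_name].map(raw_value)


# Usage: the stick's x axis drives pitch, the y axis drives amplitude.
pitch = Parameter("pitch", 220.0, 880.0)
amp = Parameter("amp", 0.0, 1.0)
synth = Instrument([pitch, amp])
synth.update("pitch", 127)   # stick roughly centered
synth.update("amp", 255)     # stick pushed to the maximum
```

A real version would read the raw values from the joystick driver and feed `synth.state` into the synthesis engine on every control update; the point here is only the open linkage of device dimensions to parameters.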
$Id: index.html,v 1.6 2006/11/26 23:26:26 hans Exp $