The Wekinator: Software for using machine learning to build real-time interactive systems
This looks very cool!
I can imagine topic maps of sounds/gestures in a number of contexts that would be very interesting.
From the website:
The Wekinator is a free software package to facilitate rapid development of and experimentation with machine learning in live music performance and other real-time domains. The Wekinator allows users to build interactive systems by demonstrating human actions and computer responses, rather than by programming.
Example applications:
- Creation of new musical instruments
  - Create mappings between gesture and computer sounds. Control a drum machine with your webcam! Play Ableton with a Kinect!
- Creation of gesturally-controlled animations and games
  - Control interactive visual environments such as Processing or Quartz Composer, or game engines such as Unity, using gestures sensed from a webcam, Kinect, Arduino, etc.
- Creation of systems for gesture analysis and feedback
  - Build classifiers that detect which gesture a user is performing, then use the identified gesture to control the computer or to give the user feedback on how they're doing.
- Creation of real-time music information retrieval and audio analysis systems
  - Detect the instrument, genre, pitch, rhythm, etc. of audio coming into the mic, and use this to control computer audio, visuals, and more.
- Creation of other interactive systems in which the computer responds in real time to some action performed by a human user (or users)
  - Anything that can output OSC can serve as a controller, and anything that can be controlled by OSC can be controlled by the Wekinator.
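Since OSC is the glue on both sides, any program that can emit an OSC message can feed the Wekinator. As a rough sketch of what that involves, here is a minimal OSC sender using only the Python standard library; it assumes the Wekinator's commonly documented defaults (listening on UDP port 6448 for messages at the `/wek/inputs` address), which you should verify against your own setup:

```python
import socket
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message: address, type-tag string, float32 args."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    tags = "," + "f" * len(floats)          # e.g. ",ff" for two float args
    msg = pad(address.encode()) + pad(tags.encode())
    for f in floats:
        msg += struct.pack(">f", f)          # big-endian 32-bit float
    return msg

# Assumed Wekinator defaults: inputs on UDP port 6448 at /wek/inputs
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/wek/inputs", 0.25, 0.75), ("127.0.0.1", 6448))
```

In practice you would replace the two hard-coded floats with live sensor values (webcam features, accelerometer readings, etc.) sent at a steady rate; a library such as python-osc does the same encoding for you.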