Here is a nice video showing how Microsoft's Kinect can really play a role in music production:
A brief video showing Tension, an interactive spatial sound installation for multiple users. When a person enters the space, a generative sound is assigned to them. The sound pans around the 6-channel speaker system, following the user through the space.
Up to 5 users can use the installation at the same time. Each person modifies the other sounds based on their distance to the other users: the closer you are to other people, the more the tension in the sound increases.
This is the configuration of the Synapse Kinect hack with Ableton Live:
The right hand's Y axis (LFO) and Z axis (Tune) control a raw bass patch on the Virus TI, while the left hand drives an iZotope stutter plug-in combined with the right hand ..
Visuals by Quartz Composer via the Synapse hack ..
Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC, and also sends the depth image into Quartz Composer. In a way, this allows you to use your whole body as an instrument.
Check out this video for an explanation of how this is done: http://www.youtube.com/watch?v=teHCHsjxI00
This is an example performance of what you can make with the Synapse for Kinect tools combined with Ableton Live and Quartz Composer. You can have this running on your computer within minutes! Head to http://synapsekinect.tumblr.com to download.
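Synapse's OSC output can be consumed from almost any language, and the OSC wire format itself is simple enough to hand-roll with the standard library. Below is a minimal sketch of encoding and decoding a joint-position message; the `/righthand` address and float triple mirror Synapse's style, but treat the exact address scheme as an assumption, not documentation:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad an OSC string to a 4-byte boundary (at least one NUL)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message with float32 arguments."""
    tags = "," + "f" * len(floats)
    return (osc_pad(address.encode()) +
            osc_pad(tags.encode()) +
            b"".join(struct.pack(">f", v) for v in floats))

def parse_floats(packet: bytes):
    """Decode the address and float arguments back out of such a packet."""
    addr_end = packet.index(b"\x00")
    address = packet[:addr_end].decode()
    i = (addr_end // 4 + 1) * 4              # skip the padded address
    tag_end = packet.index(b"\x00", i)
    tags = packet[i + 1:tag_end].decode()    # drop the leading ','
    i = (tag_end // 4 + 1) * 4               # skip the padded typetags
    args = [struct.unpack(">f", packet[i + 4 * k: i + 4 * k + 4])[0]
            for k in range(len(tags))]
    return address, args
```

A real receiver would sit on a UDP socket and feed each datagram through `parse_floats` before routing the joint coordinates to synth parameters.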
Tim Thompson, caught at the West Coast Controllerism Championship, discusses his unique Kinect music controller, the Space Palette.
Thompson’s MultiMultiTouchTouch is built with these components:
- Microsoft Kinect (with power supply)
- KeyKit, Tim’s own programming language and graphical environment for MIDI
- Cinder Open Source SDK
- Open Source Computer Vision (OpenCV)
- Open Sound Control (OSC)
- Python, used to implement a GUI for parameter control
- HP Laptop running Windows 7
- A wood frame for calibration (the frame is not required once calibration is complete)
The raw output of this controller is OSC messages formatted using the TUIO (multitouch) standard format. Parameters of the software can be controlled with JSON-formatted messages.
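Since the controller emits TUIO over OSC and accepts JSON parameter messages, a receiving program mostly deals in two small data shapes. A hedged sketch of both: the `/tuio/2Dcur` "set" argument order follows the TUIO 1.1 specification, while the JSON field names are hypothetical, not Thompson's actual API:

```python
import json

# A TUIO 2Dcur "set" message carries: session id, normalized x and y,
# x/y velocity components, and motion acceleration (per TUIO 1.1).
tuio_set_args = ("set", 42, 0.31, 0.77, 0.0, 0.0, 0.0)

# Parameter control could then be a small JSON payload; these field
# names are invented for illustration only.
msg = json.dumps({"api": "set_param", "name": "attack", "value": 0.2})
decoded = json.loads(msg)
```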
You can get more info on Thompson and his unique controllers at his site.
In Rainlith, the primitive, naturally granular sound of a big rainstick is explored in real time with cyber-age sound-manipulation tools.
It’s an interactive piece in which the movement of the audience’s body activates an electric motor, producing a reflex movement in the structure that holds the instrument.
The sound of the rainstick is captured and processed in real time, then sent 24 meters up, filling the empty space of an old industrial cereal container. The reverberated acoustic mix is received back by the audience at the spot right below the container’s opening.
Components:
- H-bridge (handmade)
- 24 V 6 A DC motor
- FM transmitter/receiver
- ION iPA3 portable speaker
- Max for Live
This video is a must-see: a really cool performance using Kinect technology.
It extends the acoustic vibraphone by embedding 3D gesture recognition using the Kinect. A Kinect-to-Max/MSP chain (Kinect data stream → OSC → MIDI) controls filter parameters in Ableton Live, modulating the acoustic audio input from the vibraphone in real time. Developed by Gabrielle Odowichuk and Shawn Trail @ MISTIC.
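The OSC-to-MIDI leg of that chain reduces to scaling a joint coordinate into a 7-bit controller value and wrapping it in a Control Change message. A minimal sketch, assuming normalized 0–1 joint coordinates and an arbitrary CC number (the actual mapping used at MISTIC isn't published in the clip):

```python
def joint_to_cc(y: float, y_min: float = 0.0, y_max: float = 1.0) -> int:
    """Clamp and scale a joint coordinate into a 0-127 MIDI CC value."""
    t = (y - y_min) / (y_max - y_min)
    return round(127 * min(1.0, max(0.0, t)))

def cc_bytes(channel: int, controller: int, value: int) -> bytes:
    """Raw 3-byte MIDI Control Change message (status 0xB0 + channel)."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])
```

Each tracked frame would call `joint_to_cc` on, say, the mallet hand's height and send the resulting `cc_bytes` to a virtual MIDI port that Ableton Live listens on.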
Kinect for Xbox 360, or simply Kinect (originally known by the code name Project Natal), is a “controller-free gaming and entertainment experience” by Microsoft for the Xbox 360 video game platform. Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands. The project is aimed at broadening the Xbox 360’s audience beyond its typical gamer base. Kinect competes with the Wii Remote Plus and PlayStation Move and PlayStation Eye motion control systems for the Wii and PlayStation 3 home consoles, respectively.
In this video:
Matt Davis hacks a Kinect using OpenNI & Max/MSP. With it mapped to Ableton Live and Henry Strange’s MIDI-to-DMX laser control system, Matt demonstrates this fun A/V control system.
A Reaktor Ensemble for visualizing Kinect skeletal data available at geometricmusic.co.uk. OSCKinnection feeds the skeletal data into Reaktor using OSC. The yellow orb is controlled by the two hands and the orb position controls a synthesizer. All joint positions can be used as synth inputs.
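One plausible reading of "the orb is controlled by the two hands" is an orb that chases the midpoint of the two hand positions; a sketch of that behavior, with the smoothing factor as an assumption (the Ensemble's actual math isn't shown in the video):

```python
def orb_position(left, right, prev, smooth=0.8):
    """Move the orb toward the midpoint of the two hand positions,
    with simple one-pole smoothing to tame skeletal-tracking jitter."""
    target = [(l + r) / 2 for l, r in zip(left, right)]
    return [smooth * p + (1 - smooth) * t for p, t in zip(prev, target)]
```

The smoothed coordinates would then be the values patched into the synthesizer's inputs.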
At our last hacKinect Wednesdays gathering, we worked on the physical side of the installation. We bought a bunch of different types of solenoids (push, pull, big, small) to see how they work and which ones would best suit our installation.
We also wanted to see a prototype of the whole system in action. The system goes Kinect > KinectCoreVision > Max/MSP > Arduino > Solenoids.
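The last hop of that chain, Max/MSP to Arduino, typically means framing a small serial command that the Arduino sketch decodes before pulsing a solenoid pin. The byte layout below is purely hypothetical, not the soundplusdesign protocol: one start byte, a solenoid index, and a pulse length split into two 7-bit bytes, MIDI-style, so no payload byte collides with the start byte:

```python
def solenoid_command(index: int, ms: int) -> bytes:
    """Frame a hypothetical 'fire solenoid' command for the Arduino."""
    assert 0 <= index < 16 and 0 <= ms < 16384
    return bytes([0xF0, index, ms & 0x7F, (ms >> 7) & 0x7F])

# In the real rig this would go over a serial port, e.g. with pyserial:
#   ser.write(solenoid_command(3, 200))
```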
We still have a long way to go, but it’s good to see the prototype in action.
For more info, please visit soundplusdesign.com
Via Andrew Spitz
A short video of a telekinetic conductor: hand positions and orientation are tracked with a Kinect and two Wiimotes.
The conductor controls multiple orbs in 2D space, using hand movements to attract the different orbs; with this virtual telekinetic force, the user can steer them around. The positions of the orbs are then converted into MIDI signals that can control a synthesizer. In the attached video, a MIDI track is played into NI’s Massive synth and all 8 dials on the synthesizer are controlled at once; with different controllers it is quite possible to control up to 20 parameters simultaneously.
Being able to control hundreds of parameters at once is all well and good; however, this device adds something extra. When someone uses it, they automatically find themselves dancing or conducting (depending on the music) in order to control the sound properly; in fact, if you don’t dance, you don’t get the right timing.
The functionality is not limited to one machine either; each component connects to a server via TCP/IP so collaboration / jamming can take place over the internet.
Check out this fascinating video:
Steerable AutoStereo 3-D Display: We use a special, flat optical lens (Wedge) behind an LCD monitor to direct a narrow beam of light into each of a viewer’s eyes. By using a Kinect head tracker, the prototype tracks the user’s position relative to the display and steers that narrow beam accordingly. The combination creates a 3-D image that is steered to the viewer, with no need for glasses or holding your head in place.
Steerable Multiview Display: The same optical system used in the 3-D system, Wedge behind an LCD, is used to steer two separate images to two separate people rather than two separate eyes, as in the 3-D case. Using a Kinect head tracker, we find and track multiple viewers and send each viewer his or her own unique image. Therefore, two people can be looking at the same display but see two completely different images. If the two users switch positions, the same image is continuously steered toward each of them.
Retro-Reflective Air-Gesture Display: Sometimes, it’s better to control with gestures than buttons. Using a retro-reflective screen and a camera close to the projector makes all objects cast a shadow, regardless of their color. This makes it easy to apply computer-vision algorithms to sense above-screen gestures that can be used for control, navigation, and many other applications.
A display that can see: Using the flat Wedge optic in camera mode behind a special, transparent organic-light-emitting-diode display, we can capture images that are both on and above the display. This enables touch and above-screen gesture interfaces, as well as telepresence applications.
Kinect-based Virtual Window
Using Kinect, we track a user’s position relative to a 3D display to create the illusion of looking through a window. This view-dependent rendering technique is used in both the Wedge 3D and multi-view demos, but the effect is much more apparent in this demo. The user should quickly realize the need for a multi-view display, as the illusion only holds for one user on a conventional display. This technique, along with the Wedge 3D output and 3D input techniques we are developing at Microsoft, provides the basic building blocks for the ultimate telepresence display. This Magic Window is a bi-directional light-field interactive display that gives multiple users in a telepresence session the illusion that they are interacting and talking with each other through a simple glass window.
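View-dependent rendering of this kind usually boils down to rebuilding an asymmetric (off-axis) projection frustum from the tracked head position every frame, treating the physical screen as the zero-parallax plane. A minimal sketch, with the eye position expressed in the screen's own units relative to its center (an assumption about coordinate conventions, not Microsoft's implementation):

```python
def off_axis_frustum(eye, half_w, half_h, near):
    """Frustum bounds at the near plane for an eye at (x, y, dist)
    relative to the screen center. As the head moves, the frustum
    skews so the on-screen image stays glued to the 'window'."""
    x, y, dist = eye
    s = near / dist                 # similar-triangles scale factor
    left = (-half_w - x) * s
    right = (half_w - x) * s
    bottom = (-half_h - y) * s
    top = (half_h - y) * s
    return left, right, bottom, top
```

A centered head yields the familiar symmetric frustum; moving the head right shrinks the right bound and widens the left one, which is exactly the skew that sells the window illusion.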