Custom software enables gestural control using a Kinect and a PC. Hand movements along the X and Y axes are translated into MIDI signals that generate control voltages, allowing two-dimensional morphing of waveforms in a “Morphing Terrarium” module, processed through a Moog modular. Recorded in the superterranean lair of the Robotmakers.
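The X/Y-to-MIDI mapping described above can be sketched in a few lines. This is a hypothetical illustration, not the custom software itself — the controller numbers (1 and 2) and the 0.0–1.0 coordinate convention are assumptions:

```python
def hand_to_cc(x, y, channel=0, cc_x=1, cc_y=2):
    """Map normalized hand coordinates (0.0-1.0) to raw MIDI Control Change bytes.

    cc_x/cc_y are hypothetical controller numbers; downstream, a CV interface
    would turn these CC values into control voltages for the morphing module.
    """
    def to_7bit(v):
        # Clamp to the 7-bit range MIDI allows for CC values.
        return max(0, min(127, int(round(v * 127))))

    status = 0xB0 | (channel & 0x0F)  # Control Change status byte
    return bytes([status, cc_x, to_7bit(x)]), bytes([status, cc_y, to_7bit(y)])
```

For example, a hand at the far left and top of the tracked region (x=0.0, y=1.0) yields CC values 0 and 127.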
Mark Mosher explores Space Palette, details below:
Tim Thompson flew out from San Jose to present his Kinect-based casual instrument, the Space Palette, at this week’s Boulder Synthesizer Meetup (meetup.com/The-Boulder-Synthesizer-Meetup) on Nov 13th, 2012.
dependent parts and taught me how to launch, calibrate and customize it. After he flew home, I tore the system down, put it all back together, and got it running on my own :^). I then swapped out the audio instruments behind the scenes with Absynth patches for my first solo performance.
I shot this video while holding the camera behind the frame as I played. All audio and visuals are interactive and in response to the Kinect tracking my multi-touch movement (X,Y, Depth) through the various windows in the frame.
“Another simple experiment…
I use Synapse to control devices in Ableton Live, but I wanted to control other MIDI hardware from Live, like my Boss GT-8 guitar processor. In this example, the Y and Z positions of my right hand change the cutoff frequency of the VC highpass and lowpass filters of my iMS20 respectively. The messages are routed with Max for Live devices (to avoid OSCulator and to have automation within Live) and sent via MIDI over WiFi.”
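A hedged sketch of the kind of routing the quote describes — turning a Synapse-style hand-position message into two filter-cutoff CCs. The OSC address, the CC numbers (74 for the highpass, 71 for the lowpass), and the normalized axis ranges are assumptions for illustration, not the author’s actual Max for Live patch:

```python
def route_hand_to_filters(address, x, y, z, channel=0):
    """Turn a Synapse-style joint position into filter-cutoff CC messages.

    Hypothetical mapping: Y drives the highpass cutoff (CC 74 here) and
    Z drives the lowpass cutoff (CC 71 here). Coordinates are assumed
    pre-normalized to 0.0-1.0.
    """
    if address != "/righthand":  # only route the joint we care about
        return []
    status = 0xB0 | (channel & 0x0F)  # Control Change status byte
    def cc(v):
        return max(0, min(127, int(v * 127)))
    return [bytes([status, 74, cc(y)]), bytes([status, 71, cc(z)])]
```

Messages for other joints fall through untouched, which is roughly what an address filter in a Max for Live device does.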
Variations II of Variations II: May 14–18, 2012
Installation at Integrated Media Studio, Valencia, CA
Variations II of Variations II is a kinetic sculpture inspired by John Cage’s Variations series. Cage’s Variations II is a graphical composition that generates musical events using measurements of distance between dots and lines on a piece of paper.
In my realization of Variations II, the instructions for the piece determine the behavior of rotating panels and images synchronized to be projected onto the sculpture. Motors drive the rotation of the panels, and are used as a sound source for the audio portion of this piece.
My next goal for this project is a live interactive audiovisual performance. The performers will function as a variable in a feedback loop between performer, sculpture and score.
work in progress
C/C++, Arduino, Max/MSP/Jitter, stepper motors
A performance by Tim Thompson on the Space Palette at STEIM (see steim.org) in Amsterdam. This three-dimensional instrument uses the Microsoft Kinect to translate hand motion directly into both music and visuals. It’s like having four 3D mousepads in mid-air. See http://spacepalette.com for more information. Thanks go to Vivian Wenli Lin for the recording.
The Space Palette is a Kinect-based instrument used to perform music and graphics simultaneously, controlled directly by your hands. The picture above shows the newest version, which you can see in action in this video of an open house performance on the Space Palette. The new version also appeared at the Sea of Dreams 2012 New Year’s Eve party in San Francisco. At that event it was only doing graphics, because the venue was too loud for anything else. The previous (rectangular) version of the Space Palette can be seen in this video of people playing with it at Burning Man 2011. Search YouTube for “Space Palette” to find other videos.
Source code and a Windows executable for the core bit of software (which talks to the Microsoft Kinect, recognizes your hands within an arbitrary flat surface with holes, and sends out TUIO/OSC messages that can drive whatever you want to control) are available here. If you’re interested in a more recent version of that software, send email to me at timthompson.com.
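For a sense of what those TUIO/OSC messages look like on the wire, here is a minimal OSC encoder. It follows the OSC 1.0 binary format; the `/tuio/2Dcur` address and float arguments mirror a TUIO cursor update, though the Space Palette software’s exact messages may differ:

```python
import struct

def osc_message(address, *floats):
    """Encode an OSC message with float32 arguments (OSC 1.0 binary format)."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    packet = pad(address.encode("ascii"))                      # address pattern
    packet += pad(("," + "f" * len(floats)).encode("ascii"))   # type tag string
    for f in floats:
        packet += struct.pack(">f", f)                         # big-endian float32
    return packet
```

A message like `osc_message("/tuio/2Dcur", 0.25, 0.5)` packs into 24 bytes: a padded address, a `,ff` type tag, and two float32 arguments.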
In this video I briefly explain and demonstrate how I used the Kinect to control the massive Melbourne Town Hall Organ. It contains a short excerpt from our performance “Carpe Zythum” November 2011.
My project blog: http://chrisvik.wordpress.com
I’ve created my own software, “Kinectar” (http://kinectar.org), which allows the use of the Kinect to control MIDI devices, i.e. playing notes through simple gestures and motion. The Melbourne Town Hall Organ got a refurb in the late ’90s, adding the ability for MIDI messages to activate the notes… and so, this happened.
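To illustrate the kind of gesture-to-note mapping a tool like Kinectar performs (a generic sketch, not Kinectar’s actual code — the major scale, the middle-C base note, and the two-octave range are assumptions):

```python
def height_to_note(y, scale=(0, 2, 4, 5, 7, 9, 11), base=60):
    """Quantize a normalized hand height (0.0-1.0) to a MIDI note in a scale.

    Covers two octaves of the given scale starting at `base` (middle C).
    Snapping free-air gestures to a scale keeps them sounding musical
    despite the lack of tactile feedback.
    """
    steps = len(scale) * 2
    step = min(int(y * steps), steps - 1)
    octave, degree = divmod(step, len(scale))
    return base + 12 * octave + scale[degree]
```

A hand at the bottom of the tracked range plays middle C (60); at the top, the B two octaves up (83).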
The company that provided the control system allowing the organ to be played via MIDI is “Solid State Organ System”, and can be found here:
Co-composition and Kinect performance and programming by Chris Vik
Co-composition, lyrics and vocals by Elise Richards
Video produced by Unkle Nicnac Films (unklenicnac.com)
A kinetic sound installation investigating the perception of sound and space. Tessel is composed of a suspended, articulated topography measuring 4 × 2 m, subdivided into forty triangles. Twelve of them are fitted with motors and eight are equipped with audio transducers, which transform the surface into a dynamic sonic space. A dialogue between space and sound is created through this sculptural “choreography”. Our perception is altered as the surface slowly modifies its shape.
From Tinguely’s poetic machines to Alexander Calder’s mobiles or Buckminster Fuller’s synergetics, Tessel combines influences that question the link between geometry, movement and chaos, thus continuing the quest for beauty in the synesthetic perception of sound and spatial phenomena. Its name is derived from “tessellation”, a term applied to the geometric subdivision of a surface into plane figures, also known as “tiling”. It also describes a software technique that allows calculation of renderings through the subdivision of surfaces into polygons. The term has its origin in the Latin word “tessella”, describing the square tiles used to make mosaics.
Tessel is a collaboration between French composer and artist David Letellier, and LAb[au], Belgian electronic arts studio. Tessel is a co-production of the galleries MediaRuimte (Brussels) and Roger Tator (Lyon), realised with the financial support of Arcadi, Dicream and the Commission des Arts Numériques de la Communauté Française de Belgique.
Johannes Kreidler, a musician and artist known for inventive experiments in music, shares his studies for the Kinect, which he terms “conceptual music.” A solo “for violin” can involve literally waving a violin around. “House music” can mean making music whilst ironing a shirt. Any gesture in space becomes musical. Without tangible feedback, that can be challenging, and since these are just gestures in air, precision and nuance may not be a strong suit.
And here is another interesting Kinect video:
Experiments in balloon motion and sound using an MS Kinect depth sensing camera.
Created for the Carnegie Mellon 1st & 2nd year MFA Graduate show entitled “Fresh Baked Goods” at Bakery Square, April 2011.
A machine stands in a room surrounded by balloons. Circulating fans blow the balloons over the machine, which creates sound based on their movements.
Here is a nice video showing how Microsoft’s Kinect can really play a role in music production:
A brief video showing Tension, an interactive spatial sound installation for multiple users. A person enters the space and a generative sound is assigned to them. The sound pans around the 6-channel speaker system, following the user through the space.
Up to five people can use the installation at the same time. Each person modifies the other users’ sounds based on their distance from one another: the closer you are to other people, the more the tension in the sound increases.
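A hypothetical sketch of the distance-to-tension idea (not the installation’s actual algorithm — the 3-metre radius and the linear falloff are assumptions):

```python
import math

def tension(user, others, radius=3.0):
    """Sum proximity contributions from other users into a 0.0-1.0 tension value.

    Each neighbour contributes nothing beyond `radius` metres and up to 1.0
    when co-located; the total is clipped so it can directly drive a sound
    parameter such as dissonance or filter resonance.
    """
    total = sum(max(0.0, 1.0 - math.dist(user, o) / radius) for o in others)
    return min(total, 1.0)
```

Two people standing 1.5 m apart would each hear a tension of 0.5 under these assumed parameters.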
This is the configuration: the Synapse Kinect hack controlling Ableton Live.
The right hand’s Y position drives an LFO and its Z position the tune of a raw bass patch on the Virus TI, combined with an iZotope stutter plug-in on the left hand ..
Visuals via the Synapse hack in Quartz Composer ..
Synapse is an app for Mac and Windows that allows you to easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC, and also sends the depth image into Quartz Composer. In a way, this allows you to use your whole body as an instrument.
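Synapse’s “hit events” amount to detecting fast joint movement. A toy version of that idea (the threshold and the per-frame displacement test are assumptions, not Synapse’s actual detector):

```python
def detect_hits(positions, threshold=0.15):
    """Flag frames where a joint coordinate jumps more than `threshold`.

    `positions` is one coordinate of a tracked joint sampled once per frame;
    a True entry marks a candidate "hit" event between two frames, which an
    app like Ableton Live could treat as a drum trigger.
    """
    return [abs(cur - prev) > threshold
            for prev, cur in zip(positions, positions[1:])]
```

A slow drift stays below the threshold, while a sudden punch between two frames registers as a hit.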
Check out this video for an explanation of how this is done: http://www.youtube.com/watch?v=teHCHsjxI00
This is an example performance of what you can make with the Synapse for Kinect tools combined with Ableton Live and Quartz Composer. You can have this running on your computer within minutes! Head to http://synapsekinect.tumblr.com to download.