Cool take on the Kinect
You draw buttons, and they become keys you can play. See the impromptu sound board video for a demonstration of drawing buttons.
Using the Kinect, I can create buttons just by drawing enclosed shapes on a piece of paper! The Kinect identifies those regions as buttons that can be activated when I press them with my finger. This program uses ROS, the OpenNI Kinect drivers, and software I wrote here at MIT. You can find the code in the mit-ros-pkg repository.
Slightly off topic, but since many of you enjoy sci-fi as much as we do
The holographic device plays a 3-inch projection at 15 frames per second, just shy of movie refresh rates of 24 to 30 frames per second, the MIT researchers demonstrated at the Society of Photo-Optical Instrumentation Engineers’ conference on practical holography.
The red hologram is jerkier and has much lower resolution than the one in Star Wars that sparked the public fascination with 3-D holograms in the 1970s. In fact, it kind of looks like a red blob on a staticky TV. But it’s 30 times faster than a telepresence device created in 2010 by University of Arizona researchers.
“I think it’s an important milestone because they were able to get to 15 frames per second, which is almost real time,” says physicist Nasser Peyghambarian, who led the Arizona research. “The quality is not as high, but hopefully it will get better in the future.”
The key to speed was computational power. The MIT team used a Kinect camera from an Xbox 360 gaming console to capture light from a moving object. Then they relayed the data over the Internet to a PC with three graphics processing units, or GPUs, tiny processors found in computers, cell phones, and video games that render video quickly. The processors compute how light waves interfere with each other to form patterns of light and dark fringes. Light bouncing off these fringe patterns reconstructs the original image. The MIT team used a display to illuminate the computer-generated fringes and create a hologram.
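The fringe computation described above can be sketched in a few lines. This is a minimal, illustrative NumPy example (not the MIT team's code): it sums the complex fields of two point sources across a 1-D "display" line and takes the intensity, which produces exactly the kind of light-and-dark fringe pattern a holographic display would then illuminate.

```python
import numpy as np

def fringe_pattern(x, sources, wavelength=633e-9):
    """Sum spherical-wave fields from point sources at (x0, z0) metres,
    then take intensity |E|^2 to get bright/dark interference fringes."""
    k = 2 * np.pi / wavelength            # wavenumber
    field = np.zeros_like(x, dtype=complex)
    for x0, z0 in sources:
        r = np.sqrt((x - x0) ** 2 + z0 ** 2)   # distance source -> display pixel
        field += np.exp(1j * k * r) / r        # spherical wave contribution
    return np.abs(field) ** 2                  # interference intensity

# A 10 mm wide 1-D display; two sources 1 mm apart, 100 mm behind it
x = np.linspace(-0.005, 0.005, 2048)
pattern = fringe_pattern(x, [(-0.0005, 0.1), (0.0005, 0.1)])
```

On a GPU the same per-pixel sum is computed in parallel for millions of display pixels and many scene points, which is why the MIT pipeline leans on three GPUs rather than the CPU.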
“The students were able to figure out how to generate holograms by using what GPU chips are good at,” says Michael Bove, an MIT engineer who led the research. “And they get faster every year. There’s room for a lot more understanding of how to compute holograms on them.”
For those of you who are curious about the name:
Translating Feline Behavior Into Sound!
Alistair keeps feeding us nice updates on the Kinect WiiMote solution
After battling against slow laptops and extreme latency, I’ve started turning this demo into a cluster system with different operations farmed out to different machines. It could be the only audio application that has had its latency reduced by sending MIDI messages over TCP/IP.
The demo has only one of the instruments connected, the green bar at the side. The little horizontal lines across the bar are the lower stave (bass clef) and can activate MIDI notes. The other box does nothing; instead, a WiiMote is used to tune synth parameters.
Ableton Live is being used as the output device. It’s not quite the right thing for this; a software synth would be preferable.
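Sending MIDI over TCP/IP, as Alistair describes, comes down to pushing raw three-byte MIDI messages through a socket to another machine in the cluster. Here is a hedged sketch of the idea (hostnames and ports are invented; real setups often use protocols like RTP-MIDI or ipMIDI rather than raw sockets):

```python
import socket

def send_note_on(sock, note, velocity, channel=0):
    """Send a raw 3-byte MIDI Note On message over an open TCP socket."""
    status = 0x90 | (channel & 0x0F)   # 0x90 = Note On; low nibble = channel
    sock.sendall(bytes([status, note & 0x7F, velocity & 0x7F]))

# Usage (hypothetical host/port for a synth machine in the cluster):
# sock = socket.create_connection(("synth-node.local", 5004))
# send_note_on(sock, 60, 100)   # middle C, velocity 100
```

Because MIDI messages are tiny, network round-trips can genuinely beat a bogged-down local audio stack, which is presumably what makes the cluster approach reduce latency here.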
PS. Alistair also tells me this:
Just cracked getting the WiiMote, Kinect, and Ableton to co-operate in making music by using MIDI over IP. I’ll be releasing an open-source version in February so people can collaborate on making music over the net.
The Kinect has now been hacked to let a user play a virtual five-piece drum kit, as well as dance to create music using synthesizers. The software, New Wave Instrument Fabric, allows a user to connect both the Kinect and WiiMote to various MIDI devices and control up to 19 MIDI control channels simultaneously by dancing.
Xbox Kinect hooked up to a PC doing dubstep music, controlling a MIDI software synth with the right hand and a hardware synth + FX processor with the left hand. This has to be the coolest thing I have ever touched, and that includes the last motorbike I came off. To play it you have to dance; otherwise you lose sync with the oscillators and beats. I can’t believe how #’@@***ing sick this is to play with!!!!!!!!
Shared by AlistairDBClarkson
Just a quick and dirty demo of Shinect 0.3 alpha on FL Studio.
Shinect is a MIDI controller that uses the Kinect. It’s based on the OpenNI library and written in C++.
You can use three axes on each hand to send MIDI data, plus virtual MIDI pads that you can display on top of the video. It’s also multi-user (two or more people in front of the camera).
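Mapping three axes per hand to MIDI, as Shinect does, boils down to scaling each tracked coordinate into the 0–127 range of a control-change message. A minimal sketch, assuming the coordinate ranges and CC numbers below (they are my illustration, not Shinect’s actual mapping):

```python
def axis_to_cc(value, lo, hi):
    """Scale a tracked hand coordinate in [lo, hi] to a MIDI CC value 0-127."""
    t = (value - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)        # clamp positions outside the range
    return int(round(t * 127))

def hands_to_cc_messages(left, right, channel=0):
    """Turn two (x, y, z) hand positions into six CC messages on CC 20-25.
    Assumed ranges: x in [0, 640] px, y in [0, 480] px, z in [500, 3000] mm."""
    ranges = [(0, 640), (0, 480), (500, 3000)]
    msgs, cc = [], 20
    for hand in (left, right):
        for coord, (lo, hi) in zip(hand, ranges):
            msgs.append((0xB0 | channel, cc, axis_to_cc(coord, lo, hi)))
            cc += 1
    return msgs
```

Each tuple is a complete Control Change message (status byte 0xB0, controller number, value), ready to hand to any MIDI output.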
The Kinect is really getting hot; here’s just another example of this
Demoing placing the keyboard wherever you like.
Also, attempting to play a duet (inspired by the movie “Big” – http://www.youtube.com/watch?v=rKrZid…
Made possible by libfreenect (http://openkinect.org) and coded in python
Works at 30fps with no lag!
British artist and designer Chris O’Shea created this Kinect Air Guitar prototype.
O’Shea explains how the Kinect Air Guitar works:
Written in C++ using openFrameworks and OpenCV for image processing, with the ofxKinect addon and the libfreenect driver on Mac. Thank you to the openFrameworks and OpenKinect communities for enabling this to happen. A big thank-you to Microsoft for bringing this technology to the mass market.
How it works
First it thresholds the scene to find a person, then uses a histogram to get the most likely depth of a person in the scene. Then any pixels closer than the person to the camera are possible hands. It also uses contour extremity finding on the person blob to look for hands in situations where your hand is at the same depth as your body. It only works if you are facing the camera front on. Then it uses one hand as the neck of the guitar, drawing a virtual line from the neck through the person centroid to create the guitar line. The other hand is tracked to see if it passes through this line, strumming the guitar. The neck hand position controls the chord.
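The pipeline above can be sketched roughly as follows. This is a simplification under stated assumptions, not O’Shea’s code: the depth image is a NumPy array where smaller values mean closer, and the geometry helpers are my own names.

```python
import numpy as np

def find_person_depth(depth, background_thresh=200):
    """Histogram the foreground depths; the tallest bin is the most
    likely depth of the person, as the write-up describes."""
    person = depth[depth < background_thresh]      # threshold away the background
    hist, edges = np.histogram(person, bins=32)
    peak = np.argmax(hist)
    return (edges[peak] + edges[peak + 1]) / 2     # centre of the tallest bin

def hand_mask(depth, person_depth, margin=15):
    """Any pixel clearly closer to the camera than the body is a candidate hand."""
    return depth < (person_depth - margin)

def strummed(hand, neck, centroid, tolerance=10.0):
    """The guitar line runs from the neck hand through the body centroid;
    the other hand 'strums' when it comes within `tolerance` pixels of it."""
    nx, ny = neck
    cx, cy = centroid
    hx, hy = hand
    dx, dy = cx - nx, cy - ny
    # perpendicular distance from the hand to the neck -> centroid line
    dist = abs(dy * (hx - nx) - dx * (hy - ny)) / max((dx**2 + dy**2) ** 0.5, 1e-9)
    return dist < tolerance
```

The contour-extremity fallback for hands at body depth is omitted here; it would scan the person blob’s outline for points farthest from the centroid.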
The Therenect is a virtual Theremin for the Kinect controller. It defines two virtual antenna points, which allow controlling the pitch and volume of a simple oscillator. The distance to these points can be controlled by freely moving the hand in three dimensions or by reshaping the hand, which should allow gestures that are quite similar to playing an actual Theremin.
This musical instrument demo has been developed by Martin Kaltenbrunner at the Interface Culture Lab at the University of Art and Industrial Design in Linz, Austria. The software has been developed using the Open Frameworks and OpenKinect libraries.
Remarks: Talking to a professional thereminist revealed that some significant improvements are still needed to simulate the exact behavior of an actual Theremin. These include exponential (power-of-two) scaling of the frequency control and a smaller range for the amplitude control. We will soon post an updated version that also synthesizes the original Theremin sound.
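The power-of-two frequency scaling the thereminist asked for corresponds to an exponential pitch mapping: each fixed step of the hand toward the pitch antenna raises the note by a fixed musical interval. A hedged sketch of what such a mapping might look like (the ranges and base frequency are illustrative, not Therenect’s actual values):

```python
def pitch_from_distance(d, d_max=0.6, f_min=65.0, octaves=5):
    """Power-of-two pitch control: frequency rises exponentially as the
    hand (distance d, in metres) nears the virtual pitch antenna."""
    d = min(max(d, 0.0), d_max)                    # clamp to the control range
    return f_min * 2 ** (octaves * (1 - d / d_max))

def volume_from_distance(d, d_max=0.25):
    """The amplitude antenna uses a deliberately smaller range: silent at
    the antenna, full volume at d_max and beyond (linear for simplicity)."""
    return min(max(d / d_max, 0.0), 1.0)
```

With a linear mapping instead, most of the playable range would be crammed into the lowest octave, which is why the exponential curve matters for a convincing Theremin feel.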