Anaglyph 3D (red and cyan) glasses needed. The 3D effect appears to work better further back from the screen. Audio and visuals recorded live at Westside Welding and Machine on September 8th, 2011, using primarily the LZX Visionary video synthesizer system. All sounds and visuals were recorded live in real time, with no editing and no pre-recorded material. Please visit http://brizbomb.com/
And here are the details:
PHYSYNTH turns your iPad or iPad 2 into a piece of real vintage synth hardware.
Powered by next-generation 3D graphics technologies, it is a stunning, beautiful device that will enable you to weave beautiful, fluid ‘Soundscapes’.
PHYSYNTH uses a state-of-the-art physics engine to trigger sounds: using four real simulators, you charge physical objects with sound and collide them with other objects to trigger them. It is an entirely new way of creating music, a natural and fluid way to express yourself with a wide range of beautiful, realistic instruments.
• Custom shaders designed especially for the iPad 2 and a beautiful, high-definition, full-3D interface push the limits of what your device can do.
• iPad 1 owners are not left out: Physynth was designed to take full advantage of everything the device has to offer.
• Layer your sounds with Four-track Soundscaping.
• Jam with Realtime melody or rhythm over-dubbing.
• The physics-triggered sample engine with user-adjustable parameters means endless scenarios.
• Enjoy a wide range of beautiful sounding instruments with more coming in regular updates.
• Express yourself with full mixing control including full stereo panning, volume and digital special effects.
• Vintage hardware design with groundbreaking 3D tilt camera, stunning lighting and unbelievable next-gen graphics make Physynth the app to show your friends.
• Melody mode lets you play Physynth instruments like a traditional keyboard or drum pad.
• Headphones are recommended for full stereo immersion and realtime panning.
• View the TV advert now at: physynth.com
• Crafted by Simian Squared: simian2.com
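As a rough illustration of the physics-triggered idea described above, here is a minimal sketch of collision-triggered sampling: objects are "charged" with a sound, and a collision fires playback. All names are invented for illustration; this is not Physynth's actual engine, which has not been published.

```python
# Hypothetical sketch of a physics-triggered sample engine: bodies carry an
# optional "charged" sound, and a collision between two bodies triggers it.
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Body:
    x: float
    y: float
    vx: float
    vy: float
    radius: float = 1.0
    sound: Optional[str] = None  # sample "charged" onto this object

def step(bodies, dt=0.1):
    """Advance the simulation one tick; return sounds triggered by collisions."""
    triggered = []
    for b in bodies:
        b.x += b.vx * dt
        b.y += b.vy * dt
    for i, a in enumerate(bodies):
        for b in bodies[i + 1:]:
            if math.hypot(a.x - b.x, a.y - b.y) <= a.radius + b.radius:
                # a collision fires whatever sound either body carries
                triggered.extend(s for s in (a.sound, b.sound) if s)
    return triggered

# Two bodies on a collision course: the "kick" sample fires when they meet.
bodies = [Body(0, 0, 1, 0, sound="kick"), Body(3, 0, -1, 0)]
events = []
for _ in range(20):
    events += step(bodies)
print(events[:1])  # first collision triggers the kick sample
```

A real engine would run the collision check inside the audio callback and route each trigger to a sampler voice, but the trigger logic itself is this simple.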
Blip Shaper Walkthrough
a) creating percussive patterns with a monome
b) shaping the individual sounds that make up the patterns with multitouch gestures
c) recording touchscreen gestures as automation
d) storing, duplicating and navigating patterns
e) recording the resulting audio to a dynamic buffer
f) manipulating the buffer with a multitouch cut-up approach
g) visualizing everything with dual screens
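Point (c), recording touchscreen gestures as automation, can be sketched very simply: store timestamped touch values, then replay them with interpolation. The class and method names here are hypothetical; Blip Shaper's internals are not public.

```python
# Minimal gesture-as-automation sketch: record (time, value) pairs from a
# touch surface, then play them back as an interpolated parameter curve.
import bisect

class AutomationLane:
    def __init__(self):
        self.times = []
        self.values = []

    def record(self, t, value):
        """Store one touchscreen sample (time in seconds, value 0.0-1.0)."""
        self.times.append(t)
        self.values.append(value)

    def value_at(self, t):
        """Play back the recorded gesture, linearly interpolating between points."""
        if not self.times:
            return 0.0
        i = bisect.bisect_right(self.times, t)
        if i == 0:
            return self.values[0]
        if i == len(self.times):
            return self.values[-1]
        t0, t1 = self.times[i - 1], self.times[i]
        v0, v1 = self.values[i - 1], self.values[i]
        return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

lane = AutomationLane()
for t, v in [(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)]:  # a recorded swipe
    lane.record(t, v)
print(lane.value_at(0.5))  # halfway through the rising swipe -> 0.5
```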
“Combining sophisticated, beautiful visualizations, elegant mode shifts that move from timbre to musical pattern, and two-dimensional and three-dimensional interactions, it’s a complete visualization and interface for live re-composition.”
A ‘fly on the wall’ video of a personal StringStation demonstration and talk about the 3D instrument by musician/inventor Jim Bartz.
It further introduces an intricate and truly amazing 3D resonance factor that reveals an atmospheric, dimensional soundfield never known before. Fingertips on strings instantly have detailed nuance and dynamics, allowing extreme expressiveness in music and sound creation. The StringStation's 40 strings are programmed in distinct groups (bass, melody, rhythm, orchestration…) which, when captured together with its innovative surround pickup matrix, pull pure 3D tone resolution that delivers an ultra-wide, exceptionally dimensional panorama. It captures, in vivid 3D, a ‘fractal lattice of interpolations’. The powerful sonic force of its natural string tones, layered with any sound imaginable via digital sound modeling and computer control, is the beginning of an extreme level of surround music innovation…
Here’s his statement on the use of reverbs:
I have a lot of reverb units – I love them! They fascinate me because they take your sound and they put it into a 3D space, and they all do it in their own unique way. They have evolved so much over the years, starting off with analog springs and metal plates, through early and very clever digital algorithmic processors and onwards through the growing bit depths. My favourites are the EMT 140 stereo plate, the Lexicon 224, Yamaha Rev1 and the Telefunken Echomixer mono spring thing. I bought a Roland R880 years ago really cheap, which was their attempt (in 1988) at a supa-mega-reverb, and it is an extremely underrated unit actually. So I took it home so I could experiment with putting a Buchla through it. I had a hunch that they would sound good together! I spent a few hours re-acquainting myself with its shocking operating system (which was like that scene in Contact where they build the spaceship based on blueprints deciphered from an alien language downloaded from outer space).
Anyway – here is a live tweak of the Buchla and R880 together
The new 40-string 3D instrument. 3D Recording Artist/Engineer Jim Bartz on the path to deliver a fully functional prototype of the beautiful StringStation… revealing an advanced, original 3D sonic architecture of endless possibility.
Advanced 40-Stringed Controller Instrument …
– Computerized surround-sound pickup matrix under the strings at harmonic hot-spots, harnessing three-dimensional audio.
– Computer-controlled and programmable 3D resonance of 40 strings creates unique sonic fractals.
– Digital Sound Modeling allows layering of natural string tones with custom modeled sound waves …a wide-open palette.
– Exceptionally sensitive to tactile nuances of human touch, offering a hyper-expressive dynamic play surface unlike any other resonant instrument.
Kraftwerk play the rather eerie Radioactivity. This is shot in 3D, viewed from the front.
Manchester 2009, start of the 3D show. You can tell from the crowd's reaction just how good the 3D show was.
Kraftwerk will give further 3D-themed concerts this fall, on 12 and 13 October, playing three concerts in Munich. Yes, you read that right! Three concerts in two days: the 13th includes two concerts, at 20:00 and 00:00 respectively. In addition, a 3D exhibition will be held on 15 October.
Some more details on the AudioGL project.
This video features a patching demonstration, and an entire track written with one monophonic oscillator.
Tracks are available at audiogl.bandcamp.com
AudioGL, a project teased in videos first in April and then again last week, is a new concept in designing a user interface for real-time music creation. Visuals and sound alike are generative, with the rotating 3D-wireframe graphics and symbolic icons representing a kind of score for live synthesized music. The tracks in the video may sound like they’ve been pre-synthesized, polished, and sampled from elsewhere, but according to the creator, they’re all produced in the graphical interface you see.
This video is a must-see: a really cool performance using Kinect technology.
Extending the acoustic vibraphone by embedding 3D gesture recognition using the Kinect. Kinect to Max/MSP (Kinect datastream → OSC → MIDI), controlling filter parameters in Ableton Live to modulate the acoustic audio input from the vibraphone in real time. Developed by Gabrielle Odowichuk and Shawn Trail @ MISTIC.
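The mapping stage of a pipeline like this can be sketched in a few lines: a tracked hand coordinate is scaled into a 7-bit MIDI control-change value that a filter parameter in a host such as Ableton Live can follow. The coordinate range and the CC number (74) are assumptions for illustration, not details of the MISTIC setup.

```python
# Sketch of the Kinect -> OSC -> MIDI mapping stage: scale a hand height
# (in metres, range assumed for illustration) to a MIDI CC value 0..127.

def hand_to_cc(y, y_min=0.5, y_max=2.0):
    """Map a hand height in metres to a MIDI CC value in 0..127."""
    clamped = min(max(y, y_min), y_max)
    return round((clamped - y_min) / (y_max - y_min) * 127)

def cc_message(channel, control, value):
    """Raw 3-byte MIDI control-change message (status byte 0xB0 | channel)."""
    return bytes([0xB0 | (channel & 0x0F), control & 0x7F, value & 0x7F])

# Hand at 1.25 m sits midway through the range -> CC value 64.
value = hand_to_cc(1.25)
print(value, cc_message(0, 74, value).hex())
```

In the real setup this value would be sent over OSC from Max/MSP and translated to MIDI for Live; only the scaling and message layout are shown here.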
Is collaboration between humans and robots possible in music?
Now the answer is yes.
This is a live performance-demonstration of an innovative research project by the Italian Institute of Technology (IIT, Advanced Robotics Department), mixing electronic music, a robot arm used as a musical interface, and interactive 3D projections. The robot interacts with the artist to create and modify musical parameters during the performance. In the background, 3D figures represent the robot's trajectories, visualizing musical parameters.
Artist: Valerio Solari (K)
Credits: Victor Zappi – Antonio Pistillo – Sylvain Calinon – Andrea Brogni – Darwin Caldwell
Project Abstract: “The availability of haptic interfaces in music content processing offers interesting possibilities of interaction between human users and novel instruments and controllers for musical expression. The capability to precisely modulate haptic feedback on the user establishes a direct connection with the sonic output, which enhances the modalities of artistic content creation. The use of robots as both manipulators and actuated interfaces offers new perspective in a combined context of human-robot interaction and music playing. With this project we investigate the use of a compliant robotic arm as a bidirectional tangible interface for musical expression. We exploited the robot capabilities to configure a system in which a human user and the robot interact for the creation of recursive modulations of music parameters. An experimental session has been carried out with the collaboration of a musician, who performed with the robot as part of his live stage setup.”