For the past two years the market has been swamped with tablet synthesizers, drum machines, vintage clones, and ambient electronic instruments. At first this was almost exclusively an iOS business; Android struggled for a long time with its user experience and, more importantly, with its audio latency. But now there is movement on the Android side too, with several music creation app developers porting their apps to Android – FL Studio and Caustic, to name but a few.
All this is great in a sense, but how much real music production is being done on these devices? Sure, there are bands and artists who claim that their entire new album was made on a tablet, but seriously, how many go beyond leisure playing or inspirational journeys into the ambient worlds of these glossy, experimental tablet synthesizers? Of course, in a studio environment you can hook a tablet up, attach the necessary controllers and get it all into your DAW, but is it effective? Will it actually transform how we create music? Will it replace an acoustic guitar or piano for writing a new tune or melody? Right now it tends to be more of an effects generator for adding cool sounds, ambient pads, chip sounds and so on. To be honest, music creation is much more than machines – in the end no song is better than the melody that forms its base, and tablets will not change this.
Just to be clear, we are not hardware fanatics in any way; most of our work is done in a virtual studio environment. The question we ask is: can tablets really make the jump from toys, or experimental control surfaces, to effective tools in the music creation process? With crisp graphics and UIs you can run awesome-looking vintage clones like the iMS-20, but compare the feeling of twisting a virtual knob to a physical one – the sense of precision is not the same. Sure, you can hook it up to a keyboard, and sure, you can get a “real” KORG analog synth for the price of a couple of visits to Starbucks, but it is not the same as dealing with the real thing. Maybe this is good enough for a larger crowd, and with tablets we have definitely pushed the envelope on what defines a music production environment. What we see now, however, is mostly more of the same: more ambient synths, more vintage clones, more cheap drum machines and sequencers, more experimental control surfaces and hybrid DAWs.
What will be the next step, and will we ever take these guys below seriously?
We sure do in one sense, but in another it still feels so 2012…
The Laptop Orchestra of Lake Forest Academy plays Radiohead’s “Meeting in the Aisle” as they open their Fall 2012 concert. Eight musicians play and project iPads on sound panels behind them, using a combination of apps including Animoog, iMS-20, iElectribe, Reactable, and GarageBand.
Sound and science converge in Carsten Nicolai’s installation works, which perform autonomously to embrace the tension inherent in accidents.
For more information:
Electronic artist collective Machina has announced the MIDI Jacket, a wearable MIDI controller being developed as a Kickstarter project.
The MIDI Jacket MJ v01 is designed to control digital music instruments, computers, and other devices. It lets users control and make music kinetically – through body movements – using body sensors that detect acceleration and flexion.
The MIDI Jacket has multiple built-in controllers:
- Four flex sensors that detect finger position
- One accelerometer that detects arm acceleration
- A joystick
- Four push buttons
All of these sensors and buttons can be configured by the user, but they ship with presets and initial configurations. Despite the embedded sensors, the jacket looks like a regular jacket and can be worn under normal conditions.
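As a rough sketch of what such a sensor-to-MIDI mapping could look like – the function name, value range, and controller assignment below are our own illustrative assumptions, not Machina’s actual firmware – a raw flex-sensor reading can be scaled into a standard three-byte MIDI Control Change message:

```python
def flex_to_cc(channel, controller, reading, max_reading=1023):
    """Map a raw flex-sensor reading (0..max_reading) to a MIDI
    Control Change message: status byte, controller number, 7-bit value."""
    value = min(127, reading * 127 // max_reading)  # scale to MIDI's 0..127
    status = 0xB0 | (channel & 0x0F)                # CC status byte for this channel
    return bytes([status, controller & 0x7F, value])

# A half-bent sensor on channel 0, mapped to the mod wheel (CC 1):
msg = flex_to_cc(0, 1, 512)  # -> b'\xb0\x01?'  (CC 1, value 63)
```

The same three bytes could then be sent to a DAW over any MIDI output; only the scaling logic is shown here.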
Imagine being able to create music by using your body as an interface. Imagine being able to extend that jacket to make it work not only with music, but with whichever devices you have: A Kinect, an iPod, Nike Plus. We believe that the way we interact with our clothing is changing; clothing should not only be a way of covering your body while helping you get laid, our relationship with clothing should be much more than that: clothing should be an extension of our body, and we’re using wearable technology to do that.
We are working with the best musicians, DJs and electronic artists worldwide to make this project a success. Our first product is the MIDI Controller Jacket. MIDI (Musical Instrument Digital Interface) allows communication between digital music instruments, computers, and other devices – for example, using the jacket’s sensors to send notes to your DAW.
John Biggs interviews Mike Butera, founder of Artiphon, onstage at CES 2013. The Artiphon is a multi-instrument that uses an iPhone as its brain; the device can be a guitar, bass, violin, banjo, or drum pad.
An installation for used cassette players which looks on their obsolescence not as an ending, but as an opportunity to reconsider their functional potential. Superseded as playback devices, they become instruments in their own right. Replacing the prerecorded content of each tape with a microphone gives us the chance to listen instead to the rhythmic and resonant properties of these once ubiquitous plastic shells. Binatone Galaxy brings the framework within which a generation purchased their favourite records to the centre of attention, revealing the acoustics of the cassette and the voices of the machines themselves.
Science Fiction Children & Moritz Simon Geist live at CYNETART Dresden Hellerau 2012
The MR-808 is the first drum robot that reproduces the drum sounds of the 80s – in the real world! The robot installation MR-808 is a replica of the famous 1980s electronic drum machine TR-808 – with robots playing the drum sounds by Moritz Simon Geist.
Read on: sonicrobots.com/mr808-eng/
Artist Blog, Livedates: sciencefictionchildren.com
MR-808 – mechanical sound robot (all drums, miced)
A mechanical relay controlled via an Arduino (bass sound)
Gameboy – Arduinoboy hardware (8-bit chiptune sound)
Everything was programmed in Ableton; only EQing and compression were applied.
Cameras: David Campesino, Konstantin Rinner hochkultur.com/
Music: Moritz Simon Geist
Production: Art Hustle
Background description below:
Here is a demonstration of a CV controller that I built. It is a simple pendulum with a magnet at the end. The magnets on the table can be moved and can either repel or attract the pendulum. More information can be found at www.artoftravelogue.com or more specifically here: http://artoftravelogue.blogspot.com/2012/02/magnetic-table-cv-controller.html
As falling sand interrupts the flow of a laser to a light-sensitive sensor (a photodetector), the circuit produces random oscillations of sound. It’s the latest brilliant creation of the Dutch scientist Gijs Gieskes, the industrial designer-turned-musician whose inventions often center on some physical and mechanical apparatus. Just for good measure, the project is mounted to a clear frame so it can fit into a Eurorack modular setup.
BeetBox is a simple instrument that allows users to play drum beats by touching actual beets. It is powered by a Raspberry Pi with a capacitive sensing board and an audio amplifier in a hand-made wooden enclosure.
BeetBox is primarily an exploration of perspective and expectations. I’m particularly interested in creating complex technical interactions in which the technology is invisible—both in the sense that the interaction is extremely simple and in the literal sense that no electronic components can be seen.
The enclosure was created from .5″x8″ poplar boards, which I cut to size and finished using various hand and power tools. I used a router for both the edge details and for grooves in which to conceal the wires, and a drill press to create the speaker grill and to bore holes for the beets with a hole saw. I then stained the wood and, after assembly with wood glue and a nail gun, sealed the enclosure with polyurethane.
Touch sensing is handled by an MPR121 Capacitive Touch Sensor from SparkFun, for which I ported existing Arduino code to Python. This board communicates with a Python script on a Raspberry Pi via I2C. The script watches for new touches and triggers drum samples using pygame. Audio from the Pi’s line out is run through a small amplifier I built using an LM386, which is based on a circuit straight from the data sheet. The amp is connected to a salvaged speaker mounted under the holes in the lid.
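The “watches for new touches” step boils down to comparing the MPR121’s 12-bit touch-status reading against the previous one. A minimal sketch of that edge-detection logic follows; the I2C register read and the pygame sample triggering are hardware-dependent and omitted, and the helper name is our own, not taken from the BeetBox source:

```python
def new_touches(current, previous):
    """Given the current and previous 12-bit MPR121 touch-status masks,
    return the electrode indices that have just transitioned to touched."""
    rising = current & ~previous & 0xFFF  # bits that went 0 -> 1
    return [i for i in range(12) if rising & (1 << i)]

# Electrode 0 was already held; electrode 2 is a fresh touch:
new_touches(0b000000000101, 0b000000000001)  # -> [2]
```

In a polling loop, each index returned would map to one beet and fire its drum sample once per touch, rather than retriggering while the beet is held.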
Source code for the BeetBox is viewable on GitHub.
Learn more at scott.j38.net/interactive/beetbox.
The first test of the Magic Ceramic Theremin lamp. It was developed as a peculiar piece for the opening of an exhibition of several ceramists in Gallery Artibrak, from November until 28 December 2011.
A theremin is normally stepless, but in this case an A-156 is used as a quantizer.
A small explanation for those who are not familiar:
In this magic piece of ceramic, two antennas are integrated: one for the volume and one for the pitch. The instrument does not have to be touched. The volume is controlled with your left hand (as you approach, the volume increases) and the pitch with your right hand (as you approach, the pitch goes up). By approaching the antennas you influence the capacitance of the circuit – just like the antenna of a transistor radio, whose reception changes as you move your hand toward it.
The Magic Ceramic is based on the original theremin, invented by Léon Theremin in 1919. That electronic instrument is steplessly variable and very expressive; it sounds like an opera voice or a violin.
This ceramic version offers much more variety than the original. You can leave the sound stepless or have it quantized, so that you hear a real musical scale. The sine tone of the Magic Ceramic is quantized (chopped into steps), and in its current presentation only the notes of a minor scale are heard (comparable to playing only the black keys on a piano). This makes the steps between the notes bigger and easier to distinguish. Other options are major notes only, or the complete scale, quantized or not. A small sampler has also been added, so when you reach the highest note a spoken voice can be heard.
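The quantizing step described above can be illustrated in a few lines: snap a continuous pitch to the nearest allowed note. The sketch below uses the black-key (pentatonic) pitch classes as the allowed set; the function and scale table are our own illustration of the principle, not the A-156’s actual behavior:

```python
import math

# Black-key pitch classes, as semitones above C: C#, D#, F#, G#, A#
PENTATONIC = [1, 3, 6, 8, 10]

def quantize(freq, a4=440.0):
    """Snap a continuous frequency (Hz) to the nearest black-key pitch."""
    # Continuous position in semitones, relative to A4 (MIDI note 69)
    midi = 69 + 12 * math.log2(freq / a4)
    # Choose the allowed MIDI note closest to the input pitch
    allowed = [n for n in range(128) if n % 12 in PENTATONIC]
    nearest = min(allowed, key=lambda n: abs(n - midi))
    return a4 * 2 ** ((nearest - 69) / 12)

quantize(300.0)  # -> ~311.13 Hz (D#4, the nearest black-key pitch)
```

A stepless theremin would output 300 Hz directly; with the quantizer engaged, every pitch in between is pulled to the closest note of the chosen scale, which is why the steps become easy to hear.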
more info here: