The use of lasers in music is not new; artists like Jean-Michel Jarre have been experimenting with them for years. But things are progressing, and many artists are now leaning toward more autonomous systems – writing their own code to make their audiovisual performances behave in a certain way. A truly astonishing example of this is Espills, the creation of multidisciplinary Barcelona-based AV team Playmodes.

Espills is a solid light dynamic sculpture. Built using laser beams, laser scanners and robotic mirrors, it is inspired by crystalline formations: a set of geometric figures that float in the air and suggest, in an abstract way, the transmutation of matter from chaos to order. Dust becoming crystal, being eroded and becoming sand again.

Each visual representation integrates its own sound design through sonification algorithms that transform light into music, completing this alchemical landscape.

Espills is also a digital handicraft exercise. Most of the elements that make up the work, both hardware and software, were designed and built by the team themselves: robotic mirrors, light drawing tools, laser modules, audio synthesizers, scenic media… All this engineering was put to the service of creating a playable live instrument.

The project is still ongoing and many things remain to be discovered in the process. Below, the team give a sense of their ambitions for Espills in their own words:

“This is an ongoing research, and the piece haven’t reached its final form yet. At this point, we managed to create a realtime audiovisual instrument that can be played live, and we made the first explorations with it, giving as a result the different scenes you can see on the video documentation. Nevertheless, we understand there’s a bigger potential on the expressiveness of the instrument, and the project will still evolve through future iterations.”

As mentioned at the beginning, they have also developed their own code: a visual programming framework called OceaNode, a kind of home-brewed solution for imagining banks of modulation as oscillators – a visual motion synth of sorts. OceaNode sends control signals to Reaktor. They also made a VST plug-in to send OSC from Reaper, so they can automate OSC envelopes using the Reaper timeline.
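OSC (Open Sound Control) itself is a simple binary protocol over UDP. As an illustration of what those control signals look like on the wire – a minimal sketch, not Playmodes' actual code, and the address `/oceanode/osc1` is an invented example – a single float parameter could be packed into an OSC message like this:

```python
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC 1.0 spec requires."""
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one float32 argument."""
    return (osc_pad(address.encode("ascii"))   # address pattern
            + osc_pad(b",f")                   # type tag: one float
            + struct.pack(">f", value))        # big-endian float32 payload

# A plain UDP socket could then carry this packet to a listener such as Reaktor:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 9000))
msg = osc_float_message("/oceanode/osc1", 0.5)
```

In practice a library such as python-osc handles this framing, but the format is small enough that seeing it spelled out makes the OceaNode-to-Reaktor link less mysterious.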

A sonification engine, built with the Reaktor audio programming environment, receives OSC data from OceaNode. This data is then mapped to audio parameters, transforming laser motion into sound:
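The exact mapping lives inside their Reaktor patch, but the general idea of turning a normalized motion parameter into a musical one can be sketched in a few lines. This is a hypothetical example, not Playmodes' implementation: an exponential curve is a common choice because equal control steps then produce equal musical intervals rather than equal Hz steps.

```python
def control_to_freq(value: float, lo: float = 55.0, hi: float = 3520.0) -> float:
    """Map a normalized control value in [0, 1] to a frequency in Hz.

    Exponential scaling: the geometric midpoint of the range lands at
    value = 0.5, so the mapping is perceptually even across octaves.
    """
    value = min(max(value, 0.0), 1.0)  # clamp incoming OSC data
    return lo * (hi / lo) ** value

control_to_freq(0.0)  # → 55.0 Hz (bottom of the range)
control_to_freq(0.5)  # → 440.0 Hz (geometric midpoint)
control_to_freq(1.0)  # → 3520.0 Hz (top of the range)
```

The same shape of function works for mapping motion onto amplitude, filter cutoff, or any other synthesis parameter.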

OceaNode is open source; you can download it and contribute at: https://github.com/playmodesStudio/

This instrument is a customization of an ensemble that can be freely downloaded from the Reaktor user library: https://www.native-instruments.com/es/reaktor-community/reaktor-user-library/entry/show/9717/

The SoundCloud track above is a discarded recording from the team. As they say: “Although the exploration led us to finally create a realtime audiovisual generative instrument, in a previous stage we believed it was better to timeline a narrative approach. We even created a whole soundtrack that we finally discarded because we felt the approach was much more powerful by going realtime.”