We started testing with lasers two or three months before Aura. Since laser rentals are quite expensive, we rented a cheaper laser for our tests, along with an EtherDream 2 laser controller.
We did several software tests: first with Maxwell, then with the openFrameworks libraries ofxIlda and ofxEtherdream, and finally with a wrapper for vvvv. All of them had problems and none worked out of the box with the EtherDream 2. Maxwell didn't support the EtherDream 2 properly, but after a few emails with the creator of the application we managed to get it working. The openFrameworks libraries were ancient, required a "port" to a modern version of OF, and only worked on Mac. The vvvv wrapper was useful for having a .NET version of the system; it compiled, but it still didn't detect the EtherDream 2. We think the library only supported the first version of the EtherDream and something changed in the second. Contacting the EtherDream 2 makers was useless; they didn't reply to any of the emails we sent. The available documentation isn't very accurate either: you can find lots of information on the first EtherDream but not much on the second. We ended up getting the Mac openFrameworks version to work, ported it to Windows, and wrapped it in a managed code library for .NET. In the end, we created a Bonsai node for the laser controller on top of that library.
We did some tests with shapes and animation, mostly circles and lines: flower shapes, circles, circles within circles, cogs, and so on. In the end, we concluded that simple was best, so we settled on two circles.
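To give an idea of what feeding a shape to the laser involves, here is a minimal Python sketch of how two concentric circles could be sampled into a list of points for a single laser frame. It is only an illustration under our own assumptions (point count, a signed 16-bit coordinate range, plain full-green colour); the actual openFrameworks/Bonsai code is structured differently.

```python
import math

def circle_points(radius, num_points=200, cx=0.0, cy=0.0):
    """Sample one circle as (x, y) pairs in a normalized -1..1 space."""
    return [
        (cx + radius * math.cos(2 * math.pi * i / num_points),
         cy + radius * math.sin(2 * math.pi * i / num_points))
        for i in range(num_points)
    ]

def make_frame(outer_radius=0.8, inner_radius=0.4):
    """Build one laser 'frame': two concentric green circles.

    Coordinates are scaled to a signed 16-bit range, which is what laser
    DACs typically expect; the exact point format is DAC-specific.
    """
    frame = []
    for radius in (outer_radius, inner_radius):
        for x, y in circle_points(radius):
            frame.append({
                "x": int(x * 32767),
                "y": int(y * 32767),
                "r": 0,
                "g": 65535,   # 532 nm is green, so full green channel
                "b": 0,
            })
    return frame

if __name__ == "__main__":
    points = make_frame()
    print(f"{len(points)} points in frame")
```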
We also did some tests with the mirror and the structure required to hold and fine-tune it. We needed a mirror large enough to reflect a circle one meter in diameter from 100 meters away. We built a system of metal springs and screws that held the mirror floating and let us tighten one side or the other to tilt it slightly. This gave us some degree of calibration to direct the reflection of the laser once it was aimed at the center of the mirror.
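For reference, the numbers above imply only a tiny deflection: a circle one meter in diameter seen from 100 meters away subtends roughly half a degree, and the reflected beam rotates by twice whatever the mirror is tilted. This quick back-of-the-envelope check (our own, not part of the installation code) shows why even a small tilt moves the projected circle a lot and why fine adjustment screws were needed.

```python
import math

distance_m = 100.0     # throw from the mirror to the projection area
diameter_m = 1.0       # desired circle diameter at that distance

# Full angle subtended by the circle, as seen from the mirror.
full_angle_rad = 2 * math.atan((diameter_m / 2) / distance_m)
print(f"scan angle: {math.degrees(full_angle_rad):.2f} degrees")

# Aiming sensitivity: how far the spot moves at 100 m for a 0.1 degree
# mirror tilt (the reflected beam rotates by twice the mirror tilt).
tilt_deg = 0.1
shift_m = distance_m * math.tan(math.radians(2 * tilt_deg))
print(f"a {tilt_deg} degree mirror tilt shifts the spot by ~{shift_m:.2f} m")
```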
Another point we tested heavily was the particles we wanted to project onto. We tried different types of water sprinklers, dispersers, and even smoke machines to find out which would be the most reliable and interesting. Let me tell you, the visual effect of a water disperser is quite different from that of a smoke machine, but both are fascinating to look at.
One of us really wanted to have direct human interactivity in the piece, so we ended up testing a few ways of letting the movement and the number of people affect the laser pulse. This mostly came down to how large the circle was and how fast it pulsated. There was also the idea of shaking the circle when there wasn't much interaction. Some of these ideas worked, others not so much.
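That mapping can be summarized in a few lines. The sketch below is a simplified Python illustration of the idea, with made-up parameter names and constants (motion_level, people_count, the gains): more motion and more people make the circle larger and the pulse faster, while a low level of interaction adds a small random shake. It is not the code that ran in the piece.

```python
import math
import random

def laser_params(motion_level, people_count, t):
    """Map interaction to circle radius, pulse rate and shake.

    motion_level: 0.0 (no movement) .. 1.0 (lots of movement)
    people_count: rough number of people detected
    t: current time in seconds
    All constants here are illustrative, not the values used in the piece.
    """
    interaction = min(1.0, motion_level + 0.1 * people_count)

    base_radius = 0.3 + 0.5 * interaction        # bigger circle with more people
    pulse_hz = 0.5 + 1.5 * interaction           # faster pulsing with more motion
    pulse = 0.5 * (1 + math.sin(2 * math.pi * pulse_hz * t))
    radius = base_radius * (0.8 + 0.2 * pulse)

    # When there isn't much interaction, shake the circle a little.
    shake = 0.05 * (1.0 - interaction)
    offset_x = random.uniform(-shake, shake)
    offset_y = random.uniform(-shake, shake)

    return radius, offset_x, offset_y

# Example: a quiet moment vs. a busy one.
print(laser_params(motion_level=0.1, people_count=1, t=2.0))
print(laser_params(motion_level=0.9, people_count=8, t=2.0))
```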
We also had budget issues that didn't allow us to get the optimal PA system, which would ideally have been multiple sources evenly spread along the 100 meters of the piece. So we decided to split bass and treble sounds between the two ends of the piece, which fit reasonably well with its concept (observing things from different perspectives, the mirror reflection creating other points of view, the immersive and organic nature of the object). We went through several iterations of sounds: we originally wanted to have them playing in Buzz or Ableton Live, with some interactive parameters controlling filters. In the end, we decided to move all the audio to Max/MSP, with combinations of sounds that would interchange. We ended up with a slowed-down heartbeat as the main bass sound, its speed affected by the motion detection, and a set of different samples for the treble sound. The bass end of the installation had a proper PA; the treble end had some custom-made horns, built from megaphone amplifiers and cone-shaped plastic materials we had lying around. Max/MSP would send OSC values to Bonsai to sync the heartbeat volume with the luminosity of the laser. We also wanted the loudness of the occasional treble sounds to affect the shakiness of the circle, but we ran out of time.
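For readers curious about the sound-to-light sync: in the installation it was Max/MSP sending OSC to Bonsai, but the same message flow can be illustrated with a small Python sender built on the python-osc package. The OSC address, port, heartbeat rate, and envelope below are assumptions made for the example, not the values from our patch.

```python
import math
import time
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical endpoint where a Bonsai OSC source would be listening.
client = SimpleUDPClient("127.0.0.1", 9001)

heartbeat_hz = 0.8   # slowed-down heartbeat rate (illustrative value)

start = time.time()
while time.time() - start < 10.0:        # run the demo for ten seconds
    t = time.time() - start
    # Fake heartbeat envelope: a sharp pulse once per beat that decays.
    phase = (t * heartbeat_hz) % 1.0
    volume = math.exp(-8.0 * phase)
    # Send the current volume so the laser luminosity can follow it.
    client.send_message("/heartbeat/volume", float(volume))
    time.sleep(1 / 30)                   # ~30 updates per second
```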
We had already made a few test installations at the entrance of Artica at FCT, with random curious people stopping by to take a look and ask us what we were doing. The festival organizers dropped by during some of those sessions as well, to take a look and give some feedback. The public response was great from the first day we started testing things publicly: lasers are mesmerizing by themselves, and the moment you increase the scale of the object and start adding other elements it becomes completely entrancing.
Walking amongst the people experiencing the piece at the Aura Festival, you could overhear impressed comments and mentions that it was the best thing they had seen at the festival so far. This feedback made us very happy. After so many after-work hours invested in polishing the project to its maximum potential under the budget limitations, we were thrilled to see the positive impression it made on the attendees. Thank you to the festival organizers, our product partners, and all the attendees for such a wonderful few nights in Sintra presenting our piece "532nm".
Concept, Equipment Design:
Eric da Costa
Construction:
Eric da Costa, João Ribeiro, José Noronha
Laser Control Software:
André Almeida, Tarquínio Mota, Ricardo Imperial
Sound Software:
Guilherme Martins
Sound Design:
Filipe Barbosa, Guilherme Martins