Brain, Wider than the Sky

Fundação Calouste Gulbenkian

Rui Oliveira

Project leader, Interaction design, Software development
Neuron Sculpture detection and interaction
André Almeida

Rigging, Electronics, Setup
David Palma

Neuron Sculpture project leader
Neuron Sculpture vision, concept, sculpture
Eric da Costa

Interactive applications art direction
3D Models, UI design, Brain Orchestra Sound design
Neuron Sculpture Sound design
Filipe Barbosa

Time to Action author, Software development, Project advisor
Gonçalo Lopes (NeuroGears)

Light equipment design and assembly, Neuron sculpture interactive sound software
Brain orchestra software developer
Guilherme Martins

Neuron Sculpture fibreglass
Neuron Sculpture assembly
João Ribeiro

NeuroGears, Neuron Sculpture detection software
Project Advisor
João Frazão

Neuron Sculpture fibreglass
Neuron Sculpture assembly
José Pedro Sousa

Interactive applications Software developer
Ricardo Imperial

Neuron Sculpture software development
Neuron Sculpture light/electronics development
Interactive applications software development
Tarquínio Mota

Leonel Caldeira

Project full disclosure:


Toppi, the Sweet Artist

End Customer
Nestlé Portugal

J. Walter Thompson Lisboa

J. Walter Thompson Lisboa

Artica Creative Computing

Robot Concept and Extruder Design
Eric da Costa

Eric da Costa
José Pedro Noronha
João Ribeiro
André Almeida
Guilherme Martins

Software Development
André Almeida
Tarquínio Mota
Ricardo Imperial

Tarquínio Mota
André Almeida
Guilherme Martins

Interaction Design
Nuno Lourenço
André Almeida

Sound, Motion Graphics, Robot animations
Filipe Barbosa
Nuno Lourenço

Project full disclosure:

Procedural Forest

Procedural Forest

At Aura Festival this year we participated with an interactive installation called “Procedural Forest”.

At first, we wanted to bring natural elements to the participants, because when we think of Sintra, all that comes to mind is its beautiful forests and woods. We also wanted visitors to be able to engage and interact with the installation easily, allowing every single person to make a difference and to see their actions reflected on the installation.

Our ‘Assisted Performer’ was a great candidate, because it allows visitors to interact with “anything” using their mobile devices, without having to install any dedicated app. Visitors only need to connect to our WiFi network and open a browser, and a GUI will pop up.

In this case, a coloured tree would appear on the installation and the user could interact with a slider and two buttons. The slider set the depth of the tree, the upper button would skip to another tree and the lower button would populate the current tree. After this step, another tree would appear, and so on. Each visitor could identify themselves by the tree’s colour, which corresponded to the colour displayed in the app.
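To illustrate how a single depth value can drive the growth of a procedural tree, here is a minimal recursive sketch. This is purely hypothetical (the real installation ran in Unity3D); the branching factor and function names are invented for the example.

```javascript
// Hypothetical sketch: a "depth" slider value driving procedural tree growth.
// Counts the branch segments generated for a given recursion depth.
function growTree(depth, branchFactor = 2) {
  if (depth <= 0) return 1; // just the trunk segment
  let segments = 1; // this branch itself
  for (let i = 0; i < branchFactor; i++) {
    // each branch spawns branchFactor children one level deeper
    segments += growTree(depth - 1, branchFactor);
  }
  return segments;
}
```

Moving the slider up by one depth step roughly doubles the number of segments, which is why the stress test mentioned below mattered for frame rate.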

We ended up using Unity3D, our tool of choice for real-time interactive installations because of its optimized pipeline and easy integration with our platforms. This video shows a brief stress test to see how many FPS we could achieve with the maximum number of trees being generated.

It was an honour for us to have our installation at MUSA, Sintra’s Art Museum. This museum is the entrance of the festival and the first spot on the road map. It’s always a great experience to see people interacting with our creations!

Concept: Artica

Art Direction: Guilherme Martins, Filipe Barbosa
3D Modeling: Filipe Barbosa
Unity3D Visual Design: Filipe Barbosa
Unity3D Software Development: Ricardo Imperial
Assisted Performer Software: Filipe Cruz

Popota Xmas

Popota Xmas

In 2017 we were invited by UAU to participate in Popota’s Christmas Show, which took place in Campo Pequeno, Lisbon.

It was a big challenge due to the requirements of the performance:

  • Ultra-Large LED Wall
  • 40 minutes of continuous video
  • Videos triggered in sync with the actors/performers
  • Real-time interactive contents

With these requirements in mind, we created all the visual content plus the scene props.

The visuals were created in 3ds Max, edited and rendered in Unity 3D, then post-processed. The real-time interactive graphics generation was also developed in Unity 3D.

During the creative process, we wanted to have a glimpse of the size relations between the screens and the stage, so we developed a VR stage.

Our Einstein VideoPlayer had to be updated to play very large videos with high performance; to achieve this we used the HAP codec. Einstein also received real-time video from a computer vision server running Bonsai, linked to Unity and finally connected through a Spout server (if you are a Mac user, Syphon does the same job). Using HAP also allowed us to layer content with alpha-transparency videos.

This is an image of the two control screens: the left screen shows an infra-red camera image being analysed in Bonsai, and the right screen is our Einstein VP.

We also created the props, including a big school rubber, pencils, a compass, a pencil sharpener and a gramophone.

Popota is a SonaeSierra brand.

Visual Art Direction:
Guilherme Martins

3D Modeling:
Filipe Barbosa

Video Creation:
Guilherme Martins, Filipe Barbosa, João Ribeiro

Software Development, Realtime Interaction:
André Almeida, Gonçalo Lopes, Ricardo Imperial

Software Development, Einstein Video Player:
Guilherme Martins

Props Concept and Direction:
Eric Costa

Props Construction:
Eric Costa, José Noronha, Paula Espanha



Artica is the mix of art and technology, and lately we had been feeling that we were only doing tech work and neglecting our artistic side, so Aura Festival was a great opportunity to submit something different.

We did a couple of trips to Sintra to check out the possible locations and brainstorm ideas for things with impact. After organizing some ideas, we contacted the festival organizers to feel out what would make more sense to officially propose. We had originally conceived projects aimed at Quinta da Regaleira in particular, but it seemed to already be reserved for Oskar&Gaspar, so we abandoned those and rethought some concepts for the entrance of MUSA and the entire Avenida Heliodoro Salgado. After some back and forth with the organizers, we ended up relocating to our assigned space at Avenida dos Combatentes da Grande Guerra, with a smaller version of what we were envisioning but still with plans to make it look (and sound) awesome.

Our final designated location, Avenida dos Combatentes da Grande Guerra, had some advantages and disadvantages compared with the previous locations. On one hand, it allowed us to save some budget on the structures required to hold the water dispersers, since we could now use the trees to hold them in place. On the other hand, we now had sound from another installation interfering with our piece. Luckily the team from “You Are Here”, which was installed at Biblioteca Municipal, was quite aware of the issue and very approachable during the first day of installation, and we managed to find a middle ground in terms of volume that allowed both pieces to cohabit without starting a volume war.

The final piece was never truly locked down until the very last minute. Our working method is very Darwinistic, in the sense that the best idea always wins, so we all try to prove to each other that our personal vision is the one that makes the most sense for the piece. Once you reach the subjectivity of art, this becomes harder to manage, since all ideas can make sense (as long as you contextually justify them), but we still needed to narrow things down to a single option to present to the public.

Here is a photo featuring our piece, taken by Hugo Grilo, a participant in the Festival’s photo contest.

author: Hugo Grilo

We started testing with lasers 2 or 3 months before Aura. For our tests we rented a cheaper laser, since laser rentals are quite expensive, and an EtherDream2 laser controller.

We did several software tests: first using Maxwell, then the openFrameworks libraries ofxIlda and ofxEtherdream, and finally a wrapper for vvvv. All of these had problems and none worked out of the box with the EtherDream2. Maxwell wasn’t supporting the EtherDream2 properly; after a few emails with the creator of the application we managed to get it working. The openFrameworks libs were ancient, required a “port” to a modern version of OF and only worked on Mac. The wrapper for vvvv was useful for having a .NET version of the system; it compiled, but it still wasn’t detecting the EtherDream2. We think the library only supported the first version of the EtherDream and something changed for the 2. Contacting the EtherDream2 guys was useless; they didn’t reply to any emails we sent. The documentation isn’t very accurate either: you can find lots of information on the first EtherDream but not on the second. We ended up making the Mac openFrameworks version work, then ported it to Windows to make a managed-code library for .NET out of it. In the end, we created a Bonsai node for the laser controller.

We did some tests with shapes and animation, mostly circles and lines. Creating flower shapes, circles, circles within circles, cogs, etc. In the end, we concluded that simple was best, so we used 2 circles.
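As a rough illustration of how such shapes are produced: for a laser DAC, a circle is just a list of points sampled around its circumference and streamed to the controller. The point format below (plain rounded x/y pairs) is an assumption for the example, not the actual format of our Bonsai node.

```javascript
// Illustrative sketch: sample n points around a circle for laser output.
// Coordinates are rounded to integers, as DACs typically take integer
// (e.g. 16-bit signed) positions; the exact range here is an assumption.
function circlePoints(cx, cy, radius, n) {
  const pts = [];
  for (let i = 0; i < n; i++) {
    const angle = (2 * Math.PI * i) / n;
    pts.push({
      x: Math.round(cx + radius * Math.cos(angle)),
      y: Math.round(cy + radius * Math.sin(angle)),
    });
  }
  return pts;
}
```

With enough points per revolution the scanner draws a smooth, flicker-free circle; two such point lists give the two circles we settled on.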

We also did some tests with the mirror and the structure required to hold and fine-tune it. We needed a mirror large enough to reflect a circle one meter in diameter onto a target 100 meters away. We built a system with metal springs and screws that held the mirror floating, letting us tighten one side or the other to tilt it slightly. This system enabled some degree of calibration to direct the reflection of the laser once it was pointed at the center of the mirror.
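A back-of-the-envelope sketch of why such fine screw adjustments were needed (assumed geometry, not measured from the actual rig): tilting a mirror by an angle theta deflects the reflected beam by twice that angle, so at long distances tiny tilts move the spot a lot.

```javascript
// Small-angle approximation: a mirror tilted by `tiltRadians` deflects the
// reflected beam by 2 * tiltRadians, so the spot at `distanceMeters` shifts
// by roughly 2 * tilt * distance. Numbers here are illustrative.
function spotShift(tiltRadians, distanceMeters) {
  return 2 * tiltRadians * distanceMeters;
}
```

At 100 meters, a tilt of only 5 milliradians (about 0.3 degrees) already moves the spot a full meter, which is why spring-loaded screws were a practical way to calibrate.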

Another point we tested heavily was the particles we wanted to project onto: we tried different types of water sprinklers, dispersers and even smoke machines to find out what would be more reliable and interesting. Let me tell you, the visual effect is quite different between a water disperser and a smoke machine, but both are quite interesting to look at.

One of us really wanted the piece to have direct human interactivity, so we ended up testing a few ways to let the movement of the people, and the number of people, affect the laser pulse. This was mostly reflected in how large the circle was and how fast it pulsated. There was also the concept of shaking the circle when there wasn’t much interaction. Some of these ideas worked, others not so much.

We also had some budget issues, which didn’t allow us to get the optimal PA system: ideally, multiple sources evenly spread out through the 100 meters of the piece. So we decided to split bass and treble sounds between the two ends of the piece, which kind of worked with the concept (observing things from different perspectives, mirror reflections causing other points of view, the immersive and organic nature of the object). Several iterations of sounds were tried out; we originally wanted to have the sounds playing in Buzz or Ableton Live, with some interactive parameters controlling filters. In the end, we decided to move all the audio to Max/MSP with combinations of sounds that would interchange. We ended up with a slowed-down heartbeat as the main bass sound, its speed affected by the motion detection, and a set of different samples used for the treble sound. The bass end of the installation had a proper PA. The treble end had some custom-made horns, built out of megaphone amplifiers and some cone-shaped plastic materials we had lying around. Max/MSP would send OSC values to Bonsai to sync the heartbeat volume with the luminosity of the laser. We also wanted the loudness of the occasional treble sounds to affect the shakiness of the circle, but we ran out of time.
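The interaction mapping described above can be sketched as a single function from a normalized motion level to heartbeat speed and laser luminosity. This is a hedged illustration: the real piece ran in Max/MSP and Bonsai over OSC, and the ranges below are invented for the example.

```javascript
// Illustrative mapping (assumed ranges): more detected motion -> faster
// heartbeat and brighter laser. The real values lived in Max/MSP patches.
function heartbeat(motion) {
  const m = Math.min(1, Math.max(0, motion)); // clamp motion to 0..1
  return {
    bpm: 40 + m * 80,          // slowed-down heartbeat speeds up with crowds
    luminosity: 0.2 + m * 0.8, // laser brightness follows the beat's volume
  };
}
```

Sending both values together keeps the pulse of the light and the pulse of the sound in lockstep, which is exactly what the OSC link between Max/MSP and Bonsai was for.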

We had already made a few test installations at the entrance of Artica at FCT, with random curious people stopping by to take a look and ask us what we were doing. The festival organizers also dropped by during some of those sessions to take a look and give some feedback. The public response was great from the first day we started testing things publicly; lasers are mesmerizing by themselves, and the moment you increase the scale of the object and start adding other elements it becomes completely entrancing.

Walking amongst the people experiencing the piece at Aura Festival, you could overhear impressed comments and mentions that it was the best thing they had seen at the festival so far. This feedback made us very happy. After so many after-work hours invested in polishing the project to its maximum potential under the budget limitations, we were really thrilled to witness its positive impression on the attendees. Thank you to the festival organizers, our product partners and all the festival attendees for such a wonderful few nights in Sintra presenting our piece “532nm”.

Concept, Equipment Design:
Eric da Costa

Eric da Costa, João Ribeiro, José Noronha

Laser Control Software:
André Almeida, Tarquínio Mota, Ricardo Imperial

Sound Software:
Guilherme Martins

Sound Design:
Filipe Barbosa, Guilherme Martins


O Tempo, by Adriana Queiroz

TEMPO, by Adriana Queiroz

“The Time of a step,
The Time of a bar,
The Time of a poem,
The Time of an emotion,
The Time of a theme,
The Time of those post-war generations that even today seem to control the time of melodies forever rooted in our memories.”

TEMPO is a musical project conceived by Adriana Queiroz, in which the singer delves into Francophone music through its most representative singer-songwriters.
With special emphasis on Jacques Brel and Leo Ferré, the show travels through the emotional world of Barbara, the enchantment of Trenet, the madness of Gainsbourg, the surrealism of Boris Vian and the timelessness of Piaf who, though not a singer-songwriter, is an unavoidable figure of 20th-century French music. Adriana Queiroz is accompanied on piano by Filipe Raposo.

India Song (Marguerite Duras | Carlos Alessio)
concept | voice Adriana Queiroz
arrangements | piano Filipe Raposo
vocal coaching Luis Madureira
sound design Adriana Queiroz | Antonio Pinheiro da Silva
lights Helena Gonçalves | Pedro Mendes
image bank Tiago Guedes and Frederico Serra | Nucleo Casulo
film participation Adriana Queiroz, Claudia Jardim, Sandra Rosado, Felix Lozano, Ivo Canelas, Paulo Pinto, Romeu Runa
video Artica | Guilherme Martins
Filmed at Teatro Camões by Marco Arantes; editing (India Song) Alexander David

Avengers STATION Exhibition

Avengers STATION Exhibition

Through our partnership with Audience Entertainment, we helped develop certain components of a large interactive exhibition produced by Victory Hill Exhibitions for Marvel. The Avengers S.T.A.T.I.O.N. Exhibition was installed in both Paris and Las Vegas and is planned to have a small world tour (Singapore in October). The exhibition features the history and lore of the Avengers characters, with a tour through the different characters, their powers, their suits, etc. We have collected a few press clips: 1 2 3 4 5

The modules we developed for the exhibit were the final game and the debriefing room, which visitors experience at the end of the exhibition tour. This project came to us through our partner Audience Entertainment, and we worked in tight collaboration with Victory Hill Exhibitions, Acoustiguide, tech Las Vegas, Diablo Sound and Photomate to integrate it as smoothly as possible with the rest of the exhibition systems.

The most immediate concern was the visual aspect of the game, playing on 6 person-tall 4K vertical screens. The game is only 3 minutes long, including the explanation of the game mechanics, and it was developed with the recent Avengers movie lore in mind.

When visitors reach the final game area, they are asked to swipe their rented iPods (running the exhibition application developed by Acoustiguide) on a beacon. The players are then divided into teams, each assigned its own Avenger, and use their special controls to attack Ultron and his sentries.

The game we developed required an interface with the local control system to trigger the start and end of the game, handle the light system interaction, and control the opening and closing of the doors. When the game is over, the wall of screens opens, revealing the exit, where another videowall runs an application that plays a congratulatory video and displays the Avengers STATION ID cards of the players who just finished the final game. The ID cards are generated by the Photomate server; we designed a simple API with them to handle the creation and retrieval of the ID cards using the minimum bandwidth possible.

Victory Hill Exhibitions

Victory Hill Exhibitions

Artica Creative Computing

Interaction Design
André Almeida, Guilherme Martins

Game Development
André Almeida, Alexandre Costa, Filipe Barbosa

Video Intro
Guilherme Martins

Amoreiras 360º Panoramic Elevator

Amoreiras 360º Panoramic Elevator

In September 2015 we did a project for State of the Art, developing a few installations to celebrate 30 Anos Amoreiras. Amongst these installations was an elevator simulator, decorated with TV screens and LED strips. The simulator promoted the upcoming opening of a 360º Panoramic View at the top of the Amoreiras Shopping tower.

The project had its usual share of setbacks for a project of this size, so we are quite happy to see it operating well and already open to the public.

The elevator is located at Amoreiras Shopping Center, taking guests all the way to the 18th floor to enjoy the 360-degree Panoramic View. More information about the viewpoint can be found at

Customer: Mundicenter
Production: ArticaCC
Project Lead: André Almeida
Structure / Interiors Design: Eric da Costa
Construction: Hilário Almeida
Assembly: Eric da Costa, José Noronha, João Ribeiro, André Almeida
Electronics: Bruno Serras, Tarquínio Mota
Video Software: Guilherme Martins, André Almeida
Video Editing: Filipe Barbosa, Guilherme Martins
Sound: M-Pex
Drone Video Capture: SkyEye


Continente Missao Sorriso

Continente Missão Sorriso

We had to create an interactive installation for a road-show campaign travelling through several schools across Portugal to promote Missão Sorriso. The project involved developing a Band Hero clone that could play 3 licensed tracks from Orelha Negra. To make the experience more impactful, we used real instruments as input devices instead of the usual plastic controllers.


The hardware had to be procured and modified so that non-musicians would still be able to play the game. We had 4 different instruments.

Drum Set

We purchased a real drum set and stripped it down to a few minimal elements: bass drum, snare drum, tom and hi-hat. This was required for the portability of the drum set during the roadshow.

Next, we coupled an Arduino with several piezo sensors to it. The piezos were placed inside each drum element, which was muffled with a plastic material to dampen the noise and prevent physical damage to the piezos.

The Arduino would register the values and send a WebSocket message to a Node.js server.
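The kind of hit message flowing from the drum hardware to the game can be sketched as below. The field names, threshold and normalization are assumptions for illustration; the actual wire format used in the project may differ.

```javascript
// Hypothetical sketch of the drum-hit message sent over WebSockets.
// Piezo readings are assumed to be 10-bit analog values (0..1023).
const HIT_THRESHOLD = 100; // ignore low-level vibration noise (assumed value)

function piezoToMessage(element, reading) {
  if (reading < HIT_THRESHOLD) return null; // not a real hit, send nothing
  return JSON.stringify({
    type: 'hit',
    element, // e.g. 'kick', 'snare', 'tom', 'hihat'
    velocity: Math.min(1, reading / 1023), // normalize hit strength to 0..1
  });
}
```

Thresholding on the sensor side keeps network traffic down to actual hits, so the game loop only ever sees playable events.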

Bass Guitar

The bass guitar was purchased as one of those build-at-home kits, which saved us the work of breaking a new guitar apart.

We actually had 2 bass guitar prototypes: one using frequency analysis of the picks, the other using actual sensors. The second prototype proved to be both more robust and easier to learn to use, and that’s the one we used in the end.

The pick was salvaged from a plastic Guitar Hero controller. The bass guitar was carved out to make space for it inside, along with an Arduino and connections to the triggers placed on the handle bar.

Similar to the drum set, the Arduino would register the values and send a WebSocket message over the Ethernet network to a Node.js server.


The keyboard was a standard MIDI keyboard, sending MIDI values that are converted into WebSocket messages by our Node.js server working as a proxy.
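The proxy conversion amounts to parsing raw MIDI bytes and forwarding only the events the game cares about. This sketch assumes standard MIDI channel-voice framing (note-on is status 0x9n); the JSON shape is invented for the example.

```javascript
// Sketch of the MIDI -> WebSocket proxy logic. A MIDI channel message is
// three bytes: [status, data1, data2]; for note-on that is [0x9n, note, vel].
function midiToMessage([status, note, velocity]) {
  // Note-on with velocity 0 is conventionally a note-off, so skip it too.
  const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
  if (!isNoteOn) return null; // forward only playable note-on events
  return JSON.stringify({ type: 'note', note, velocity });
}
```

Having one proxy shape for drums, keyboard and MPC means the game webpage consumes a single uniform event stream regardless of the instrument.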


The sampler was a standard MPC. We opened it up to replace the lights with colors matching the game and to remove any unused buttons. Similar to the keyboard, the MPC sent MIDI messages that were converted into WebSocket messages by our Node.js server working as a proxy.


In the software department, the whole application is actually a webpage served by a localhost server.

  • The design is hardcoded to FullHD resolution.

  • Web Audio API is used to trigger sounds.

  • WebGL (via pixi.js) is used to give 3d acceleration to the interactive elements.

To sync the tracks, we developed an internal editor to add/move/edit/delete track nodes. It was built for easy scrolling back and forth with the mouse wheel to check the music synchronisation, and it included functionality to import and export as JSON.
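A track in such an editor can be represented as a flat list of timed nodes serialized to JSON. The schema below (time in milliseconds plus a lane index) is an assumption for illustration, not the project’s actual file format.

```javascript
// Hypothetical track-node schema: { time: ms, lane: instrument lane index }.
// Exporting sorts by time so scrubbing back and forth stays consistent.
function exportTrack(nodes) {
  const sorted = [...nodes].sort((a, b) => a.time - b.time);
  return JSON.stringify(sorted);
}

function importTrack(json) {
  // JSON round-trips cleanly, so editor state survives save/load.
  return JSON.parse(json);
}
```

Keeping the format as plain JSON is what makes it trivial to version, diff, and hand-tune a chart when a note lands slightly off the beat.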

An additional requirement was to record the high scores from each session and export them to a local file, for the client to analyse after the roadshow was done.

As previously mentioned, the source code is open source, available under the MIT license. Feel free to fork it to add support for Frets on Fire assets, other controllers, playing music directly from YouTube/SoundCloud, etc.


We delivered the project as a turnkey solution, fully pre-tested and ready to be operated by the roadshow crew alone. They toured several schools over 2 months; we wouldn’t be able to be present at all of the sessions, but were available to provide support when required.

All the hardware was ready to plug and play. We provided a technical rider for the cables, labelled all inputs and duct-taped the unused ports. We also provided an operations and troubleshooting manual, as well as replacement parts for the more sensitive components that might not survive the hard life on the road.

We were present at the initial setup sessions to make sure everything worked smoothly. We caught some early malfunctions and had to repair them, but overall we were impressed by how well the system held up on the road.

Continente Missão Sorriso


Filipe Cruz, André Almeida

Instruments Hacking & Electronics:
Guilherme Martins

Valentine’s Interactive Hanky

Valentine’s Interactive Hanky

It all started with the idea of creating a workshop around the theme of paper circuits, greatly inspired by the work and research of Jie Qi, Liza Stark and others. Above all, the main idea was to transfer skills and knowledge to the team at Casa do Conhecimento, and to show how easy it is to create simple things that are both beautiful and can have a great impact. We had this initial idea, but we were still lacking a concept on the same level as the technology. The workshop was planned for February, which coincided with the Namorar Portugal (Flirting Portugal) event and all the Valentine happenings, and with Vila Verde being the capital of this event, we decided to combine the paper circuits workshop with the famous and acclaimed Lenço dos Namorados (Valentine’s handkerchief).

And so the idea was born of transforming a traditional Valentine’s handkerchief into an interactive object, using different sensors, SMD coloured LEDs, Arduinos and copper tape.

The handkerchief we used was replicated using volume inks to avoid using the original cloth handkerchief.

The circuit was created with a 1:1 printed version.

As it evolved, the work came to resemble the craftsmanship of jewellery and filigree, an old and highly acclaimed Portuguese traditional craft.

After demystifying the process of placing the copper tape and soldering the LEDs, the folks from Casa do Conhecimento were able to finish the circuit on their own while Guilherme prepared the Arduino connections.

The circuit completed. On the top left corner we can see a pressure sensor: depending on how hard it is pressed, the hearts pulsate with higher or lower intensity. On the right corner there is a hidden infrared proximity sensor: when a hand approaches it, the sea waves light up in sequence. In the centre there is a microphone: the blowing intensity lights up the stems and flowers. On the lower left corner there is an LDR sensor, which makes the heart LEDs react to changes in luminosity. Finally, on the lower right corner we have a capacitive sensor: touching it with a finger makes the letter light up and triggers the sound of a little bird on a small speaker.

For each of these sensor and actuator systems we used a Motoruino (our very own Arduino clone). You can see the wiring mess underneath; with such a tight deadline there was no time to tidy up the backstage properly.

Back to the Valentine’s handkerchief: the setup was completed, so all that was left was to showcase it to councillor Júlia Fernandes and the journalists.

To accomplish the video mapping, nothing better than using our very own Einstein VideoPlayer, which, with a few simple modifications, enabled us to project independent videos onto multiple projection areas.


The event ended with a great surprise: we were offered an original Valentine’s handkerchief and a documentary book with the handkerchiefs’ full history and photos. Dozens of different handkerchiefs, all of them wonderful.

It was an epic event! The involvement of all the team behind Casa do Conhecimento, the sharing, the making, the joint creation of an object with immense potential.

Here are some clips from the local online newspapers Terras do Homem and O Vilar Verdense.