Vimana Sequencer

Vimana is a personal interpretation of a Multi-track MIDI Step Sequencer with an Open Philosophy regarding the User Interface.

I have been working on this project for the last 4 years, and today I’m presenting it to the world as it is right now.
Vimana is also my submission to the MIDI Awards 2022.

Below is a list of the main features that are implemented:

16 independent MIDI channels/tracks
Main clock, division/multiplication per track
Step parameters: pitch, velocity, gate, retrigger, repeat, chord, inversions, sustain
Play mode: forward, reverse, pendulum, random, drunk
Built-in Quantizer with many different scales, and the possibility of setting a custom scale
Euclidean Generator (see the sketch after this list)
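
To make the Euclidean Generator entry above a bit more concrete, here is a minimal sketch of one common way to spread k pulses as evenly as possible over n steps. This is only an illustration of the general technique, not the actual Vimana code.

```cpp
#include <cstdio>

// Minimal Euclidean rhythm sketch (illustration only, not the Vimana source).
// Returns true when step i of an n-step pattern holding k pulses should fire;
// the result matches Bjorklund's even distribution up to rotation.
bool euclideanStep(int i, int k, int n) {
    return ((i * k) % n) < k;
}

int main() {
    // Print E(3,8) (the classic "tresillo") and E(5,16) as x/. patterns.
    const int patterns[][2] = {{3, 8}, {5, 16}};
    for (const auto& p : patterns) {
        for (int i = 0; i < p[1]; ++i)
            std::putchar(euclideanStep(i, p[0], p[1]) ? 'x' : '.');
        std::putchar('\n');
    }
    return 0;
}
```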

Vimana stands upon an open philosophy regarding the physical user interface. As a developer and a musician, I wanted to be able to build multiple instruments according to the needs of the project, or of the moment. For this I have created a user interface that is modular from the physical point of view, but also from the software point of view. Let’s say we need a step sequencer with only 5 steps and 3 tracks, or a gate sequencer for drums and rhythms with as many steps, pages and tracks as we like: it is all possible with this system, it’s just a matter of changing variables, and voilà. CPU is the limit here, and since we have a Teensy platform as the brain, we have a lot of free ground to play with.
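
To give an idea of what “changing variables” can look like in practice, here is a hedged sketch of how such a configurable layout could be expressed. The names and fields (NUM_TRACKS, NUM_STEPS, Step, Track, and so on) are hypothetical and not taken from the Vimana firmware.

```cpp
// Hypothetical configuration sketch: these constants and structs are NOT the
// Vimana firmware, they only illustrate describing a sequencer layout with a
// handful of compile-time variables.
#include <array>
#include <cstdint>
#include <cstdio>

constexpr int NUM_TRACKS = 3;   // e.g. a small 3-track instrument...
constexpr int NUM_STEPS  = 5;   // ...with only 5 steps per page
constexpr int NUM_PAGES  = 4;

struct Step {
    std::uint8_t pitch    = 60;    // MIDI note number
    std::uint8_t velocity = 100;
    std::uint8_t gate     = 50;    // gate length as a percentage of the step
    bool         active   = false;
};

struct Track {
    std::uint8_t midiChannel  = 1;
    std::uint8_t clockDivider = 1; // per-track division of the main clock
    std::array<Step, NUM_STEPS * NUM_PAGES> steps{};
};

// The whole instrument is just an array of tracks; a gate-only drum sequencer
// or a melodic sequencer becomes a matter of changing the constants above and
// deciding which Step fields the physical UI exposes.
std::array<Track, NUM_TRACKS> sequencer{};

int main() {
    std::printf("%d tracks x %d steps x %d pages = %d programmable steps\n",
                NUM_TRACKS, NUM_STEPS, NUM_PAGES,
                NUM_TRACKS * NUM_STEPS * NUM_PAGES);
    return 0;
}
```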

Credits:
Concept: Guilherme Martins
Electronics: Guilherme Martins, Tarquínio Mota
Software: Guilherme Martins, Tarquínio Mota
Beta Tester: Ricardo Webbens

In the video below I have an instrument ensemble in Ableton Live, and I’m using Vimana to play them all.

Other instances and prototypes of Step Sequencers, testing different layouts and user interfaces.

The video below shows how it all started, with the 1st prototype of a step sequencer. When I accomplished this concept, I knew I could create something way better.

Drone Engine

I have had an Axoloti Core for a couple of years, but I never put it to good use. The time finally came. I had been thinking about a Drone Machine, something that could be fun and simple to use, with a lot of ground to explore and play. On the other hand, I also wanted to build a generic physical setup that allows me to change firmware and experiment with new sound objects and synth architectures.

The Axoloti Core is a bare-bones PCB built around a powerful DSP, and it is made to be hacked and customized. It has a great community that has been creating a huge amount of amazing sonic objects since the beginning. I don’t think I have invented anything; I only connected ready-made objects like the oscillators and the effects, it is that simple.

This is the physical setup: a Teensy LC with two groups of 16 pots connected to multiplexers. The Teensy is sending MIDI CC messages to the Axo, and all the parameters are being controlled this way. The major drawback is MIDI’s low resolution (0-127), which is noticeable in the oscillators’ pitch. One thing to do in the future is to connect the pitch potentiometers directly to the analog inputs on the Axo.
The black PCB boards I’m using were developed during Artica’s days. Luckily I still have a bunch of these boards with me.
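
For the curious, a minimal Teensy sketch for this kind of setup could look like the following. The pin numbers, the single 16-channel multiplexer and the CC mapping are assumptions for illustration, not the actual firmware of this controller.

```cpp
// Teensy sketch (USB Type set to MIDI): read 16 pots through a 16-channel
// analog multiplexer (e.g. 74HC4067) and send them out as MIDI CC messages.
// Pins and CC numbers are placeholders, not this controller's firmware.
const int SELECT_PINS[4] = {2, 3, 4, 5};  // S0..S3 select lines of the mux
const int MUX_SIGNAL_PIN = A0;            // common analog output of the mux

int lastValue[16];

void setup() {
    for (int i = 0; i < 4; i++) pinMode(SELECT_PINS[i], OUTPUT);
    for (int i = 0; i < 16; i++) lastValue[i] = -1;
}

void loop() {
    for (int channel = 0; channel < 16; channel++) {
        // set the select lines to address one of the 16 pots
        for (int bit = 0; bit < 4; bit++)
            digitalWrite(SELECT_PINS[bit], (channel >> bit) & 1);

        // 10-bit ADC reading scaled down to the 7-bit MIDI CC range (0-127)
        int value = analogRead(MUX_SIGNAL_PIN) >> 3;

        // only send a CC when the pot actually moved
        if (value != lastValue[channel]) {
            usbMIDI.sendControlChange(channel + 1, value, 1);  // CC 1-16, channel 1
            lastValue[channel] = value;
        }
    }
    delay(5);
}
```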

Since the Teensy is connected to the Axo via USB (the Axo has a USB host port), and I didn’t want the USB plug showing on the back panel for several reasons, the USB is soldered directly to the Axo’s PCB.

If I need to reprogram the Teensy, I can still use a modified USB cable that connects to this JST header.

The Axoloti program is very simple: basically there are 2 oscillators based on Mutable Instruments Braids algorithms, and a third, very simple oscillator. Each of the oscillators has a dedicated LFO for its filter, and the Mutable oscillators also have LFOs to modulate color and timbre.
Then all the voices are mixed and sent to a Delay and a Reverb.
If you have an Axoloti and want to experiment with this synth, you can download the patch here.
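
For readers without an Axoloti, here is a rough C++ sketch of the same signal flow: two drone voices, each an oscillator feeding an LFO-swept low-pass filter, mixed and sent through a simple feedback delay. It is not the Axoloti patch, just the same architecture expressed in code (the reverb is left out to keep it short).

```cpp
// Rough sketch of the Drone Engine signal flow: oscillator -> LFO-swept
// low-pass filter per voice, voices mixed, then a simple feedback delay.
// NOT the Axoloti patch, only an illustration of the same architecture.
#include <cmath>
#include <vector>

const float SR     = 44100.0f;
const float TWO_PI = 6.2831853f;

struct DroneVoice {
    float oscHz, lfoHz, baseCutoff;            // parameters
    float oscPhase = 0, lfoPhase = 0, lp = 0;  // state

    float process() {
        float osc = std::sin(TWO_PI * oscPhase);
        oscPhase += oscHz / SR; if (oscPhase >= 1) oscPhase -= 1;

        // the LFO slowly sweeps the filter cutoff around its base value
        float lfo = std::sin(TWO_PI * lfoPhase);
        lfoPhase += lfoHz / SR; if (lfoPhase >= 1) lfoPhase -= 1;
        float cutoff = baseCutoff * (1.0f + 0.5f * lfo);

        // one-pole low-pass filter
        float a = 1.0f - std::exp(-TWO_PI * cutoff / SR);
        lp += a * (osc - lp);
        return lp;
    }
};

int main() {
    DroneVoice v1{55.0f, 0.10f, 400.0f};   // low A drone
    DroneVoice v2{82.4f, 0.07f, 600.0f};   // low E drone

    std::vector<float> delayLine(22050, 0.0f);  // 500 ms delay at 44.1 kHz
    int writePos = 0;

    for (int n = 0; n < (int)SR * 2; ++n) {      // render 2 seconds
        float dry = 0.5f * (v1.process() + v2.process());
        float wet = delayLine[writePos];
        delayLine[writePos] = dry + 0.4f * wet;  // feedback into the delay line
        writePos = (writePos + 1) % (int)delayLine.size();
        float out = dry + 0.3f * wet;            // mix dry + delayed signal
        (void)out;                               // send 'out' to an audio API or file
    }
    return 0;
}
```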

With a MIDI keyboard this synth can be played normally. At the time of this video all 3 oscillators were affected by the keys, but right now only one oscillator receives MIDI information, so I can still play around with the other 2 as drone voices.

If you want to know more about how to make a custom MIDI controller, check my tutorials about MIDI programming and generic IO processing.

Physical Computing Tutorials

This is my humble contribution, showing how to interface with physical stuff.
I’ve been developing physical interfaces for a couple of years, from robotics to electronic musical instruments and other physical devices to be used in interactive installations.
On this YouTube channel, I will be sharing experiments, hoping they will be useful for someone.
Here’s a list of the tutorials I have made so far.

Simple USB MIDI in 3 parts (see the minimal example after this list)

Reading analog and digital inputs using Multiplexers in 3 parts

Reading analog and digital inputs in 3 parts
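
As a small taste of the Simple USB MIDI tutorial, here is a hedged minimal example for a Teensy (with USB type set to MIDI): a single push button sends a note on when pressed and a note off when released. The pin, note and channel numbers are arbitrary placeholders.

```cpp
// Minimal USB MIDI example for Teensy (USB Type: MIDI): one push button
// wired between pin 0 and ground plays middle C. Pin/note/channel values
// are placeholders, and a proper debounce library would be better in practice.
const int BUTTON_PIN = 0;
const int NOTE       = 60;  // middle C
const int CHANNEL    = 1;

int lastState = HIGH;       // INPUT_PULLUP means HIGH = not pressed

void setup() {
    pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
    int state = digitalRead(BUTTON_PIN);
    if (state != lastState) {
        if (state == LOW)  usbMIDI.sendNoteOn(NOTE, 127, CHANNEL);   // pressed
        else               usbMIDI.sendNoteOff(NOTE, 0, CHANNEL);    // released
        lastState = state;
        delay(10);  // crude debounce
    }
}
```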

Modular Physical Computing

I personally enjoy pressing buttons, turning knobs, and seeing and touching physical devices, and if they make any bleeps or bloops it’s even better.

Building physical control systems is not always easy, especially if we need them to be sturdy enough to throw in a backpack, or to take to an installation or a stage performance. This project started at ArticaCC, during a research project named “Interact”, which enabled us to research ways to build interactive systems across software and hardware. These modular systems are easy to solder and easy to interface with, and because they are totally generic they can be used with Arduino, Teensy, Raspberry Pi or any other microcontroller of your choice.

All these modules follow an open source hardware philosophy and have a dedicated repository on GitHub. Feel free to use them as you wish.

Spatial Sound Exploration

Exploring spatial soundscapes with body location.

This prototype is a proof of concept for location-based sound exploration.

Each participant/VIVE tracker has a sound attached to it, and this sound follows the participant inside the room. If the participant moves to a corner, the sound becomes more audible in that specific corner. This video shows a proof of concept in a small room; the same concept can be applied to any room size, or even to large venues.
Follow the video’s link for a full description.
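
For the technically curious, the core of the idea can be sketched as a simple distance-based gain computation per speaker. This is only a hedged illustration of the general approach, with an assumed speaker layout and gain law; it is not the software used in the video.

```cpp
// Hedged sketch of distance-based spatialisation: the closer a tracked
// participant is to a speaker, the louder their sound is on that speaker.
// The corner speaker layout and the 1/(1+d^2) gain law are assumptions.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

float distance(Vec2 a, Vec2 b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Compute one gain per speaker, normalised so the gains sum to 1.
void speakerGains(Vec2 tracker, const Vec2* speakers, int n, float* gains) {
    float sum = 0.0f;
    for (int i = 0; i < n; i++) {
        float d = distance(tracker, speakers[i]);
        gains[i] = 1.0f / (1.0f + d * d);   // louder when closer
        sum += gains[i];
    }
    for (int i = 0; i < n; i++) gains[i] /= sum;
}

int main() {
    // four speakers in the corners of a 5 m x 5 m room
    Vec2 speakers[4] = {{0, 0}, {5, 0}, {0, 5}, {5, 5}};
    Vec2 tracker = {1.0f, 1.0f};            // participant near the first corner

    float gains[4];
    speakerGains(tracker, speakers, 4, gains);
    for (int i = 0; i < 4; i++)
        std::printf("speaker %d gain: %.2f\n", i, gains[i]);
    return 0;
}
```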

Tangible Sequencer

Music sequencers have always pleased me, and I have been around this topic for a while.
This time I have developed what I ended up calling a Tangible Sequencer.
It enables physical interaction with the music, and also allows multiple users. You might want to check the in-depth walkthrough.

Brain, Wider than the Sky

Client
Fundação Calouste Gulbenkian

Curator
Rui Oliveira

Project leader, Interaction design, Software development
Neuron Sculpture detection and interaction
André Almeida

Rigging, Electronics, Setup
David Palma

Neuron Sculpture project leader
Neuron Sculpture vision, concept, sculpture
Eric da Costa

Interactive applications art direction
3D Models, UI design, Brain Orchestra Sound design
Neuron Sculpture Sound design
Filipe Barbosa

Time to Action author
Software development, Project Advisor
Gonçalo Lopes (NeuroGears)

Light equipment design and assembly, Neuron Sculpture interactive sound software
Brain Orchestra software developer
Guilherme Martins

Neuron Sculpture fiberglass
Neuron Sculpture assembly
João Ribeiro

Neuron Sculpture detection software
Project Advisor
João Frazão (NeuroGears)

Neuron Sculpture fiberglass
Neuron Sculpture assembly
José Pedro Sousa

Interactive applications Software developer
Ricardo Imperial

Neuron Sculpture software development
Neuron Sculpture light/electronics development
Interactive applications software development
Tarquínio Mota

Acknowledgements:
Leonel Caldeira

Project full disclosure:
https://medium.com/artica/brain-wider-than-the-sky-f5d7720a5938

Toppi, the Sweet Artist

End Customer
Nestlé Portugal

Agency
J. Walter Thompson Lisboa

Concept
J. Walter Thompson Lisboa

Production
Artica Creative Computing
http://artica.cc

Robot Concept and Extruder Design
Eric da Costa

Construction
Eric da Costa
José Pedro Noronha
João Ribeiro
André Almeida
Guilherme Martins

Software Development
André Almeida
Tarquínio Mota
Ricardo Imperial

Electronics
Tarquínio Mota
André Almeida
Guilherme Martins

Interaction Design
Nuno Lourenço
André Almeida

Sound, Motion Graphics, Robot animations
Filipe Barbosa
Nuno Lourenço

Project full disclosure:
https://medium.com/artica/toppi-the-sweet-artist-71491307aade