Radar For Interactive Notation System, [any number of instruments] (2020)
Tap Dripping Beats [study] (2020)
This little study is an attempt to construct an electronic piece in real time by improvising with modular synthesizers alone.
Ebloki Live Audiovisuals [SuperCollider + Processing] (2014)
I don’t call it “acousmatic”. I have always wanted to make an electronic piece where the sound sources are revealed to the audience. Here I am using audio/video files which are sequenced live according to a set of rules. Apart from the overall structure, which is predetermined, there is a significant amount of unpredictability in every performance. I found it interesting to work with audio/visual snippets that succeed one another rapidly, often close to the threshold of what the eye can comfortably identify as isolated events. In the time domain the ear proves more intelligent than the eye, having far greater resolution.
The physical interaction between painting and sound is the primary concern of this work. The audience is free to examine two paintings from any distance. Apart from the obvious visual aspect, the artwork has a hidden sonic aspect as well: for each painting a sound sculpture has been constructed that corresponds to its particular drawings. The viewer becomes a listener by touching the artwork. Any movement along the horizontal axis that comes closer than 2 cm to the surface of a painting is traced by infrared sensors, revealing the sonic entities that correspond to that exact position on the painting. When the hand is removed from the drawing, the sound stops immediately. There is always a one-to-one relationship between the visual and its sonic representation, and vice versa: for every gesture in each painting there is a hidden sound world waiting to be discovered through tactile interaction, so the artwork is considered unfinished without the contribution of the audience. The way the system has been designed allows users to unfold the piece of music at their own pace, perhaps also discovering through sound elements that the eye overlooked.
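The mapping described above can be sketched roughly as follows. This is a hypothetical illustration in Python, not the installation's actual code: the canvas width, number of sound regions, and function names are all invented for the example; only the 2 cm threshold and the one-to-one position-to-sound mapping come from the description.

```python
TRIGGER_DISTANCE_CM = 2.0  # sound plays only when the hand is closer than this
CANVAS_WIDTH_CM = 100.0    # assumed canvas width for the sketch
NUM_REGIONS = 8            # assumed number of sonic entities per painting

def active_region(position_cm, distance_cm):
    """Return the index of the sound region under the hand, or None
    when the hand is beyond the threshold (sound stops immediately)."""
    if distance_cm >= TRIGGER_DISTANCE_CM:
        return None
    # one-to-one mapping from horizontal position to a sound region
    index = int(position_cm / CANVAS_WIDTH_CM * NUM_REGIONS)
    return min(max(index, 0), NUM_REGIONS - 1)
```

A sensor loop would poll the infrared readings, call a function like this, and start or stop the corresponding sound as the returned region changes.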
Absolut Athens Live Electronics (2009)
On a Saturday night in early December 2008, a Greek policeman shot dead a fifteen-year-old student in the centre of Athens. It was the spark for the riots that broke out throughout the country and shook Europe. The piece was put together in SuperCollider in January, shortly after the major events, using real sounds from the demonstrations. The decision to use a programming language to randomly select and schedule the sonic events is directly related to the nature of the piece, which imitates the behavior of an autonomous crowd and tries to convey the anger of the riots. Even though it incorporates a significant amount of randomness, the piece is composed in a way that keeps its overall form fixed; there are simply ‘better’ or ‘worse’ performances. The duration of the piece (ca. 3:30) is also indicative: since it is generated by code, the same structure can easily be extended or shortened.
All the images and most of the audio recordings have been fished from the Internet. Unfortunately, I am no longer able to track down the original sources. Any use is for non-commercial purposes. Some code is based on an idea by Julio.
Transduction #2 Networked Laptop Trio, Visuals and soundfiles (2008)
networking by Renaud, visuals by McClelland
[watch a rehearsal video]
Sign installation (2011) // corner of Patision & Tositsa (it remained in place for 4 days)