All posts tagged with 'hard/software' sorted by date.
An algorithmic musical instrument designed for performing electronic music live.
About three years ago I set out to design an electronic instrument that would fulfill all my music creation and performing needs. Eventually this became the PolyPulse. Recently, I've been able to start using the PolyPulse in live performances:
Software to stream video feeds from unprotected IP cameras from all over the world to a Raspberry Pi.
The system streams from a database of IP cameras which are not protected by a password. In a continuous loop, the Python script selects a random IP address from the database and streams it for about half a minute; the stream is then closed and another random stream is selected. The script runs on Raspberry Pis, requiring just power and an internet connection. It starts on boot and keeps track of which IP addresses still work, storing those in a so-called curated list.
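The loop could be sketched roughly like this in Python; `try_stream` stands in for the actual streaming call, and all names are illustrative rather than the real code:

```python
import random

def rotate_streams(addresses, try_stream, duration=30, rounds=None):
    """Repeatedly pick a random camera address, stream it for
    `duration` seconds, and keep a curated list of addresses that
    still respond. `try_stream(url, duration)` is a placeholder for
    the real streaming call and should return True on success."""
    curated = []
    i = 0
    while rounds is None or i < rounds:
        url = random.choice(addresses)
        if try_stream(url, duration):
            if url not in curated:
                curated.append(url)   # address still works: curate it
        elif url in curated:
            curated.remove(url)       # stream died: drop it again
        i += 1
    return curated
```

In the real script `rounds` would be unbounded and the curated list persisted to disk across reboots.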
The images streamed from these IP cameras often show some form of surveillance, such as hotels, beaches, offices, backyards or marinas, but sometimes also industrial equipment or panoramic views.
The Automatic Stream Retrieval System was developed for the third iteration of panOptical by Roel Weerdenburg. The images presented on this page depict the Automatic Stream Retrieval System used as part of panOptical as presented at Ars Electronica 2020 and GOGBOT 2020.
An instrument commissioned for a music theater play which plays various sounds when a ball rolls past the sensor.
The instrument, specifically designed for the music theater play 'Sounds Like Juggling', can be attached to a large ball run and plays various samples based on the size of the ball.
The system uses a distance sensor to detect and measure balls, and sends a message over WiFi to a Raspberry Pi which plays the samples. Different samples can be loaded into the device, and the sensor has two sides, each with a different set of samples assigned. Both sides of the sensor can be individually disabled, and the sensor runs off a powerbank fitted in the handle.
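A minimal sketch of the sample selection, assuming a closer distance reading means a larger ball; the thresholds and sample layout are made up for illustration:

```python
def pick_sample(side, distance_mm, samples, enabled=(True, True)):
    """`samples` maps side (0 or 1) to a list of files ordered from
    small to large ball. The size thresholds in mm are illustrative,
    not the instrument's real calibration."""
    if not enabled[side]:
        return None                # this side of the sensor is disabled
    if distance_mm < 60:           # ball surface close: big ball
        size = 2
    elif distance_mm < 120:        # mid-range: medium ball
        size = 1
    else:                          # far away: small ball
        size = 0
    return samples[side][size]
```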
Several electroacoustic instruments have been developed for this play. In the process of designing instruments for the play I provided consultancy on technical matters.
Sounds Like Juggling is a production by Arthur Wagenaar and Guido van Hout, also featuring Dianne Verdonk, Aleš Hrdlička and Joeri Vos.
Partners: Stichting Goed Bezig! with Veenfabriek, Gaudeamus Muziekweek, Korzo, Werkplaats Diepenheim.
A part physical, part digital standalone performance instrument for live improvisation. The core principle of the instrument is transforming the audio signal inside a feedback path.
Vibrations picked up by a piezo element are analyzed by an algorithm controlling an oscillator. The oscillator is output via a speaker which in turn sets the spring in motion. With direct control over parameters like the oscillator range and delay time the seemingly unstable feedback can be guided to specific frequencies and sounds.
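The analysis algorithm itself isn't described in detail, but the idea can be sketched with a crude zero-crossing pitch estimate steering the oscillator within the performer-set range (a stand-in, not the instrument's actual analysis):

```python
def estimate_freq(samples, sr):
    """Rough pitch estimate from rising zero crossings; a stand-in
    for the real analysis of the piezo signal."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings * sr / max(len(samples) - 1, 1)

def steer_oscillator(freq, lo, hi):
    """Clamp the tracked frequency into the performer-set oscillator
    range, so the feedback can be guided toward specific frequencies."""
    return min(max(freq, lo), hi)
```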
The instrument is played by physically interacting with the spring, manipulating the toggles controlling signal flow and the faders mapped to parameters of the algorithm.
A secondary signal path is sent to external amplification, carrying a processed version of the signal created by the feedback path. This path adds a contrasting layer to the local and raw sound of the feedback spring. Ideally, the secondary signal is spatialized with a multispeaker setup in a large geometric shape around the performer.
Performances played with Pandora’s Box are improvised in nature but follow one principle. Metaphorically Pandora’s Box opens halfway through the performance. At that point the sound is not constrained by the single speaker inside the box and is spatialized in the surrounding space with some form of external speakers.
For me, the key value in improvising is having fun and surprising people with a relatively simple looking instrument.
An installation in which people are invited to join in creating together. While listening to the stimulating music people gaze into a detailed network of threads and add to it using the spool of thread they were given.
The project is part of the PQ Waltz, an initiative by Trudi Maan, Henny Dörr and Anne Habermann where the international community was invited to slow travel to the Prague Quadrennial of Performance Design and Space.
t r e e s t a r t i m e was conceptualized and realized by Ward Slager, Ivo Smit, Jitte van Veen and Pleun Verhees under supervision of the University of the Arts Utrecht. The project includes music by Jitte and Ward, texts by Ivo, clothing by Pleun, a website and a timelapse video by Ward.
Article cover photo by Eva Neužilová
An interactive installation built for the 2 Turven Hoog toddler festival: a physical and sonic landscape built with large cardboard cylinders.
Buizerd encourages both the toddlers and the parents to explore how things interact, feel, sound or vibrate. The explorer is rewarded for their explorations with auditory and visual feedback.
The installation has four sound sources which react to the explorer using pressure and light sensors. Three speakers are hidden inside the cylinders and one surface transducer is attached to a lid. One of the taller cylinders has LED lights inside which react to sound. The cylinders have different finishes: artificial grass, yoga or rubber mats and tied rope.
Designed and built together with:
A generative musical instrument, made together with Fedde ten Berge, with a complex mapping interface based on user-made presets. The synthesis algorithms generate patterns with a pseudo-random quality that serve as a distinct musical language.
The instrument has six tracks of sound generation using the Pseudo Random Pulse algorithm written by Fedde ten Berge. I've reimplemented the algorithm in C++ to run it on a Bela. Together we've expanded upon the algorithm and added a pulse looper to record and play back patterns. The recorded patterns can be quantized by a certain percentage to blend between the algorithm's irregularity and the quantized regularity.
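The percentage quantization could be sketched like this, pulling each recorded pulse time partway toward the nearest grid point (a sketch of the idea, not the Bela implementation):

```python
def quantize_blend(times, grid, amount):
    """Pull each recorded pulse time toward the nearest grid point.
    amount=0.0 keeps the loose recorded timing, 1.0 snaps fully to
    the grid; values in between blend the two, mirroring the
    percentage control described above."""
    out = []
    for t in times:
        nearest = round(t / grid) * grid
        out.append(t + (nearest - t) * amount)
    return out
```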
With RGB LED encoders and an LCD display you can create sounds, which can then be stored in one of the four corners of a joystick. Each track has its own joystick, allowing for seamless morphing between complex sounds.
12 CV inputs can be used to override the X and Y axes of each joystick. An ethernet connection listens for OSC messages containing a new set of notes. Together with the 6 clock CV outputs, this enables easy integration with other gear, like a laptop running custom software built in Max/MSP, Pure Data or SuperCollider, or it can be hooked up to your nuclear launch systems.
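For illustration, an OSC message carrying a note set can be encoded with nothing but the standard library; the `/notes` address and int32 payload here are assumptions, not the instrument's documented protocol:

```python
import struct

def osc_message(address, ints):
    """Encode a minimal OSC message with int32 arguments, as could be
    sent to the instrument's ethernet port."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte multiples
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(b"," + b"i" * len(ints))   # type tag string, e.g. ",iii"
    for n in ints:
        msg += struct.pack(">i", n)       # big-endian int32 arguments
    return msg
```

In practice a library like python-osc would handle this encoding and the UDP transport.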
Lastly, it has audio-synced spatialisation control for The Pentacle 15.3 system, so that each new note can be placed in a different location in the room.
The instrument was used in performances at:
PRP algorithms & sound design
Fedde ten Berge
Embedded software implementation
Casing & hardware
Fedde ten Berge & Ward Slager
A live coding performance at EKKO Utrecht using Pure Data.
HKU // EKKO // Uncloud // FIBER // Creative Coding Utrecht // Oct 2018
A network installation where multiple computers together play a single song. The visuals are a reference to the stereotypes surrounding computers, hacking and other cyberwizardry.
All source files used in this project can be found on GitHub
A paint tool concept where pixels near the cursor move and leave trails of bright colors behind.
Click here to try it out yourself!
An interactive multispeaker sound installation which uses activity sensors to generate soundscapes. The activity as tracked by the webcam guides the system through several 'stages of intensity' in the soundscape.
The activity is tracked by comparing video frames using Processing. The data is then sent to Pure Data, which synthesizes the dreamy, dark soundscape. The soundscape is played back over four speakers, allowing movement in a certain area to be reflected in the soundscape.
The installation was set up at the entrance of the University of the Arts Utrecht.
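The frame-comparison idea boils down to summing pixel differences between consecutive frames; a minimal sketch (frames as plain brightness grids, with an illustrative threshold for the intensity stages):

```python
def frame_activity(prev, curr):
    """Sum of absolute per-pixel brightness differences between two
    frames, each a 2D list of 0-255 values."""
    return sum(
        abs(a - b)
        for row_p, row_c in zip(prev, curr)
        for a, b in zip(row_p, row_c)
    )

def intensity_stage(activity, thresholds=(100, 1000)):
    """Map raw activity onto a stage index; the thresholds are
    illustrative, not the installation's tuning."""
    return sum(activity >= t for t in thresholds)
```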
All source files used in this project can be found on GitHub
A Python command-line application which generates MIDI drumbeats in exotic time signatures. You choose a BPM, a time signature and the density of drum hits, and it generates a beat!
The application allows you to input various parameters like the BPM, the time signature and the drumkit used to preview the generated drumbeat. These drumbeats can then be exported as MIDI files to be used in your favorite DAW!
The algorithm which generates the beat works like this:
Before the algorithm can begin, it first has to know certain things:
- Which drumkit is used to preview the beats (this does not affect beat generation)
- How many triggers long is the beat?
- How many of those triggers fit in one quarter note?
- At what BPM will the beat be generated?
Finding the 'important beats'
The algorithm will first decide which hits are important, to ensure the beat actually feels like a beat. It does this by choosing one of three options from the correct row. The result is then shuffled.
Let's say the algorithm chooses [3,2,2] because we chose 7 triggers per measure. This will result in a rhythm consisting of the following triggers:
Now if we shuffle the result, the following variations can be made:
|x|-|-|x|-|x|-|  (3 2 2)
|x|-|x|-|-|x|-|  (2 3 2)
|x|-|x|-|x|-|-|  (2 2 3)
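The expansion and shuffling above can be sketched as:

```python
import random

def groups_to_pattern(groups):
    """Expand a grouping like [3, 2, 2] into a trigger pattern:
    an 'x' starts each group, '-' fills the rest of it."""
    pattern = []
    for g in groups:
        pattern.append("x")
        pattern.extend("-" * (g - 1))
    return pattern

def shuffled_variant(groups):
    """Shuffle the groups to get one of the variations shown above."""
    g = groups[:]
    random.shuffle(g)
    return groups_to_pattern(g)
```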
Assigning the 'important beats'
Now that we've selected the important beats they are distributed over the kick and snare drum sequences using the following chances:
- 20%: both kick and snare
Fine tuning the results
Before generating the hihats it goes through a few more actions:
- Add a kick at the first beat.
- Remove any snares from the first beat.
- Check if there are any snares at all.
- Insert a snare at the last important beat if no snares are found.
- Insert a random kick after each kick that was not preceded by another kick.
Generate the hihats
The algorithm will now generate the hihats by either:
- randomly adding a hihat or kick after each trigger without kick or snare
- adding a hihat after each trigger with a snare
Now that generation is done, the beat will start playing using the built-in drumkits. The user has options to regenerate, write to MIDI or change the generation parameters.
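The assignment and fine-tuning steps could be sketched as follows; only the 20% 'both' chance is given above, so the split between kick-only and snare-only is an assumption, and the extra-kick insertion step is omitted:

```python
import random

def fill_drums(important, p_kick=0.4, p_snare=0.4, p_both=0.2):
    """Distribute the important beats over kick and snare, then apply
    the fine-tuning rules described above. `important` is a list of
    booleans, one per trigger."""
    n = len(important)
    kick = [False] * n
    snare = [False] * n
    for i, imp in enumerate(important):
        if not imp:
            continue
        r = random.random()
        if r < p_both:
            kick[i] = snare[i] = True      # both kick and snare (20%)
        elif r < p_both + p_kick:
            kick[i] = True
        else:
            snare[i] = True
    kick[0] = True                          # add a kick at the first beat
    snare[0] = False                        # remove snares from the first beat
    if not any(snare):                      # ensure at least one snare
        last = max(i for i, imp in enumerate(important) if imp)
        snare[last] = True
    return kick, snare

def fill_hihats(kick, snare):
    """Hihats: always one on a snare trigger; empty triggers get a
    random hihat (the real version may also drop in an extra kick)."""
    n = len(kick)
    hats = [False] * n
    for i in range(n):
        if snare[i]:
            hats[i] = True
        elif not kick[i]:
            hats[i] = random.random() < 0.5
    return hats
```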
The source code can be found on GitHub
A free stereo delay Max4Live plugin. The delay allows mixing the left and right channel feedback paths. Each channel also has delay time modulation and diffusion.
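The channel-mixing feedback idea can be sketched as an offline process; `cross` blends each channel's feedback between itself and the other side (delay time modulation and diffusion are omitted, and this is an illustration of the concept rather than the plugin's code):

```python
def cross_delay(left, right, delay, fb=0.5, cross=0.5):
    """Stereo delay with mixable feedback paths: cross=0 keeps each
    channel's feedback to itself, cross=1 feeds it fully into the
    other channel. `delay` is in samples."""
    n = len(left)
    out_l, out_r = [0.0] * n, [0.0] * n
    for i in range(n):
        dl = out_l[i - delay] if i >= delay else 0.0
        dr = out_r[i - delay] if i >= delay else 0.0
        out_l[i] = left[i] + fb * ((1 - cross) * dl + cross * dr)
        out_r[i] = right[i] + fb * ((1 - cross) * dr + cross * dl)
    return out_l, out_r
```

With `cross=1.0` an impulse in one channel produces a ping-pong echo between the two channels.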
An electronic track composed entirely in SuperCollider. Several sounds are shaped by randomness, which makes every playback slightly different!
The source code can be found on GitHub