Today I woke up dreaming about my final major project; it's safe to say I'm excited for the coming months of hard work. Although that may sound sarcastic, it's not! I got in early at the studio to play around with TouchDesigner. I believe this will be the most challenging part of my generative audio/visual environment, and since I want the image and the audio to be tightly synchronised, it's more challenging still. I've also heard from my colleagues on the Digital Music & Sound Art course that the MIDI interfacing is still in a beta stage and isn't easy to use. Another worry of mine is switching between different kinds of visuals.
One way of creating some synchronisation is to use the audio to affect the visuals, which I did in my Practice 7 module. That was a team project and the visuals were outsourced. (More on that here). I did some of that today, and it was OK; you could tell the audio was having an effect, but it was pretty delayed. I'll play around more with this since it could be an effective way of manipulating the visuals. Here are some examples of the work I did today. To get myself going I followed a Matthew Regan tutorial, although I don't fully understand it yet. Link > here <
When I got home I worked between two tasks:
Setting up PureData to communicate with Logic
And setting up a quadraphonic speaker system in my bedroom (I definitely don't have the room for it)
At the beginning of my session I thought I was having some success with triggering MIDI in Logic, and I felt well on my way to getting it working: I had some notes triggering a sampled marimba. I then set up my quadraphonic system fairly quickly; it took me just under an hour. Next I had ctlouts changing surround panner values, which was pretty effective. I used the metro and random objects to give me values, and the line object smoothed the transition from value to value. This moved the sound around the space very well. It inspired an idea for the colour yellow: I thought I could have two rhythmical elements chasing each other around the room. But I digress.
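The metro, random and line chain is easy to sketch outside of Pd too. Here's a minimal Python stand-in for what the patch is doing; the ramp time and step size are my own assumptions, not values from the actual patch:

```python
import random

def line_ramp(current, target, ramp_ms, step_ms=10):
    """Rough stand-in for Pd's [line] object: yields values moving
    linearly from current to target over ramp_ms, one per step_ms."""
    steps = max(1, ramp_ms // step_ms)
    delta = (target - current) / steps
    for i in range(1, steps + 1):
        yield current + delta * i

# [metro] fires, [random 128] picks a new panner target, and [line]
# smooths the jump so the sound glides rather than teleporting.
pan = 64.0
for _ in range(4):                       # four metro ticks
    target = random.randint(0, 127)      # new random surround panner value
    for pan in line_ramp(pan, target, ramp_ms=500):
        pass                             # each value would go out via a ctlout
```

Without the line stage the panner would jump instantly between random positions; the ramp is what makes the sound appear to travel through the room.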
After getting the surround panning to work, when I was most hopeful, I hit what appeared to be an unbreakable brick wall. When I added a second instrument channel and tried to input data from PureData, many difficult bugs occurred. I could get one MIDI channel to play notes, but any more than that and the notes cut in and out, almost like they were phase cancelling, but with MIDI notes. Very weird. I also couldn't send ctlouts to more than one channel. After some time I decided to quit and move back to Ableton. I still wanted the surround, so I followed a tutorial by Eric Kuehnl which shows you how to get surround in Ableton 9. View > here <. This was effective, although I didn't get around to programming any movement from PureData. That is on tomorrow's agenda.
Written by Jack Cleary
Today I worked on the lighting for my final major project at the University of Brighton. The Digital Music & Sound Art course has DMX lighting installed in one of the studios, and using PureData the technicians were able to control the colours and intensity of the lights from the computer.
I plan on using generative processing in PureData to change the colours of the lighting. PureData will also generate the music and visuals (powered by TouchDesigner). All three elements will be used to create a variety of moods that can, in theory, fluctuate forever. Some good progress was made today in setting the values; next time I work I will be randomly generating movement for the lights. Watch >here<.
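To give an idea of what I mean by randomly generating movement for the lights, here's a tiny Python sketch of one possible approach. I'm assuming a fixture that takes one 8-bit value (0-255) per red/green/blue channel; the step size is arbitrary:

```python
import random

def drift(value, max_step=8):
    """Nudge one DMX channel value by a small random amount,
    clamped to the 8-bit range a fixture expects."""
    return max(0, min(255, value + random.randint(-max_step, max_step)))

# Start mid-range and let the colour wander, one step per metro tick.
colour = {"red": 128, "green": 128, "blue": 128}
for _ in range(100):
    colour = {ch: drift(v) for ch, v in colour.items()}
    # in the patch, each new set of values would be sent to the DMX fixture
```

A small random walk like this gives slow, organic colour movement rather than jarring jumps, which suits the mood-based idea of the piece.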
Another thing I wanted to talk about is some inspiration. I was listening to Al Gromer Khan and Amelia Cuni's collaboration Monsoon Point whilst having a bath. Perfect music for relaxing to. The music complemented the sensation of the hot water surrounding my body. The experience is euphoric, almost like tripping: the heat makes your heartbeat faster and stronger, and this becomes a pulse that you feel through your body, a rhythmical element to the music. After some time you start to hallucinate, and light flickers in a way you can't describe. Amelia's voice was especially breathtaking; it's soothing and otherworldly. There's something about the experience of the bath I would like to capture in the piece. I imagine it would fit with orange lighting and some pulsating visuals and sound. I will record some vocals and generate them in the patch. I'll do this by placing audio clips in a sampler in Logic, generating MIDI notes from PureData and sending them to the sampler.
Written by Jack Cleary
I've been thinking about the concept for my final major project a lot over the last few days, and about how it will be perceived by a general audience. There are two essential pillars to my idea: the infinite, and fluctus, the Latin word for wave.
The infinite is expressed through the near-limitless combination of generative compositional states. A compositional state is a concept that I'm testing. Its purpose is to describe a musical environment in a few words. For example, a piece of music may sound busy and harmonically exciting; these characteristics describe the state the music is in. My idea is to create 24 of these compositional states, with each state able to transition to 8 other states. This form creates a near-limitless number of combinations of compositional states. Each state will also have an element of randomness to it, which means the amount of variation in the installation is infinite.
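The state graph is easy to sketch in code. In this hypothetical Python version each of the 24 states is wired to 8 randomly chosen neighbours just to illustrate the structure; in the real patch I'd choose the transitions by hand so that musically related states connect:

```python
import random

NUM_STATES = 24
NEIGHBOURS = 8

# Each compositional state gets a fixed set of 8 states it may move to.
transitions = {
    state: random.sample([s for s in range(NUM_STATES) if s != state],
                         NEIGHBOURS)
    for state in range(NUM_STATES)
}

def next_state(current):
    """Chance operation: pick the next state from the allowed neighbours."""
    return random.choice(transitions[current])

# Walk the graph; the path never ends, and randomness inside each state
# means no two visits would sound identical.
state = 0
path = [state]
for _ in range(10):
    state = next_state(state)
    path.append(state)
```

This is the flux-and-logos idea in miniature: the walk itself is unpredictable, but the transition table is the fixed set of rules governing it.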
The next layer of the concept is waves. I was inspired by a Michael Pierce lecture that I've linked below. In the talk Pierce discusses Heraclitus' two main concepts, eternal flux and eternal logos. Without going into much detail: things are constantly in flux, but they are governed by rules, and those rules are the logos. This is a central idea in my piece; the music never repeats itself exactly, but it is governed by a set of rules.
On a less philosophical level, the wave aspect is also connected to the medium. Sound and light are the two elements that will change from state to state, and in physics both sound and light are waves.
Through these thoughts I've been able to summarise the project as:
Generative audio/visual installation with near limitless combination of compositional states.
I'm becoming aware that I need to start conceptualising the different compositional states, so I'll post some material on that in the near future.
Written by Jack Cleary
At this moment in time I am working on my Final Major Project for university. The project is a generative audio/visual installation which uses brightness and colour as a compositional tool. Both the audio and the visuals can be described as bright or dark; both the colour wheel and the circle of fifths have a bright side and a dark side. This idea was inspired by Adam Neely's video "Why is major 'Happy'" (Link to Video). I intend on using brightness, colour and sound to create moods. Colour will symbolise emotions, such as blue being sadness/depression. Sound will act to support the visuals. The generative processes will be realised through PureData, which will control both the audio and the visuals.
John Cage is an influential figure in avant-garde composition, and chance operations are a defining feature of his music. His work has influenced many artists, me included. In my first year at Brighton University on the Digital Music & Sound Art course I created the composition "Magnetic Soundscape", in which I used chance operations to determine what actions to take.
In my latest work for my final major project I use the same idea of chance, but this time I use PureData (a node-based visual programming environment) to randomly generate MIDI data. This data is then sent to Ableton, triggering notes and control values.
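As a sketch of the kind of data the patch produces, here is a hypothetical Python version of one generative tick; the note range, velocities and probabilities are illustrative, not taken from my actual patch:

```python
import random

def generate_event():
    """One generative 'tick': maybe a note, always a control value."""
    event = {}
    if random.random() < 0.5:                      # chance operation: note or rest
        event["note"] = random.randint(48, 72)     # roughly C3 to C5
        event["velocity"] = random.randint(40, 110)
    event["cc"] = random.randint(0, 127)           # e.g. a surround panner value
    return event

# In the patch, these values would be sent on to Ableton as MIDI.
events = [generate_event() for _ in range(16)]
```

The coin-flip between note and rest is the Cage-style chance operation; the constrained ranges are the rules that keep the randomness musical.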