What would the bass look like? What would it be like to touch it and manipulate it directly and visually in real-time? These are some of the things I am trying to get at in this sketch.
I really wanted the form to be about the characteristics of the sound rather than something that merely reacts to the audio signal, like the now-standard FFT-based sound visualizer. For example, the filter is being modulated here, and when the rate of the modulation increases, so does the rate of vibration of certain aspects of the form. Easy to see… hard to explain.
I have created a spline-based 3D form in Processing to represent the bass frequency and set it in motion. The form is more about movement than anything else. It is constantly in flux.
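The basic idea is just a closed loop of control points that never sit still. Here is a minimal sketch of that in plain Java — the point count, radius, and wobble terms are illustrative, not the values from the actual Processing sketch:

```java
public class FluxForm {
    // Control points start on a circle; each frame a per-point sine
    // offset pushes them in and out so the form is constantly in flux.
    public static double[][] controlPoints(int n, double radius, double t) {
        double[][] pts = new double[n][2];
        for (int i = 0; i < n; i++) {
            double angle = 2 * Math.PI * i / n;
            // wobble term: each point breathes at its own phase
            double r = radius * (1 + 0.2 * Math.sin(t + i));
            pts[i][0] = r * Math.cos(angle);
            pts[i][1] = r * Math.sin(angle);
        }
        return pts;
    }

    public static void main(String[] args) {
        double[][] p = controlPoints(8, 100, 0.0);
        System.out.println(p[0][0] + ", " + p[0][1]); // first point starts at (100, 0)
    }
}
```

In Processing, points like these would be fed to something like curveVertex() each frame so the spline smooths over the moving control points.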
I am not analyzing the audio here at all. I am only using the settings on the bass synth to drive the visuals. I broke out certain “effective” parameters from the NI Massive bass patch and worked with those to define the visual characteristics of the entity. I am looking at an LFO modulation of the filter here, a bitcrush effect, a sample-and-hold effect, feedback, an oscillator phase parameter, and an FM synthesis control. This gives me a huge palette to work with. I should say I am also sending MIDI note information to Processing and having that vibrate the form as well.
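The visual entity just reads a bag of parameter values each frame. As a rough sketch of the MIDI-note-vibration part — the field names and the decay constant here are my own shorthand, not the actual Massive patch labels:

```java
public class SynthState {
    // Parameters broken out from the synth patch (illustrative names)
    public double lfoRate, bitcrush, sampleHold, feedback, oscPhase, fmAmount;
    public double vibration; // impulse from incoming MIDI notes, decays each frame

    public void noteOn(int velocity) {
        // a louder note kicks the form harder
        vibration = velocity / 127.0;
    }

    public void update() {
        vibration *= 0.9; // exponential decay back to rest
    }

    public static void main(String[] args) {
        SynthState s = new SynthState();
        s.noteOn(127); // full-velocity note
        for (int i = 0; i < 10; i++) s.update();
        System.out.println(s.vibration); // mostly settled after 10 frames
    }
}
```

Each note-on bumps the vibration value, the draw loop reads it to shake the form, and it decays back toward rest between notes.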
Once the visual starts making sense, I can begin interacting with it and finding musical ways to engage it or play it like an instrument.
A single touch here simply rotates the form in 3D. This seemed really important to me for some reason, and that could easily change down the road, but for the moment it seems like a really good idea to be able to navigate the form in 3D. It feels like it brings some kind of clarity to the whole thing that is hard to find otherwise.
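The single-touch rotation is just a linear remap of the finger position to two rotation angles. Something like this — the specific angle ranges are an assumption on my part:

```java
public class TouchRotate {
    // Linear remap, equivalent to Processing's map()
    public static double map(double v, double inLo, double inHi,
                             double outLo, double outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    // One finger orbits the form: x drives yaw, y drives pitch.
    // Returns { yaw, pitch } in radians for a w-by-h screen.
    public static double[] touchToRotation(double x, double y, double w, double h) {
        double yaw   = map(x, 0, w, -Math.PI, Math.PI);
        double pitch = map(y, 0, h, -Math.PI / 2, Math.PI / 2);
        return new double[] { yaw, pitch };
    }

    public static void main(String[] args) {
        double[] r = touchToRotation(512, 384, 1024, 768); // screen center
        System.out.println(r[0] + ", " + r[1]); // center means no rotation
    }
}
```

In the draw loop those angles would go straight into rotateY() and rotateX() before drawing the form.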
When there are two touch points registered, the distance between the two points controls the LFO modulation rate. In other words, when the fingers get closer together the wobble speeds up, and when the fingers spread apart the wobble slows. After playing with this a lot of different ways… this just felt right. In this two-finger mode I have also defined bitcrush and sample-and-hold effects. When the left touch point goes toward the bottom of the screen, the bitcrush effect is turned up.
When the right touch point goes to the lower part of the screen, the sample-and-hold effect is turned up. The relationship of these y positions is also affecting the feedback of the sound. These effects are also breaking up the visual and adding randomness to the elements that make up the visual. I found it necessary to sort the points horizontally so that the leftmost point always controls the bitcrush, even when the touch points cross. This is kind of hard to explain, but it is important because otherwise you are constantly trying to figure out which finger is doing what.
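The whole two-finger mapping, including the horizontal sort, fits in a few lines. The exact rate range and the feedback formula below are my illustrative choices, not necessarily what the sketch uses:

```java
public class TwoTouchMode {
    static double map(double v, double inLo, double inHi,
                      double outLo, double outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    // p1, p2 are {x, y} touch points; w, h are screen size; y grows downward.
    // Returns { lfoRate, bitcrush, sampleHold, feedback }.
    public static double[] control(double[] p1, double[] p2, double w, double h) {
        // sort horizontally so the leftmost finger always owns the bitcrush,
        // even when the touch points cross
        double[] left  = p1[0] <= p2[0] ? p1 : p2;
        double[] right = left == p1 ? p2 : p1;

        // closer fingers = faster wobble, so distance maps inversely to rate
        double d = Math.min(Math.hypot(p1[0] - p2[0], p1[1] - p2[1]), w);
        double lfoRate = map(d, 0, w, 8.0, 0.25); // Hz range is illustrative

        double bitcrush   = left[1]  / h; // lower on screen = more crush
        double sampleHold = right[1] / h; // lower on screen = more sample & hold
        // one possible reading of the y-position "relationship": their average
        double feedback   = (left[1] + right[1]) / (2 * h);
        return new double[] { lfoRate, bitcrush, sampleHold, feedback };
    }

    public static void main(String[] args) {
        // left finger at the bottom edge, right finger at the top edge
        double[] out = control(new double[]{100, 768},
                               new double[]{900, 0}, 1024, 768);
        System.out.println(out[1] + " " + out[2]); // full bitcrush, no sample & hold
    }
}
```

Because the sort happens every frame, crossing your fingers never swaps which effect each one controls, which is exactly the "which finger is doing what" problem described above.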
This is a great moment for me. I really like the huge variety of sounds and visuals that I am getting with the interface in this mode. There is also this cool dramatic moment when both touch points slide to the bottom of the screen and things really melt down.
When there are three touch points I begin affecting oscillator phase and FM synthesis as well as feedback. This is still a little rough and not completely under control at this point. It is still a lot of fun to hear these effects, and I need to figure out a more clever way to work with three touch points.
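One direction I could imagine for taming three points — and this is purely a hypothetical sketch, not what the sketch currently does — is to derive a few stable scalars from the triangle the fingers make, so no single finger "owns" a parameter:

```java
public class ThreeTouchMode {
    // Hypothetical mapping: reduce the triangle formed by three fingers
    // to three scalars that could drive fm, feedback, and phase.
    public static double[] control(double[] a, double[] b, double[] c) {
        // area via the shoelace formula; grows as the fingers spread
        double area = Math.abs((b[0] - a[0]) * (c[1] - a[1])
                             - (c[0] - a[0]) * (b[1] - a[1])) / 2.0;
        double perimeter = Math.hypot(b[0] - a[0], b[1] - a[1])
                         + Math.hypot(c[0] - b[0], c[1] - b[1])
                         + Math.hypot(a[0] - c[0], a[1] - c[1]);
        // centroid x as a third control that moves with the whole hand
        double centroidX = (a[0] + b[0] + c[0]) / 3.0;
        return new double[] { area, perimeter, centroidX };
    }

    public static void main(String[] args) {
        double[] out = control(new double[]{0, 0},
                               new double[]{4, 0}, new double[]{0, 3});
        System.out.println(out[0]); // 3-4-5 triangle: area is 6.0
    }
}
```

All three values are symmetric in the fingers, so it would not matter which finger is which — the same property the horizontal sort buys in two-finger mode.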
I love this sketch because it is all about manipulating a sound visually in real-time. It is brutally simple in some respects but completely cool at the same time. I am not actually manipulating the notes here, and I think adding that would be an improvement. I want to try to combine what is going on here with a touchscreen step-sequencer in a later iteration so that you will be manipulating the notes as well as the other characteristics of the sound. I am thinking about this as a split presentation, but that could change also.