low frequency entity

Monday, August 10th, 2009

What would the bass look like? What would it be like to touch it and manipulate it directly and visually in real-time? These are some of the things I am trying to get at in this sketch.

I really wanted the form to be about the characteristics of the sound rather than something that only responds to the audio itself, like the now-standard FFT-based sound visualizer. For example, the filter is being modulated here, and when the rate of the modulation increases, so does the rate of vibration of certain aspects of the form. Easy to see… hard to explain.

I have created a spline-based 3D form in Processing to represent the bass frequency and put it in motion. The form is more about movement than anything else. It is constantly in flux.


I am not analyzing the audio here at all. I am only using the settings on the bass synth to drive the visuals. I broke out certain “effective” parameters from the NI Massive bass patch and worked with those to define the visual characteristics of the entity. I am looking at an LFO modulation of the filter here, a bitcrush effect, a sample-and-hold effect, feedback, an oscillator phase parameter, and an FM synthesis control. This gives me a huge palette to work with. I should also say I am sending MIDI note information to Processing and having that vibrate the form as well.
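The parameter-driven approach above boils down to receiving controller values from the synth patch and normalizing them before they touch the visuals. Here is a minimal sketch of that idea in plain Java; the CC numbers and parameter meanings are my own illustration, not the actual Massive patch mapping:

```java
// Hypothetical sketch: map incoming MIDI CC values (0-127) from a synth
// patch to normalized visual parameters. CC assignments are illustrative.
import java.util.HashMap;
import java.util.Map;

public class ParamMap {
    private final Map<Integer, Double> params = new HashMap<>();

    // Store a controller value, normalized to the 0.0-1.0 range.
    public void onControlChange(int cc, int value) {
        params.put(cc, value / 127.0);
    }

    // Read a normalized parameter, defaulting to 0 if never received.
    public double get(int cc) {
        return params.getOrDefault(cc, 0.0);
    }

    public static void main(String[] args) {
        ParamMap m = new ParamMap();
        m.onControlChange(1, 127); // e.g. LFO rate at maximum
        m.onControlChange(2, 64);  // e.g. bitcrush amount near midpoint
        System.out.println(m.get(1)); // 1.0
        System.out.println(m.get(2));
    }
}
```

Normalizing everything to 0–1 at the door keeps the drawing code free of MIDI-specific ranges.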

Once the visual starts making sense then I can begin interacting with it and finding musical ways to engage it or play it like an instrument.

A single touch here simply rotates the form in 3D. This seemed really important to me for some reason, and that could easily change down the road, but for the moment it seems like a really good idea to be able to navigate the form in 3D. It feels like it brings a kind of clarity to the whole thing that is hard to find otherwise.
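The single-touch navigation can be sketched as a drag-to-rotate mapping; the sensitivity constant here is an assumption of mine, not a value from the actual sketch:

```java
// Hypothetical single-touch rotation: accumulate drag deltas into 3D
// rotation angles, the way the sketch navigates the form.
public class TouchRotate {
    double rotX = 0, rotY = 0;
    static final double SENSITIVITY = 0.01; // radians per pixel (illustrative)

    public void onDrag(double dx, double dy) {
        rotY += dx * SENSITIVITY; // horizontal drag spins around the y axis
        rotX += dy * SENSITIVITY; // vertical drag tilts around the x axis
    }

    public static void main(String[] args) {
        TouchRotate t = new TouchRotate();
        t.onDrag(100, 0);
        System.out.println(t.rotY); // ~1.0 radian after a 100 px drag
    }
}
```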

When two touch points are registered, the distance between them controls the LFO modulation rate. In other words, when the fingers get closer together the wobble speeds up, and when the fingers spread apart the wobble slows. After playing with this a lot of different ways… this just felt right. In this two-finger mode I have also defined bitcrush and sample-and-hold effects. When the left touch point moves toward the bottom of the screen, the bitcrush effect is turned up.
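The pinch-to-wobble mapping above is an inverted distance mapping: small distances land at the top of the rate range. A minimal sketch, assuming my own pixel and Hz ranges rather than the sketch's actual values:

```java
// Hypothetical mapping from two-finger distance to LFO rate:
// closer fingers -> faster wobble, wider spread -> slower wobble.
public class PinchToLfo {
    static final double MIN_DIST = 20, MAX_DIST = 600;   // pixels, illustrative
    static final double MIN_RATE = 0.5, MAX_RATE = 16.0; // Hz, illustrative

    public static double lfoRate(double x1, double y1, double x2, double y2) {
        double d = Math.hypot(x2 - x1, y2 - y1);
        // Clamp to [0, 1], then invert so a small distance gives a fast rate.
        double t = Math.min(1.0, Math.max(0.0, (d - MIN_DIST) / (MAX_DIST - MIN_DIST)));
        return MAX_RATE - t * (MAX_RATE - MIN_RATE);
    }

    public static void main(String[] args) {
        System.out.println(lfoRate(0, 0, 10, 0));  // fingers close: 16.0 Hz
        System.out.println(lfoRate(0, 0, 700, 0)); // fingers spread: 0.5 Hz
    }
}
```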

When the right touch point moves toward the lower part of the screen, the sample-and-hold effect is turned up. The relationship of these y positions also affects the feedback of the sound. These effects are also breaking up the visual and adding randomness to the elements that make it up. I found it necessary to sort the points horizontally so that the leftmost point is always the distortion control, even when the touch points cross. This is kind of hard to explain, but it is important because otherwise you are constantly trying to figure out which finger is doing what.
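The sort-by-x trick described above can be sketched like this; the screen-height mapping is my own illustration of "lower on the screen means more effect":

```java
// Hypothetical sort-by-x step: whichever touch is leftmost drives one effect,
// the rightmost drives the other, even if the fingers cross mid-gesture.
import java.util.Arrays;

public class TouchSort {
    // Each point is {x, y}. Returns a copy ordered left to right.
    public static double[][] sortByX(double[][] points) {
        double[][] sorted = points.clone();
        Arrays.sort(sorted, (a, b) -> Double.compare(a[0], b[0]));
        return sorted;
    }

    // Map a y position (0 = top of screen) to an effect amount in [0, 1]:
    // lower on the screen -> more effect.
    public static double yToAmount(double y, double screenHeight) {
        return Math.min(1.0, Math.max(0.0, y / screenHeight));
    }

    public static void main(String[] args) {
        double[][] s = sortByX(new double[][]{{300, 10}, {100, 20}});
        System.out.println(s[0][0]); // leftmost point first: 100.0
    }
}
```

Sorting every frame means the mapping follows screen position, not finger identity, which is exactly why you stop having to remember which finger is which.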

This is a great moment for me. I really like the huge variety of sounds and visuals that I am getting with the interface in this mode. There is also this cool dramatic moment when both touch points slide to the bottom of the screen and things really melt down.

When there are three touch points I begin affecting oscillator phase and FM synthesis as well as feedback. This is still a little rough and not completely under control at this point. It is still a lot of fun to hear these effects, and I need to figure out a more clever way to work with three touch points.

I love this sketch because it is all about manipulating a sound visually in real time. It is brutally simple in some respects but completely cool at the same time. I am not actually manipulating the notes here, and I think that would be an improvement. I want to try combining what is going on here with a touchscreen step sequencer in a later iteration, so that you will be manipulating the notes as well as the other characteristics of the sound. I am thinking about this as a split presentation, but that could change too.

12 comments so far
  1. Wicked! Lovely work.

  2. Hi there,
    Amazing! It's not clear to me how often you update http://www.bannister-design.nu.

  3. I have been doing this work a lot longer than I have been blogging. I am hoping to update here whenever there is some kind of breakthrough or new development with this project.

  4. Fantastic stuff. When ya takin orders :D blogged over at AudioLemon.

  5. Nice work!!! Very expressive. Can’t wait to see more.

  6. Very intriguing. Could be the coolest synthesizer interface yet. Your multitouch gesture control is really quite interesting to watch. I’ll definitely try to keep tabs on your progress.

  7. WOW!!! Lately I’ve been trying to conceptualize new ways to control bass wobbles other than the traditional knob/mod-wheel cranking, and this is much more cool and intuitive than anything I have come up with. I can’t wait to have an interface like this. Great work!

  8. absolutely WHHHUM!
    are you coming to Europe??
    what is the screen device? is that something we can buy on the market??

  9. what is the software you are using to create the control applications? is it max5? the objects you are using seem very similar to the ones in max5. Do you have these objects on top of jitter? Or are you using another visual aid application?

  10. In this sketch I am using Massive and Logic for the audio and Processing for the visuals. I used Max/MSP as the glue that maps and converts UDP to MIDI.

  11. [...] very appealing spline based 3D form in Processing that represents the bass frequency and puts it into motion. As Christian Bannister [...]

  12. Yo, I live in McMinnville, OR. How much $$ would this kinda setup cost me, and would you be willing to help me with config? I'm sure you get this a lot. You can email me at fruityloop@mail.com. I'm really into this; I kinda want to add this to my live D&B setup. I love the work of KJ Sawka and would just die to add a multi-touch, live editing lab such as this to my setup. I think that live production is the now of electronic music. Thanx.