Following up on last week’s post about mO, we’ll take things a step further today by adding a MIDI controller to our code.

mOd Processing synth

The beauty of programming your own synth is that you get to see the underbelly of the creation of such an instrument: you gain a better understanding of the MIDI signal, of sound synthesis, and of music in general. I will keep the code fairly simple so that you can experiment and create your own synth and sounds. We will be using the Beads audio library, Processing and a MIDI keyboard controller of your choice. I will be demonstrating with a KORG nanoKEY since it’s my favorite tool for experimenting with new soft synths, but as long as you have a MIDI device it should work well. In future posts, we will look at how to use OSC and Arduino.

So we will split the code into three steps: the MIDI integration, the audio creation, and the knobs that control the volume and overall pitch.

KORG nanoKEY

MIDI in Processing

The first part we will create is the MIDI connection to Processing. To do that we will use The MidiBus, a MIDI library for Processing. It should be quite straightforward if your MIDI controller is recognized by your system.

import themidibus.*;
MidiBus myBus; // The MidiBus

void setup()
{
  // Code for the MIDI controller
  MidiBus.list(); // List all available MIDI devices on STDOUT. This will show each device's index and name.
  myBus = new MidiBus(this, 1, ""); // Create a new MidiBus on input device 1 - change this index to match your controller
}

Running that code will print a list of all the MIDI devices Processing can communicate with. It should look something like this:

Available MIDI Devices:
----------Input----------
[0] "Juli [hw:0,0]"
[1] "nanoKEY [hw:3,0]"
[2] "Real Time Sequencer"
----------Output----------
[0] "Juli [hw:0,0]"
[1] "nanoKEY [hw:3,0]"
[2] "Real Time Sequencer"
[3] "Java Sound Synthesizer"

From there you can see where your device is and change the code accordingly: in myBus = new MidiBus(this, 1, "");, replace the 1 with the index of your device.
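As an aside, the device index can move around depending on what else is plugged in. If that becomes annoying, The MidiBus can also open a device by name instead of by index. Here is a minimal variation of that one line; the string "nanoKEY" is only an example, and it must match the device name exactly as MidiBus.list() prints it:

// Open the MIDI input by name instead of by index; the string must match
// the device name exactly as printed by MidiBus.list()
myBus = new MidiBus(this, "nanoKEY", "");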

BEADS AUDIO TO CREATE OUR SYNTH

Now that our MIDI controller is connected to Processing, we can start creating a synth. You can refer back to the older post about mO and plO on this site, or you can simply take the code below:

import beads.*;
Gain g;

AudioContext ac;
WavePlayer wp;

void setup()
{
  // Code for the synth
  ac = new AudioContext();
  wp = new WavePlayer(ac, 0, Buffer.SINE);
  g = new Gain(ac, 1, 0.5); // start at half volume so this version is audible on its own; the knob version below starts at 0
  g.addInput(wp);
  ac.out.addInput(g);
  ac.start();
}

void noteOn(int channel, int pitch, int velocity) {
  // Receive a noteOn from your MIDI device
  wp.setFrequency(6.875 * pow(2.0, (3.0 + pitch) / 12.0));
  // The MIDI note to frequency conversion is 6.875 * 2^((3 + note) / 12);
  // the formula is given just below
}

void noteOff(int channel, int pitch, int velocity) {
  // Receive a noteOff - i.e. the note being released on your MIDI device
  wp.setFrequency(0);
}

Now we come to a very interesting piece of code, where you have to calculate the frequency from a MIDI note. Your MIDI controller sends Processing a note number, from 0 up to however high your controller goes, with 60 being middle C. The tricky part is translating that note number into Hz, the unit of frequency: every note corresponds to a different frequency, and the spacing between them is not linear. Adjacent semitones are related by a constant ratio of 2^(1/12), so frequency grows exponentially with the note number.

I won’t go into detail on how to derive it, and will simply give you the formula:

frequency = 6.875 × 2^((3 + MIDI note) / 12)

In Processing, this translates to:

wp.setFrequency(6.875 * pow(2.0, (3.0 + pitch) / 12.0));
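As a quick sanity check, here is the same formula wrapped in a small helper function (the name midiToFrequency is mine, not something from Beads or The MidiBus), printing the value it gives for middle C:

float midiToFrequency(int note) {
  // frequency = 6.875 * 2^((3 + note) / 12)
  return 6.875 * pow(2.0, (3.0 + note) / 12.0);
}

void setup() {
  // Middle C is MIDI note 60 and should come out around 261.6 Hz
  println(midiToFrequency(60)); // prints roughly 261.63
}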

Now, if everything went well, you should be able to combine the two pieces of code and have a MIDI-controlled soft synth. Notice how the incoming MIDI message changes the WavePlayer's frequency inside its own event function, not in the draw loop. That is an important point: you only want to process a MIDI message when it arrives, not on every pass through draw(). So far so good, wasn’t it simple?
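The same event-driven idea applies to the other messages The MidiBus can hand you. For example, if your controller also has physical knobs or sliders that send control change (CC) messages, a handler like the sketch below could map one of them onto the Gain g from the synth code above. The CC number 1 is only an assumption here; check which number your hardware actually sends.

void controllerChange(int channel, int number, int value) {
  // CC values run from 0 to 127; scale them down to a 0.0 - 1.0 gain
  if (number == 1) { // assumed CC number - yours may differ
    g.setGain(value / 127.0f);
  }
}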

Now, what is a synth without knobs and some graphical feedback! Let’s add some knobs and draw the wave pattern. You can copy the whole code from here, which includes the MIDI, the synth and the controls:

import controlP5.*;
ControlP5 controlP5;

import themidibus.*;
import beads.*;
Gain g;

AudioContext ac;
WavePlayer wp;

MidiBus myBus; // The MidiBus

Knob volume;
Knob level;

void setup() {
  size(400,400);
  background(0);
  smooth();

  //Code for the Knobs
  controlP5 = new ControlP5(this);
  volume = controlP5.addKnob("volume",0,0.8,0,40,30,120);
  level = controlP5.addKnob("level",0,2,0,230,70,80);

  // Code for the synth
  ac = new AudioContext();
  wp = new WavePlayer(ac, 0, Buffer.SINE);
  g = new Gain(ac, 1, 0);
  g.addInput(wp);
  ac.out.addInput(g);
  ac.start();

  // Code for the MIDI controller
  MidiBus.list(); // List all available Midi devices on STDOUT. This will show each device's index and name.
  myBus = new MidiBus(this, 1, ""); // Create a new MidiBus on input device 1 - change this index to match your controller
}

color fore = color(92, 169, 250);
color back = color(0,0,0);

void draw()
{
  loadPixels();
  //set the background
  Arrays.fill(pixels, back);
  //scan across the pixels
  for(int i = 0; i < width; i++) {
    //for each pixel work out where in the current audio buffer we are
    int buffIndex = i * ac.getBufferSize() / width;
    //then work out the pixel height of the audio data at that point
    int vOffset = constrain((int)((1 + ac.out.getValue(0, buffIndex)) * height / 2), 0, height - 1);
    //draw into Processing's convenient 1-D array of pixels
    pixels[vOffset * width + i] = fore;
  }
  updatePixels();
}

void noteOn(int channel, int pitch, int velocity) {
  // Receive a noteOn from your midi device
  wp.setFrequency(6.875 * pow(2.0, (3.0 + pitch * level.value()) / 12.0));
  // The MIDI note to frequency conversion is 6.875 * 2^((3 + note) / 12)
  // The level knob value is multiplied in so you can sweep through the whole range of notes;
  // if your MIDI controller has octave up/down buttons, that knob is not needed
}

void noteOff(int channel, int pitch, int velocity) {
  // Receive a noteOff - i.e. the note being released on your MIDI device
  wp.setFrequency(0);
}

void volume(float theValue) {
  g.setGain(theValue);   // the volume knob on the left
}

First we import the controlP5 library, which we use to create the knobs. We initialize the knobs in setup() and then use their values in two places. The right-hand knob (level) feeds into the frequency calculation: when we trigger a note on the keyboard, the note number is multiplied by the knob value, which changes the range of frequencies produced. In effect, this shifts the octave you are playing in.

The knob on the left is the volume. At the end of the code you can see void volume(float theValue), a block of code that is triggered only when the volume knob changes value. We could have polled it in the main draw loop, but it is more efficient to update the gain only when the knob actually moves.
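One refinement worth considering: jumping the gain straight to a new value can produce audible clicks. A common Beads pattern is to drive the Gain with a Glide, which ramps to new values over a few milliseconds instead. Here is a minimal self-contained sketch of that idea, using the mouse in place of the volume knob just to keep it short:

import beads.*;

AudioContext ac;
WavePlayer wp;
Gain g;
Glide gainGlide;

void setup() {
  ac = new AudioContext();
  wp = new WavePlayer(ac, 440, Buffer.SINE);
  gainGlide = new Glide(ac, 0, 50);  // start at 0, take 50 ms to reach any new value
  g = new Gain(ac, 1, gainGlide);    // pass the Glide in place of a fixed gain value
  g.addInput(wp);
  ac.out.addInput(g);
  ac.start();
}

void draw() {
  // the mouse stands in for the volume knob; in the synth above you would
  // call gainGlide.setValue(theValue) inside volume() instead
  gainGlide.setValue(mouseX / (float) width);
}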

The last detail is in the draw loop, where we draw the current waveform for a bit of eye candy!
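If you would rather use Processing's drawing calls than write raw pixels, a roughly equivalent draw loop (same Beads calls, just line segments instead of single pixels) could look like this:

void draw() {
  background(0);
  stroke(92, 169, 250);
  int prevY = height / 2;
  for (int i = 1; i < width; i++) {
    // sample the current audio buffer at the position matching this pixel column
    int buffIndex = i * ac.getBufferSize() / width;
    int y = (int) ((1 + ac.out.getValue(0, buffIndex)) * height / 2);
    line(i - 1, prevY, i, y);
    prevY = y;
  }
}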

So now you have a functional soft synth controlled by your MIDI keyboard. From here you can create pretty much any sound you like; your imagination and coding skills are the only limits!

We will be adding some Arduino and knob action to the next synth, so stay tuned! You can also download the complete code here: mOd
