Computational Sound

Trying to understand and analyze how sound is created with code in class.

  1. Using an Oscillator to generate a melody on the diatonic major scale

Based on my interpretation and understanding, these few lines of code were key:

  //1: advance time by a big or small step at random
  t += random(1) > 0.3 ? 1 : 0.2;

  //2: map sin(t) from the range -1..1 into 0..8
  let r = (sin(t) + 1) * 4;

  //3: pick a scale degree and turn it into a frequency
  let f = ratios[floor(r)] * BASE;

  1. time is increased at random by either 1 or 0.2 (roughly a 70% chance of the bigger step)

  2. calculate sin of the time value, shift it from -1..1 up to 0..2 by adding 1, then multiply by 4 so r spans 0..8

  3. use floor(r) as an index into the ratios array to pick a scale degree, and multiply that ratio by the base frequency to get the pitch to play
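Putting the three lines together, here is a minimal sketch of how I understand the idea, assuming p5.js with the p5.sound library. The ratios array (a just-intonation major scale) and the BASE frequency of 220 Hz are my own guesses, not the exact values from the class code.

  // minimal melody sketch, assuming p5.js + p5.sound
  let osc;
  let t = 0;
  const BASE = 220; // assumed base frequency (A3)
  const ratios = [1, 9 / 8, 5 / 4, 4 / 3, 3 / 2, 5 / 3, 15 / 8, 2]; // major scale

  function setup() {
    createCanvas(200, 200);
    frameRate(4); // slow the draw loop so each note is audible
    osc = new p5.Oscillator('sine');
  }

  function mousePressed() {
    userStartAudio(); // browsers need a user gesture before audio can start
    osc.amp(0.3);
    osc.start();
  }

  function draw() {
    background(220);
    t += random(1) > 0.3 ? 1 : 0.2;           // 1: time steps by 1 or 0.2
    let r = (sin(t) + 1) * 4;                 // 2: map sin(t) into the range 0..8
    let i = min(floor(r), ratios.length - 1); // guard in case r hits 8 exactly
    let f = ratios[i] * BASE;                 // 3: scale degree -> frequency
    osc.freq(f);
  }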

  2. Modulo Drum

  //example 1:
  if (frameCount % beat == 1) {
    sounds[2].play();
  }

  //example 2:
  //syncopated drum
  if (frameCount % beat == sb) {
    sounds[5].play();
    rect(x, y, 5, 15);
    sb--;
  }

  //example 3:
  //phased drum
  if (frameCount % 50 == 1) {
    sounds[4].play();
    rect(x, y, 5, 15);
  }

Example 1: Playing a sound at a regular interval according to frameCount. With beat set to 60, this fires every 60 frames (approximately one second at the default frame rate).

Example 2: This was an interesting way to create an off-rhythm drum sequence: because sb is decremented on every hit, each hit lands one frame earlier in the next cycle, so the drum slowly drifts against the steady beat.

Example 3: It plays every 50 frames, slightly faster than example 1's 60-frame beat, so the two patterns drift in and out of phase and line back up every 300 frames (every sixth hit of the phased drum, every fifth hit of example 1).
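To hear how the three patterns interact, here is a sketch that runs them together, assuming p5.js with p5.sound, a beat of 60 frames, and sb starting at 30. The sample file names are hypothetical placeholders for whatever is actually loaded into sounds[], and the wrap-around on sb is my own addition to keep the syncopated drum cycling.

  // combined modulo drum sketch, assuming p5.js + p5.sound
  let sounds = [];
  let beat = 60;
  let sb = 30; // assumed starting offset for the syncopated drum
  let x = 100;
  let y = 100;

  function preload() {
    // hypothetical placeholder samples; swap in whatever sounds[] actually holds
    sounds[2] = loadSound('kick.wav');
    sounds[4] = loadSound('hat.wav');
    sounds[5] = loadSound('snare.wav');
  }

  function setup() {
    createCanvas(400, 200);
  }

  function mousePressed() {
    userStartAudio(); // browsers need a user gesture before audio can play
  }

  function draw() {
    background(220);

    // example 1: steady pulse every `beat` frames
    if (frameCount % beat == 1) {
      sounds[2].play();
    }

    // example 2: syncopated drum; each hit lands one frame earlier than the last
    if (frameCount % beat == sb) {
      sounds[5].play();
      rect(x, y, 5, 15);
      sb--;
      if (sb < 0) sb = beat - 1; // my own guard so the pattern keeps cycling
    }

    // example 3: phased drum on its own 50-frame cycle
    if (frameCount % 50 == 1) {
      sounds[4].play();
      rect(x + 50, y, 5, 15);
    }
  }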

  3. Loopy Drum

  for (let s = 0; s < 7; s++) {
    // each index s gets its own period: beat multiplied by (s + 1)
    if (frameCount % (beat * (s + 1)) == 1) {
      sounds[s].play();
    }
  }

This was also interesting to create, as it plays a rhythm in a loop, varying which sound index plays according to the frame count. I'm still not entirely familiar with the logic behind this, as there are too many variables to consider.
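To make the logic more visible, this sketch prints which index would fire on each frame instead of playing audio (assuming beat = 30). Each index s has its own period of beat * (s + 1) frames, so sounds[0] fires every 30 frames, sounds[1] every 60, up to sounds[6] every 210; they all coincide on frame 1 and then only line up again every 12,600 frames (30 times the least common multiple of 1 through 7).

  // loopy drum logger, assuming p5.js and beat = 30
  let beat = 30;

  function setup() {
    createCanvas(200, 200);
  }

  function draw() {
    background(220);
    for (let s = 0; s < 7; s++) {
      // index s has a period of beat * (s + 1) frames
      if (frameCount % (beat * (s + 1)) == 1) {
        print('frame ' + frameCount + ': sounds[' + s + '] would play');
      }
    }
  }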

Based on the concept, I started to experiment with combining different elements. I decided to create chord-based melodies, using guitar chords since there are fewer considerations and it is probably the friendliest instrument. I found six different guitar chords online and experimented with them.
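As a rough sketch of the direction, here is one way chord-based melodies could work in p5.js with p5.sound: one oscillator per chord tone, stepping through a chord list on the beat. The triads below (G, C and D major) are just illustrative frequencies, not the six guitar chords I found online.

  // chord-based melody sketch, assuming p5.js + p5.sound
  const chords = [
    [196.00, 246.94, 293.66], // G major: G3, B3, D4
    [261.63, 329.63, 392.00], // C major: C4, E4, G4
    [293.66, 369.99, 440.00]  // D major: D4, F#4, A4
  ];
  let oscs = [];
  let beat = 60;
  let current = 0;

  function setup() {
    createCanvas(200, 200);
    for (let i = 0; i < 3; i++) {
      oscs[i] = new p5.Oscillator('triangle');
      oscs[i].amp(0.15);
    }
  }

  function mousePressed() {
    userStartAudio(); // browsers need a user gesture before audio can start
    for (let o of oscs) o.start();
  }

  function draw() {
    background(220);
    // move to the next chord every `beat` frames
    if (frameCount % beat == 1) {
      let notes = chords[current];
      for (let i = 0; i < 3; i++) {
        oscs[i].freq(notes[i]);
      }
      current = (current + 1) % chords.length;
    }
  }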