Pattern and Sound Series - P5js and Tone.js

Using p5.js and Tone.js, I created a series of compositions that reconfigure in real time in response to input from a Kinect motion sensor, generating soothing and energizing synthetic sounds that match the visual elements of the piece. The people in front of the projected visuals and the sensor, in effect, create sonic compositions simply by moving, almost like a conductor in front of an orchestra but with more leeway for passive (if one simply walks by) or intentional interaction.
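As a rough sketch of the mapping idea (not the project code; the pentatonic scale and the note choices are my own illustration), a tracked hand's x position can select a pitch, with each new position triggering a soft synth note through Tone.js:

```javascript
// Pure helper: map a normalized hand position (0..1) onto a pentatonic scale,
// so any input lands on a consonant, "soothing" pitch.
const SCALE = ['C4', 'D4', 'E4', 'G4', 'A4', 'C5'];
function handXToNote(normX) {
  const i = Math.min(SCALE.length - 1, Math.floor(normX * SCALE.length));
  return SCALE[i];
}

// In the browser sketch (assumed wiring, not runnable outside a page):
// const synth = new Tone.Synth().toDestination();
// function onHandMove(x, canvasWidth) {
//   synth.triggerAttackRelease(handXToNote(x / canvasWidth), '8n');
// }
```

Snapping to a scale rather than mapping position to raw frequency is one way to keep passive, incidental movement sounding musical.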

The sound is easy to ignore or pay attention to. It is ambient and subtle. 

The next steps are to create more nuanced layers of interactivity that allow for more varied manipulation of the sound and the visuals. Right now I am envisioning the piece becoming a subtle sound orchestra that one can conduct with various hand movements and locations on the screen.

Composition 1: Ode to Leon's OCD

In the editor.

Composition 2: Allison's Use of Vim Editor

Mouseover version for testing on the computer.

In the editor using Kinect.

Composition 4: "Brother"

In the p5js editor.

For this syncopated piece I sampled sounds of tongue clicks and claps found on freesound.org. No Tone.js was used in this piece.
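A minimal sketch of how such a sampled, syncopated pattern could be wired in p5.js with p5.sound — the file names and the 8-step pattern below are placeholders, not the actual samples:

```javascript
let click, clap;
const pattern = [1, 0, 0, 1, 0, 1, 1, 0]; // 1 = tongue click on this step

// Pure helper so the pattern logic can be checked separately from playback.
function isHit(step, pat) {
  return pat[step % pat.length] === 1;
}

function preload() {
  click = loadSound('tongue_click.wav'); // placeholder freesound.org samples
  clap = loadSound('clap.wav');
}

function draw() {
  // advance one step every 15 frames (~250 ms at 60 fps)
  if (frameCount % 15 === 0) {
    const step = floor(frameCount / 15);
    if (isHit(step, pattern)) click.play();
    else clap.play();
  }
}
```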

Composition 5: "Free Food (in the Zen Garden)"

For this composition I used brown noise from Tone.js. 

In the p5js editor.

After using mouseOver for prototyping, I switched over to Kinect and placed the composition on a big screen to have the user control the pattern with the movement of the right hand instead of the mouse. 
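The wiring might look roughly like this (a hedged sketch, not the project code): Tone.js brown noise whose loudness follows the tracked right hand. The dB range in `handYToDb` is my own illustrative choice.

```javascript
// Pure helper: hand height (normalized 0 at top .. 1 at bottom) to volume in dB.
function handYToDb(normY) {
  // top of frame → -12 dB, bottom → -36 dB (quieter), clamped to that range
  return -12 - 24 * Math.min(1, Math.max(0, normY));
}

// Browser-only portion (assumed wiring):
// const noise = new Tone.Noise('brown').toDestination();
// noise.start();
// function onHandMove(y, frameHeight) {
//   noise.volume.value = handYToDb(y / frameHeight);
// }
```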

 

Sound

I'm using Tone.js, written by Yotam Mann.

Visuals

P5js and Adobe Illustrator.

 

Inspiration

The simple grid-like designs I am creating are inspired by Agnes Martin's paintings. 

Minimalist and sometimes intricate but always geometric and symmetrical, her work has been described as serene and meditative. She believed that abstract designs can elicit abstract emotions from the viewer: happiness, love, freedom.

 

Code

Composition 1: "Ode to Leon's OCD" using mouseover code.

Composition 2: "Allison's Use of Vim Editor" with Kinectron code, using Lisa Jamhoury and Shawn Van Every's Kinectron app and adapting code from one of Aaron Montoya-Moraga's workshops.

Composition 2 with mouseover for testing on computer.

Composition 3: "Isobel" with mouseover.

Composition 4: "Brother" with mouseover.

Composition 5: "Free Food (in the Zen Garden)"

Data Sonification - Earth's Near Deaths

Project for week 4 of Algorithmic Composition, by Nicolás Escarpentier, Camilla Padgitt-Coles, and Katya Rozanova.

 

Overview

Today our group met up to work on sonifying data using Csound. At first we were planning to build on the work we did on Sunday, where we created a Markov chain to algorithmically randomize the "voice" or lead flute sound from a MIDI file of "Norwegian Wood" over the guitar track using extracted MIDI notes and instruments created in Csound.

Our plan for the data sonification part of the assignment was to also take the comments from a YouTube video of the song and turn them into abstract sounds which would play over the MIDI-fied song according to their timestamp, using sentiment analysis to also change the comments' sounds according to their positive, neutral or negative sentiments. However, upon trying to implement our ideas today we found out that the process of getting sentiment analysis to work is very complicated, and the documentation online consists of many forums and disorganized information on how to do it without clear directives that we could follow.

While we may tackle sentiment analysis later on either together or in our own projects, we decided that for this assignment it would suffice, and also be interesting to us, to use another data set and start from scratch for the second part of our project together. We searched for free data sets and came across a list of asteroids and comets that flew close to earth here (Source: https://github.com/jdorfman/awesome-json-datasets#github-api).

We built nine instruments and parsed the data so that each rock plays according to its classification (nine in all), its date and year of discovery, and its location across a 180-degree angle, with each sound recurring algorithmically over the piece according to its period of recurrence. We also experimented with layering the result over NASA's "Earth Song" as a way to sonify both the comets and asteroids (algorithmically, through Csound) and the Earth they were flying past. The result was cosmic, to say the least (pun intended!).

Here are the two versions below.

 

Python script

By Nicolás Escarpentier; found here.

For each asteroid or comet on the file, we extracted some common characteristics to set the sound parameters. The most important aspect is to portray how often they pass near the Earth, so the representation of the time has to be accurate. We set an equivalence of one month = 5 seconds and a year multiplier of 12 months, in case we wanted to make a longer year to introduce longer periods of silence on the score. The audio file starts on Jan 1, 2010 - the earliest year from the acquired data set. Each rock's discovery date sets its first occurrence on the score, and each occurrence repeats itself according to its period_yr (except for the 'Parabolic Comet', which doesn't have a return period).

month_interval = 5. # in sec
year_mult = 12      # multiplier (how many months in a year)

for a in aster_data:
    # get raw data
    datetime = dateparser.parse(a['discovery_date'])
    yea = datetime.year       # starting time
    mon = datetime.month      # starting time
    day = datetime.day        # starting time

    # first occurrence (starting in 2010)
    start = ((yea-2010)*year_mult + mon + day/30.) * month_interval

    # each recurrence (recur comes from the rock's period_yr)
    start += recur * year_mult

For the other parameters, we selected characteristics that gave us some expressive possibilities. The pitch of each rock is based on the orbit's angle (i_deg), the instrument on the orbit_class, and the duration on q_au_1 (which we have no idea what it actually represents). For the scale of this score, we chose B-flat minor, in reference to the sound of a black hole and the "lowest note in the universe".

Instruments

Camilla, Nicolás, and I created nine instruments using Csound.

The first three corresponded to the three most commonly occurring meteors and asteroids. These are subtle "pluck" sounds; pluck in Csound produces naturally decaying plucked-string sounds.

The last six instruments are louder, higher-frequency voices.
Instrument four is a simple oscillator.
Instruments five, six, and eight are VCOs (analog-modeled oscillators) with a sawtooth waveform.
Instrument seven is a VCO with a square waveform.
Instrument nine is a VCO with a triangle waveform.

linseg is an opcode we used to add some vibrato to instruments 6-9. It traces a series of line segments between specified points, generating control or audio signals whose values pass through two or more specified points.

Each instrument's a-rate signal takes variables p4, p5, and p6 (which we set to frequency, amplitude, and pan) that correspond to values found in the JSON file for each meteor/asteroid that passed near Earth. The result is a series of plucking sounds with intermittent louder, higher-frequency sounds with some vibrato. The former represent the more common, smaller meteors and asteroids; the latter represent the rarer types.
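For illustration, here is how one Csound score line ("i" statement) could be assembled from a record's values. This is sketched in JavaScript for brevity — the actual parsing was done in Python, and the exact scaling of the fields is not reproduced here:

```javascript
// Build a Csound score statement from already-computed parameters.
// p1 = instrument, p2 = start time, p3 = duration, p4 = freq, p5 = amp, p6 = pan.
function scoreLine(instr, start, dur, freq, amp, pan) {
  return `i${instr} ${start.toFixed(2)} ${dur.toFixed(2)} ${freq} ${amp} ${pan}`;
}

// e.g. scoreLine(1, 0, 1.5, 440, 0.5, 0.5) → 'i1 0.00 1.50 440 0.5 0.5'
```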

Meteor art by SANTTU MUSTONEN, which I manipulated using Photoshop. Accidental coding poetry by Nicolás Escarpentier. Photo by me.


Description of our code by Nicolás E. ~ See the full project on GitHub here



AI-generated voice reading every comment on a 2-hour Moonlight Sonata video on YouTube + reading Amazon reviews of Capital, Vol. 1

Code is on GitHub.

In this project for Detourning the Web at ITP, I used Selenium to program a script that uploads strings from JSON files into Lyrebird, which I had previously trained with my own voice. I based my code on code that Aaron Montoya-Moraga wrote for his automated emails project.

For the 2-hour video of just the first, most recognizable part of the Moonlight Sonata, I had Lyrebird go through all the comments, read each one in my voice, and download the MP3 files (in case I wanted to use them for something else). The Moonlight Sonata plays in the background. I find the comments interesting here.

For Capital Volume 1, I had Lyrebird simply read its Amazon reviews one by one. I'm interested in exploring online communities and how they use products, art, or music as a jumping-off point for further discussion, and as forums for expressing their feelings and views. I'm also interested in the way Amazon reviews democratize public book reviewing, for better or for worse (mostly better, I think).

The pieces are both meant to be performed live. The screen recordings below show what the experience is like. 

Ram Bearer in P5JS (with sound)

I mocked up this ram-bearer and ram combo quickly in Photoshop, using its timeline function, just to get a feeling for placement. I then created the animation above using the PNG files with transparent backgrounds and the translate() function.

Here is the gif animation I had created initially:

The sketch for the idea came from an animation timeline I created for Animation class:


 

Why buttons

Initially I wanted to drag and drop the images, but I had trouble with the code for that, so I opted to make buttons. My thought was to make them transparent and just have the user click the heads to move them, but I realized I prefer the Web 2.0 aesthetic of the buttons floating right over the image. It breaks the seriousness of the emotions that could be evoked by the sculpture (uneven power dynamics, helplessness, etc.) and creates a feeling of agency and ease, as if to say, "Wouldn't it be nice if changing power dynamics could be as easy as pressing a button!"

 

Sound

Another use for the buttons is the sound I tied to them. For the lamb button I used the Rhinoceros sound and for the man button I used the Screaming Alien sound. Both are from FreeSound.org.

 

Code

var imgBase; // Declare variable 'imgBase'.
var imgLambHeadUpright;
var imgLambDownSleeping;
var imgLambDownOpenEyes;
var imgLambLookingRight;
var imgLambLookingAtMan;
var imgBlank;
var imgHeadMan;
var i;
var lambState = [];
var x;
var y;
var xLamb;
var yLamb;
//var buttonMan;
//var buttonLamb;
var rhino;
var blabber;


function setup() {
  createCanvas(540, 730);
  // note: loading sounds and images in preload() instead would guarantee
  // they are ready before setup() and draw() use them
  rhino = loadSound("350424__benjaminharveydesign__rhinoceros-trumpeting-angry.wav");
  blabber = loadSound("188592__alex-audio__screamalien-norberd (1).wav");
  x = 0;
  xLamb = 0;
  y = height / 8;
  yLamb = height / 8;
  imgBase = loadImage("bodies_noheads_base.png"); // Load the image
  imgLambHeadUpright = loadImage("lamb_head_upright.png");
  imgLambDownSleeping = loadImage("lamb_head_sleeping.png");
  imgLambDownOpenEyes = loadImage("lamb_head_open_eyes.png");
  imgLambLookingRight = loadImage("lamb_head_looking_to_right_softer.png");
  imgLambLookingAtMan = loadImage("lamb_head_looking_at_man.png");
  imgBlank = loadImage("blank.png");
  imgHeadMan = loadImage("head_man.png");

  // assign the array directly; indexing with an undefined i
  // (lambState[i] = [...]) leaves the array empty
  lambState = [imgLambHeadUpright, imgLambHeadUpright, imgLambDownOpenEyes,
    imgLambLookingRight, imgLambLookingAtMan, imgBlank];


  buttonLamb = createButton('Lamb');
  buttonLamb.position(420, 220, 4, 4);
  buttonLamb.mousePressed(nextLambState);

  buttonMan = createButton('Man');
  buttonMan.position(180, 230, 4, 4);
  buttonMan.mousePressed(nextManState);


}


function nextLambState() {
  // moving on the horizontal axis
  xLamb = xLamb - 30;
  // moving up at a constant speed
  yLamb = yLamb - 1;
  if (rhino.isPlaying() == false) {
    console.log("play!");
    rhino.play();
  }
}


function nextManState() {
  // moving on the horizontal axis
  x = x - 30;
  // moving up at a constant speed
  y = y - 14;
  if (blabber.isPlaying() == false) {
    console.log("play!");
    blabber.play();
  }
}

function draw() {
  background(10, 245, 95);
  // background(245);
  // background(224, 254, 75);

  // Displays the image at its actual size at point (0,0)
  //image(imgBase, 0, 0);
  // Displays the image at point (0, height/2) at half size
  image(imgBase, 0, height / 8, imgBase.width / 1.7, imgBase.height / 1.7);
  image(imgLambLookingRight, xLamb, yLamb, imgLambLookingRight.width / 1.7, imgLambLookingRight.height / 1.7);
  image(imgHeadMan, x, y, imgHeadMan.width / 1.7, imgHeadMan.height / 1.7);

  //invisible button for man head options
  // noStroke();
 // fill(255, 255, 255, 0);
 // ellipse(260, 200, 160, 160);


  //invisible button for lamb head options

  // noStroke();
  //fill(255, 255, 255, 0);
  //ellipse(420, 220, 160, 160);
}
 
 

Struggles and Potential Changes

Struggle 1

I like how this experience turned out, but I would also like, if time allows, to explore my initial idea.

Initially I wanted to create buttons that switched the lamb and man head positions (there are several PNGs, some with the head positioned a certain way, some with eyes closed). I think it would be fun to make it a mix-and-match situation. Perhaps the user wants the sculpture to have no heads at all, or maybe they fancy that both heads are the lamb's, or both the man's. I tried using the display and hide functions but to no avail. I will look into it for future projects.

I also tried using arrays in order to change the state of the elements on mouse clicks, but I could not conjure up the images when I called on these arrays. The only way I managed to have images show up was with the image() command, which is not dynamic. I tried printing to see if the array was even "taking" the images I assigned to it, but it looks like the array may be empty:
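For reference, a minimal sketch of the array-of-images approach I was after. Loading in preload() and assigning the array directly (rather than into an undefined index) should keep it from coming up empty; the file names below are placeholders:

```javascript
let lambHeads = [];
let lambIndex = 0;

// Pure helper: advance and wrap the state index.
function nextIndex(i, length) {
  return (i + 1) % length;
}

function preload() {
  // loading in preload() guarantees the images exist before draw() runs
  lambHeads = [
    loadImage('lamb_head_upright.png'),
    loadImage('lamb_head_sleeping.png'),
    loadImage('blank.png'), // "no head" option
  ];
}

function setup() {
  createCanvas(540, 730);
}

function mousePressed() {
  lambIndex = nextIndex(lambIndex, lambHeads.length);
}

function draw() {
  background(245);
  image(lambHeads[lambIndex], 420, 220); // draw whichever head is current
}
```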

 

 

Here are some of the permutations, some with eyes open and some closed, that I would have liked to appear in the mix-and-match version of this work:

 

Struggle 2

I wanted to add psychedelic colorful strips behind the PNG images, but when I did, they made the PNG images disappear. I could add the boxes on top, however, which looked cool but was not what I wanted.
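My current guess is that this is a drawing-order issue: in p5, whatever is drawn later in draw() paints on top. A sketch of that idea (the stripe colors are illustrative, not the ones I tried):

```javascript
// Pure helper: cycle through a few psychedelic hues.
function stripeColor(i) {
  const hues = [0, 60, 120, 180, 240, 300];
  return hues[i % hues.length];
}

function draw() {
  colorMode(HSB);
  noStroke();
  for (let i = 0; i < 10; i++) {      // stripes first...
    fill(stripeColor(i), 90, 100);
    rect(i * 54, 0, 54, height);
  }
  // ...then the transparent PNGs on top, so they stay visible:
  // image(imgBase, 0, height / 8);
}
```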


 

Inspiration and sources

I used a photograph of the Greek Kriophoros (Lamb Bearer) statue. There are many similar ones across cultures, but this one appealed to me because it's the most classic for the Western world, and I feel more comfortable defacing it. This is probably why Terry Gilliam used classic statues from antiquity as well. Gilliam was the inspiration here: the slapstick, irreverent, over-the-top aesthetic is borrowed from him.

It's possible that Jodorowsky's themes of the slave becoming the master, or the son killing the father to take his place, have also played a role. Both Jodorowsky and Gilliam are pretty playful, and this is why I like their work.

Frenemies in the Greek Gallery/Comoediae Agni

Museum Frenemies/Ram Bearer's Comeuppance/Comoediae Agni

 

Scene 1. "brevis victoria"

The original sketch was for scene 1: the Lamb Bearer's comeuppance. The title became "Brief Victory" when I added the final part of the scene, which dethrones the lamb from the man's body.


Additional end of scene 1, where the lamb realizes its victory was short-lived.


 

Scene 2. "speranza. giocare"

The lamb grabs the man's head and shakes it with satisfaction in its teeth, like a dog that caught a squirrel or a dog toy. It wags its tail with glee. This continues.

Scene 3. "exhalationem spiritus"

The lamb blows at the man's head ("TBTBTBB") playfully. The head simply falls off. It's possible that the bearer sculpture was never alive like the lamb sculpture. Did the lamb ever have a friend or foe? It vomits in despair, or possibly lets out its spirit. At the end of the scene, the wings and the trumpet are uplifting.

 


 

I am thinking of adding another few frames dedicated to the lamb vomiting up a strong stream of some sort of life force (kind of like Lynch's Garmonbozia: pain and suffering). It will come out in the form of mist that isn't affected by gravity and sort of floats out, before or after the lamb opens its eyes wide.


References and source materials

In making the video I used the following images: the lamb bearer (Kriophoros), Nike of Samothrace (wing), and Deux chien de Chantilly by Fanfareau e Brillador (tail).

 

Sound Effects

The sound will be slapstick and unsubtle, like in Terry Gilliam's animations.

 

Inspiration

Terry Gilliam's animations for Monty Python. 

Cyriak's "Baa"

And another animation that I have yet to find, shown to us in class. 

Meta Mounds: p5js and three force sensitive resistor buttons.

In this project I used a clay shape of a wave, serving as a sort of enclosure. On the three "waves" I placed three force sensitive resistors. The sensors all work, but I am still trying to figure out how to parse the values and assign them to three different functions. So far, the theta of the wave is controlled by the sensors (all of them together) and the vertical height is controlled by the mouse's y position.
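One approach I'm considering (this assumes a particular Arduino sketch, which is not shown here): send the three readings as one comma-separated line like "312,87,560" and split it in p5, so each value can drive its own parameter:

```javascript
// Parse one serial line of three comma-separated FSR readings into numbers.
function parseSensors(line) {
  return line.trim().split(',').map(Number); // → [s1, s2, s3]
}

// In serialEvent() (assumed wiring):
// const [s1, s2, s3] = parseSensors(serial.readLine());
// theta from s1, amplitude from s2, background color from s3.
```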

Next steps: I hope to use the second sensor to control the amplitude of the wave and the third sensor to change the background color of the canvas.

Link to the working code, which uses x and y mouse values instead of input so that one can play with this without hooking up an Arduino. 


Below is a previous attempt to create this, along with the code I used to make the animation above. 

 


The goal of this project was to use arrays and a constructor function to generate cleaner, compartmentalized code. I am using a shape I made (a clay model of an abstract shape with curves, sort of reminiscent of a cartoony mountain range) and recreating its shape using arcs. The display function uses the x and y positions (which are set to random) to place the "portraits," or linear representations of the clay shape, onto the canvas. Here is the link to the code in the p5.js editor.

Questions

1. I am not sure how to make a step-and-repeat pattern. It must be easy, but so far I have only been able to generate linear repetitions, not evenly spread-out, wallpaper-like repetitions. I actually prefer the randomization, but I was curious to try a more orderly step-and-repeat approach.
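One way to get an even, wallpaper-like layout is a nested loop over columns and rows. A sketch of that idea (`drawMound` is a hypothetical stand-in for my arc portrait, not a real function in my code):

```javascript
// Pure helper: centers of an evenly spaced cols × rows grid.
function gridPositions(cols, rows, cellW, cellH) {
  const pts = [];
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      pts.push({ x: c * cellW + cellW / 2, y: r * cellH + cellH / 2 });
    }
  }
  return pts;
}

// In draw(), the motif is stamped at every grid point:
// for (const p of gridPositions(6, 4, width / 6, height / 4)) {
//   drawMound(p.x, p.y); // hypothetical: draws one clay-shape portrait
// }
```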

2. I would like to superimpose this wave over the patterns so that they block the top of the screen with the sine-wave motion, but I cannot get the two sets of code to coexist in one file. When I try to merge the code I get the following error:

Uncaught TypeError: Cannot read property 'length' of undefined (sketch: line 96)
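My best guess at the cause (unverified): each of the two files defines its own setup(), and in JavaScript the later definition silently replaces the earlier one. If the surviving setup() stops early (for instance when the serial port fails to open), yvalues is never created, and calcWave() then reads .length of undefined. A sketch of a merged single setup():

```javascript
// Pure helper: how many ellipses the wave needs for a given total width.
function waveSlots(totalW, spacing) {
  return Math.floor(totalW / spacing);
}

function setup() {
  createCanvas(710, 400);
  // wave state first, so drawing never depends on the serial port
  w = width + 12;
  dx = (TWO_PI / period) * xspacing;
  yvalues = new Array(waveSlots(w, xspacing));

  // then the serial wiring from the second sketch
  serial = new p5.SerialPort();
  serial.on('data', serialEvent);
  serial.open(portName); // portName must be defined elsewhere in the sketch
}
```

Initializing the wave state before any serial code means the wave still draws even if the port fails.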

This is the code

 

Next Steps

The next step of this project is to use a serial connection to the pressure sensors on the clay shape to manipulate the amplitude, width, and speed of the wave.

Code (with Errors)


var xspacing = 12;    // Distance between each horizontal location
var w;                // Width of entire wave
var theta = 0;      // Start angle at 0
var amplitude = 75.0; // Height of wave
var period = 400.0;   // How many pixels before the wave repeats
var dx;               // Value for incrementing x
var yvalues;  // Using an array to store height values for the wave

var sensor1;
var sensor2;
var inData;


function setup() {
  createCanvas(710, 400);
 
  w = width+12;
  dx = (TWO_PI / period) * xspacing;
  // dx =1;
  yvalues = new Array(floor(w/xspacing)); 
  // print(dx);

}

//end of setup

function draw() {
  //background(344, 54, 23);
   background(22, 22, 243);

 calcWave();
 renderWave();
  sensor1 = map(inData, 0, width, 0, 5);
  // sensor2 = map(mouseY, 0, height, 1, 3); // note: sensor2 is never assigned
  // in this version, so calcWave() multiplies by undefined
}


function calcWave() {
  // Increment theta (try different values for
  // 'angular velocity' here)
  
  theta += 0.02;

  // For every x value, calculate a y value with sine function
  var x = theta;
   
  for (var i = 0; i < yvalues.length; i++) {
    yvalues[i] = sensor2*sin(x * sensor1) * amplitude ;
    x+=dx;
  }
}

// for incoming serial data
// note: this is a second definition of setup(); in JavaScript the later
// definition silently replaces the earlier one, so only this setup() runs

function setup() {
  createCanvas(500, 300);
  colorMode(HSB,255,255,255);
  serial = new p5.SerialPort();       // make a new instance of the serialport library
  serial.on('list', printList);  // set a callback function for the serialport list event
  serial.on('connected', serverConnected); // callback for connecting to the server
  serial.on('open', portOpen);        // callback for the port opening
  serial.on('data', serialEvent);     // callback for when new data arrives
  serial.on('error', serialError);    // callback for errors
  serial.on('close', portClose);      // callback for the port closing
 
  serial.list();                      // list the serial ports
  serial.open(portName);              // note: portName is never defined here
  circlecolor = color(inData, 200, 200); // inData is also still undefined here
  
  
  w = width+12;
  dx = (TWO_PI / period) * xspacing;
  // dx =1;
  yvalues = new Array(floor(w/xspacing)); 
  // print(dx);

  
}

function renderWave() {
  noStroke();
  fill(255);
  // A simple way to draw the wave with an ellipse at each location
  

  for (var x = 0; x < yvalues.length; x++) {
    ellipse(x*xspacing, height/105+yvalues[x], 10, -390);
  }
}
 

CODE THAT DOES WORK (but only maps to one sensor)


var xspacing = 12;    // Distance between each horizontal location
var w;                // Width of entire wave
var theta = 0;      // Start angle at 0
var amplitude = 75.0; // Height of wave
var period = 400.0;   // How many pixels before the wave repeats
var dx;               // Value for incrementing x
var yvalues;  // Using an array to store height values for the wave

var sensor1;
var sensor2;


function setup() {
  createCanvas(710, 400);
 
  w = width+12;
  dx = (TWO_PI / period) * xspacing;
  // dx =1;
  yvalues = new Array(floor(w/xspacing)); 
  // print(dx);

}

//end of setup

function draw() {
  //background(344, 54, 23);
   background(22, 22, 243);

 calcWave();
 renderWave();
  sensor1 = map(mouseX, 0, width, 0, 5);
  sensor2 = map(mouseY, 0, height, 1, 3);
}


function calcWave() {
  // Increment theta (try different values for
  // 'angular velocity' here)
  
  theta += 0.02;

  // For every x value, calculate a y value with sine function
  var x = theta;
   
  for (var i = 0; i < yvalues.length; i++) {
    yvalues[i] = sensor2*sin(x * sensor1) * amplitude ;
    x+=dx;
  }
}

function renderWave() {
  noStroke();
  fill(255);
  // A simple way to draw the wave with an ellipse at each location
  

  for (var x = 0; x < yvalues.length; x++) {
    ellipse(x*xspacing, height/105+yvalues[x], 10, -390);
  }
}