Pattern and Sound Series - p5.js and Tone.js

Using p5.js and Tone.js, I created a series of compositions that reconfigure in real time based on input from a Kinect motion sensor and generate soothing, energizing synthetic sounds that match the visual elements of each piece. The people in front of the projected visuals and the sensor, in effect, create sonic compositions simply by moving, almost like a conductor in front of an orchestra but with more leeway for passive (if one simply walks by) or intentional interaction.

The sound is easy to ignore or pay attention to. It is ambient and subtle. 

The next steps are to create more nuanced layers of interactivity that allow for more varied manipulation of the sound and the visuals. Right now I envision the piece becoming a subtle sound orchestra that one can conduct with various hand movements and locations on the screen. 

Composition 1: Ode to Leon's OCD

In the editor.

Composition 2: Allison's Use of Vim Editor

Mouseover version for testing on the computer.

In the editor using Kinect.

Composition 4: "Brother"

In the p5js editor.

For this syncopated piece I sampled sounds of tongue clicks and claps found on freesound.org. No Tone.js was used in this piece.

Composition 5: "Free Food (in the Zen Garden)"

For this composition I used brown noise from Tone.js. 

In the p5js editor.

After using mouseover for prototyping, I switched over to the Kinect and placed the composition on a big screen so that the user controls the pattern with the movement of their right hand instead of the mouse. 

 

Sound

I'm using Tone.js, written by Yotam Mann.

Visuals

p5.js and Adobe Illustrator.

Final Compositions


 

Initial Mockups

 

Sketches of ideas

 

Inspiration

The simple grid-like designs I am creating are inspired by Agnes Martin's paintings. 

Minimalist and sometimes intricate but always geometric and symmetrical, her work has been described as serene and meditative. She believed that abstract designs can elicit abstract emotions from the viewer, such as happiness, love, and freedom. 

 

Code

Composition 1: "Ode to Leon's OCD" using mouseover code.

Composition 2: "Allison's Use of Vim Editor" with Kinectron code, using Shawn Van Every and Lisa Jamhoury's Kinectron app and adapting code from one of Aaron Montoya Moraga's workshops.

Composition 2 with mouseover for testing on computer.

Composition 3: "Isobel" with mouseover.

Composition 4 "Brother" with mouseover.

Composition 5. "Free Food (in the Zen Garden)"

Fiber Sculpture Series

This work is part of a larger project, the Sound Playground: a tactile, interactive sonic environment where people of all ages and abilities (including varying motor and visual abilities) can create sonic compositions in real time, alone or collectively. 

Probably the front

This is the first in a series of soft sculptures sewn from fabric. The fabric and tassel covers (and, in the future, faux fur ones) envelop enclosures with weighted bottoms. These are essentially Weeble wobbles to which I may add a sound component.

Side view

Party in the back

The troupe plan

Chair in contrapposto

I created this chair to stand on a plinth for the Future of Sculpture class. I'm interested in exploring awkwardly placed bodies, both human and nonhuman. This piece explores the body of a chair that mimics the exaggerated stance of a human in contrapposto, possibly mocking the human's leisurely stance while at the same time perhaps genuinely demonstrating its desire to appear human - or at least as important as a human (how important is that?) - and to demand recognition of its chairness without inviting us to sit. There is, after all, no seat, and in this case it's a maquette. The opposite of Alexa, the chair might be playfully inviting us to pay attention to the nonhuman actors around us.


The sound piece is riffing off of Alvin Lucier's "I Am Sitting in a Room," in which the artist sits on a chair "recording himself narrating a text, and then playing the tape recording back into the room, re-recording it. The new recording is then played back and re-recorded, and this process is repeated. Since all rooms have characteristic resonance or formant frequencies (e.g. different between a large hall and a small room), the effect is that certain frequencies are emphasized as they resonate in the room, until eventually the words become unintelligible, replaced by the pure resonant harmonies and tones of the room itself." In this case, the chair that the artist sat on dethrones the artist. 

Fresh pile of pool pipe (It's Garbage! Series)



Garbage item 1

Fresh pile of found pool pipe // 2018. 3 x 3 x 2'; 4" diameter, 36' long.

“The dyadic alterations of leaf and air make the frond shimmer and move, even when it stays still.”

- Elaine Scarry on the beauty of palms in “On Beauty And Being Just”.

The striations of this pool coil, found by the garbage, caught my eye. The mesmerizing black slivers between white rings were like the fronds of a palm that Elaine Scarry talks about in "On Beauty and Being Just." I lugged it home to wash it and twist it into knots. It turned out just like I imagined: the huge knot holds itself in place, but the lines are in constant motion, both in the way the tube bends and in the optical effect of the stripes on the circumference. 

 

Purpose:

I'm experimenting with found and discarded materials to create objects with interesting textures that can be touched and that will eventually come together in a tactile sound sculpture garden/playground serving the general public as well as people who are visually impaired.

Note:

Upon seeing this documentation, several people told me they thought it was a rendering from a 3D program, but it's really just an analogue optical trick. And a treat! It's a delight to look at in person and get lost in making sense of it, following its curves and seeing it appear to animate with just a slight movement of the body.  

 

Garbage item 2

Fruit carton snake

Taken out of context and placed on a wall, the cartons look like an abstract composition, a singular entity. The modularity of this sculpture is what allows them to form one uniform and flexible whole. The viewer is presented with the familiar and the unfamiliar at the same time, which creates an interesting dissonance in the mind. 

Banana Synth and Kinetic Sound Object

Object 1 of 2 - Banana Synth


I made two projects for the Code of Music. The first is an infinitely regressive banana synth that encourages interspecies listening. I combined this project with my research work for Temporary Expert, a course I took at the same time. It is a slightly comical but earnest and approachable foray into encouraging people to find respite from the I-It relationships that plague our society through interspecies/interentity listening. The presentation for this can be found here.

This synth is equipped with an accelerometer that sends data to MaxMSP via Bluetooth, triggering a recording I made earlier of a banana being pulled off this same banana bunch. When the x-axis reading goes positive, the sound turns on, speeds up, and gains a wet, synthy quality. When the banana is tilted toward the negative pole, the same thing happens in reverse. In this way it becomes a kinetic and sonic object.

What would a banana sound like to us? I think the sound doesn’t matter - so long as it sounds abstract enough to be believable - as long as we are encouraged to listen to it.

Below is a presentation that shows the research and thought process behind this object.


Arduino Code for accelerometer.

/*
  Reads the accelerometer's X, Y, and Z axes on three analog pins and prints
  the values over serial. Adapted from the Arduino AnalogInput example.

  This example code is in the public domain.
  http://www.arduino.cc/en/Tutorial/AnalogInput
*/

int sensorPinZ = A0;    // z axis
int sensorPinY = A1;    // y axis
int sensorPinX = A2;    // x axis

int sensorValueZ = 0;   // value read from the z axis
int sensorValueY = 0;   // value read from the y axis
int sensorValueX = 0;   // value read from the x axis

void setup() {
  Serial.begin(9600);
}

void loop() {
  // read the values from the accelerometer:
  sensorValueZ = analogRead(sensorPinZ);
  sensorValueY = analogRead(sensorPinY);
  sensorValueX = analogRead(sensorPinX);

  // send the readings as space-separated "x y z" values over serial,
  // to be picked up by MaxMSP on the other end:
  Serial.print(sensorValueX);
  Serial.print(" ");
  Serial.print(sensorValueY);
  Serial.print(" ");
  Serial.println(sensorValueZ);
}


Max Patch that interprets the serial signal and assigns it to sound design.

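The actual parsing and mapping happens in the Max patch. As a rough illustration of the same parse-and-map idea in Python (a sketch only, using pyserial; the port name, the center value of the readings, and the mapping constants are assumptions, not my actual patch):

import serial  # pyserial

port = serial.Serial("/dev/cu.Bluetooth-Incoming-Port", 9600)  # port name is a placeholder

while True:
    # the Arduino sketch above prints "x y z" as space-separated integers
    parts = port.readline().decode(errors="ignore").split()
    if len(parts) != 3:
        continue
    try:
        x, y, z = (int(v) for v in parts)
    except ValueError:
        continue

    tilt = (x - 512) / 512.0      # assume a 10-bit reading centered near 512
    volume = min(1.0, abs(tilt))  # the further the tilt, the louder the sample
    rate = 1.0 + tilt             # positive tilt speeds the sample up, negative reverses it
    print(f"tilt={tilt:+.2f}  volume={volume:.2f}  rate={rate:.2f}")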
 

Object 2 of 2 - Sound-Emitting Tactile Kinetic Sculpture Tries To Be Therapeutic

I designed and 3D printed this tactile kinetic object, which makes an analogue tapping sound that will be supplemented with synthesized rhythmic tapping I designed in Tone.js and will recreate in MaxMSP. Inside is a ball bearing that is pulled by magnets embedded with cement intermittently in the floor of the object. The movement is therefore broken up into discrete steps (this will become more apparent when I replace the ball bearing with a heavier one and add stronger magnets).

I plan to finish this piece by mapping the rhythmic sound design to the rocking movement using an Adafruit Feather with Bluetooth and an accelerometer. I will update this page once I have done so. For now you can see the shape playtested in the video, as well as fabrication documentation, below:


I used Vectorworks to create the shape, and the team at LaGuardia Studios, especially Taylor Abshur, was generous with their time in fixing the geometries to prep the files for printing.



A Not-Only-Human Centered Design Manifesto

Human-centered design can be disastrous in how it allows us to impose our will over objects, other species, and nature in general. It also seeps into how we treat other people we deem inferior, and sometimes even ourselves. Often this human centrality plays out in design without regard for what a one-directional relationship does to our perception of ourselves and the culture we are creating.

Examples of such design are Alexa and Uber.

How can we design interactions that change how we see ourselves as a society, how we treat beings and things? How can we create a more balanced ecosystem?

 

Please take a look at a project I'm working on that takes some inspiration from <3 posthuman theory <3

 http://www.katyarozanova.com/tempexpert2/2018/10/24/collaborative-instrument-ecosystem-detailed-project-description

IMG_4670.GIF
IMG_4672.GIF

Thank you Giphycam for the art for these statements.  

Data Sonification - Earth's Near Deaths

Project for week 4 of Algorithmic Composition, by Nicolás Escarpentier, Camilla Padgitt-Coles, and Katya Rozanova.

 

Overview

Today our group met up to work on sonifying data using Csound. At first we were planning to build on the work we did on Sunday, where we created a Markov chain to algorithmically randomize the "voice" or lead flute sound from a MIDI file of "Norwegian Wood" over the guitar track using extracted MIDI notes and instruments created in Csound.

Our plan for the data sonification part of the assignment was to also take the comments from a YouTube video of the song and turn them into abstract sounds that would play over the MIDI-fied song according to their timestamps, using sentiment analysis to alter each comment's sound according to its positive, neutral, or negative sentiment. However, upon trying to implement our ideas today we found that getting sentiment analysis to work is quite involved, and the documentation online is scattered across forums, without clear directives we could follow.

While we may tackle sentiment analysis later on, either together or in our own projects, we decided that for this assignment it would suffice, and also be interesting to us, to use another data set and start from scratch for the second part of our project together. We searched for free data sets and came across a list of asteroids and comets that flew close to Earth here (Source: https://github.com/jdorfman/awesome-json-datasets#github-api).

We built nine instruments and parsed the data so that each asteroid or comet plays according to its classification, its date of discovery, and its location, panned across a 180-degree field, with each sound recurring algorithmically over the course of the piece according to its return period. We also experimented with layering the result over NASA's "Earth Song" as a way to sonify both the comets and asteroids (algorithmically, through Csound) and the Earth they were flying past. The result was cosmic, to say the least (pun intended!).

Here are the two versions below.

 

Python script

By Nicolás Escarpentier, found here.

For each asteroid or comet on the file, we extracted some common characteristics to set the sound parameters. The most important aspect is to portray how often they pass near the Earth, so the representation of the time has to be accurate. We set an equivalence of one month = 5 seconds and a year multiplier of 12 months, in case we wanted to make a longer year to introduce longer periods of silence on the score. The audio file starts on Jan 1, 2010 - the earliest year from the acquired data set. Each rock's discovery date sets its first occurrence on the score, and each occurrence repeats itself according to its period_yr (except for the 'Parabolic Comet', which doesn't have a return period).

import dateparser  # pip install dateparser

month_interval = 5.  # seconds of score time per month
year_mult = 12       # months in a "year" (can be stretched to add longer silences)

# aster_data: the list of asteroid/comet records parsed from the JSON data set
for a in aster_data:
    # get raw data
    discovered = dateparser.parse(a['discovery_date'])
    yea = discovered.year     # starting time
    mon = discovered.month    # starting time
    day = discovered.day      # starting time

    # first occurrence (starting in 2010)
    start = ((yea - 2010) * year_mult + mon + day / 30.) * month_interval

    # recursion: each later occurrence is shifted by the rock's return period
    # (recur is derived from the record's period_yr earlier in the full script)
    start += recur * year_mult

For the other parameters, we selected characteristics that gave us some expressive possibilities. The pitch of each rock is based on its orbit's angle (i_deg), the instrument on its orbit_class, and the duration on its q_au_1 (which we aren't sure what it actually represents). For the scale of this score we chose B-flat minor, in reference to the sound of a black hole and the "lowest note in the universe".
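As a rough sketch (not our actual code, which is on GitHub) of how parameters like these can become Csound score lines, assuming the p4/p5/p6 = frequency/amplitude/pan convention described in the Instruments section below and an illustrative B-flat minor pitch set:

BFLAT_MINOR_HZ = [58.27, 65.41, 69.30, 77.78, 87.31, 92.50, 103.83]  # Bb1 natural minor, approx.

def score_line(instr, start, dur, i_deg, amp=0.5, pan=0.5):
    # map the orbit angle (0-180 degrees) onto the scale to pick a pitch
    pitch = BFLAT_MINOR_HZ[int(i_deg / 180. * (len(BFLAT_MINOR_HZ) - 1))]
    # Csound score statement: i <instrument> <start> <duration> <p4> <p5> <p6>
    return f"i{instr} {start:.3f} {dur:.3f} {pitch:.2f} {amp:.2f} {pan:.3f}"

# e.g. score_line(3, start=12.5, dur=1.0, i_deg=95.0, pan=95.0 / 180.)
# -> "i3 12.500 1.000 77.78 0.50 0.528"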

Instruments

Camilla, Nicolás, and I created nine instruments using Csound.

The first three correspond to the three most commonly occurring classes of meteors and asteroids. These are subtle "pluck" sounds; the pluck opcode in Csound produces naturally decaying plucked-string sounds. 

The last six instruments are louder and higher in frequency.
Instrument four is a simple oscillator.
Instruments five, six, and eight are VCOs (analog-modeled oscillators) with a sawtooth waveform.
Instrument seven is a VCO with a square waveform.
Instrument nine is a VCO with a triangle waveform.

linseg is an opcode we used to add some vibrato to instruments 6-9. It traces a series of line segments between specified points, generating control or audio signals whose values pass through two or more specified points.

Each instrument takes p-fields p4, p5, and p6 (which we set to frequency, amplitude, and pan), whose values come from each meteor/asteroid entry in the JSON file. The result is a series of plucking sounds with intermittent louder, higher-frequency sounds with some vibrato. The former represent the more common, smaller meteors and asteroids, and the latter represent the rarer asteroid and meteor types. 

Meteor art by SANTTU MUSTONEN, which I manipulated using Photoshop. Accidental coding poetry by Nicolás Peña-Escarpentier. Photo by me.

Description of our code by Nicolás E. ~ See the full project on GitHub here.


AI-generated voice reading every comment on a 2-hour Moonlight Sonata video on YouTube + reading Amazon reviews of Capital Vol. 1

The code is on GitHub.

It was a pleasure performing this piece at Frequency Sweep #2 at Babycastles in New York along with fellow ITP students and alumni (see the video below).

How I did it

In this project for Detourning the Web at ITP, I scraped YouTube comments from a 2-hour looped version of the first part of the Moonlight Sonata and placed them into a JSON file. I then used Selenium, a browser automation library for Python, to write a script that feeds the comments from the JSON file into Lyrebird, which reads each comment out loud in my own AI-generated voice. I had previously trained Lyrebird to sound like me, which adds to the unsettling nature of the project. I based my Selenium code on code that Aaron Montoya-Moraga wrote for his automated emails project.
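The full script is on GitHub (linked above). Below is a minimal sketch of the approach; the JSON structure and the CSS selectors for Lyrebird's interface are placeholders, since the real ones come from inspecting the page, and the site may have changed since 2018.

import json
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

# load the scraped YouTube comments (file name and structure are assumptions)
with open("moonlight_comments.json") as f:
    comments = json.load(f)

driver = webdriver.Chrome()
driver.get("https://lyrebird.ai")  # log in to the trained voice account before the loop

for comment in comments:
    # placeholder selectors: find the real ones with the browser's dev tools
    text_box = driver.find_element(By.CSS_SELECTOR, "textarea")
    text_box.clear()
    text_box.send_keys(comment["text"])
    driver.find_element(By.CSS_SELECTOR, "button.generate").click()
    time.sleep(10)  # give the AI voice time to generate and play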

 

The concept

The work explores themes of loneliness, banality, and anonymity on the web. The comments read out loud give voice to those who comment on this video. The resulting portrait ranges from banal to uncomfortable to extremely personal to comical and even endearing. Online communities have been an interesting place to observe humanity. Often, it’s where people say things they refrain from discussing in the open.

The piece is meant to be performed live. The screen recording below shows what the experience is like. 

 

___

 

CAPITAL VOL 1 Reading

This is a separate but similar project that also uses Selenium.

For Capital Volume 1, I had Lyrebird simply read its Amazon reviews one by one. I'm interested in exploring online communities and how they use products, art, or music as a jumping-off point for further discussion and as forums for expressing their feelings and views. Often people say things online that they cannot say anywhere else, and it's an interesting way to examine how people view themselves and their environment. 

The piece is meant to be performed live. The screen recording below shows what the experience is like. 

Joan Lamenting a Species Goodbye

 

This is a video that I created using VidPy and FFmpeg for Detourning the Web at ITP.

Maria Falconetti is sampled here from her role as Joan of Arc in Carl Theodor Dreyer's 1928 silent film La Passion de Jeanne d'Arc, and represents our generation saying goodbye to the polar bear species in the wild. We see ourselves in Falconetti's anguish as we reflect on how negligent capitalist practices have set environmental decline into motion. The last scene shows a polar bear waving its paw at the viewer, but in a clearly artificial, jerky way that VidPy allows for. The viewer understands that the polar bear is not actually waving, but the effect is comically heart-wrenching.  

VidPy is a Python video editing library developed by Sam Lavigne.
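As a minimal sketch of the kind of repeat-cut edit that produces the jerky wave (the file name, timestamps, and repeat count here are made up, and the method names follow VidPy's Clip/Composition API as I understand it, not my exact script):

from vidpy import Clip, Composition

# repeat the same half-second of polar bear footage so the paw appears
# to "wave" in an artificial, jerky loop
wave = []
for _ in range(8):
    clip = Clip("polar_bear.mp4")
    clip.cut(start=12.0, end=12.5)
    wave.append(clip)

# singletrack=True plays the clips one after another instead of layering them
Composition(wave, singletrack=True).save("waving_bear.mp4")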

Other videos used (YouTube):
 Boston Dynamics' BigDog on the beach,
 fastest man to run on all fours,
 and polar bear footage.