The Sound of a :)

I call this piece the sound of a smile. I know this won't be a seminal work that changes the course of music composition, but I think it's the best I can do as a beginner. I tweaked code from class (a sequence) and used a MonoSynth to play it. I used the p5.js and Tone.js libraries.



sketch link

The things I learned are:

What note names like "C4" and "F2" mean. By listening to the notes laid out in order, I was able to tell which one was which. Then I looked this up and found https://www.becomesingers.com/vocal-range/vocal-range-chart

I referenced it to better understand which notes fall within a normal vocal range.

I wanted to see if I could replace the note names with raw frequencies, but it sounded like garbage, so I went back to the more musical version of the sketch and changed the sound of the envelope a bit. I also tried replacing the MonoSynth with a PolySynth and a PluckSynth but didn't like the sound as much. I decided to play this sketch starting out with a smiley face and turning it into something more abstract. The end of the sketch actually starts to sound kind of interesting.
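For reference, here is a minimal sketch of the general setup, not my exact class-derived code: a MonoSynth with a tweaked envelope, stepping through an array of note names with Tone.Sequence (written against a recent Tone.js, where the older toMaster() call is now toDestination()). A raw number can be swapped in for a note name, since Tone.js reads numbers as frequencies in Hz.

const synth = new Tone.MonoSynth({
  oscillator: { type: "triangle" },
  envelope: { attack: 0.05, decay: 0.2, sustain: 0.4, release: 0.8 }, // the part I tweaked
}).toDestination();

// The melody lives in an array of note names; a raw frequency like
// 174.61 (F3 in Hz) would also work here, just less readable.
const notes = ["C4", "E4", "G4", "A4", "G4", "E4", "F2"];

const seq = new Tone.Sequence((time, note) => {
  synth.triggerAttackRelease(note, "8n", time);
}, notes, "4n").start(0);

Tone.Transport.start();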

See it here on video:

Making Media Making Devices documentation

Setting up the Raspberry Pi

Exercise 1: 


 

Exercise 2

 

 


 

Exercise 3

Getting rid of the while loop. Now we move into event-driven code.

Used a button to turn something on. 
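The class examples may well have been in Python, but the event-driven idea looks roughly like this in Node.js with the onoff package (the GPIO pin numbers here are assumptions): instead of polling in a while loop, we register a callback that runs only when the button changes state.

const Gpio = require('onoff').Gpio;

const led = new Gpio(17, 'out');                                    // assumed LED pin
const button = new Gpio(4, 'in', 'both', { debounceTimeout: 10 });  // assumed button pin

// No while loop: this callback fires only on a button edge.
button.watch((err, value) => {
  if (err) throw err;
  led.writeSync(value);   // LED on while the button is pressed
});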

 

Video  

 

 

 

 


Anita and I did it!  


Week 3: Code of Music assignment

Here is the sketch

I couldn't record a composition using QuickTime for some reason. Maybe something is wrong with the mic.

For this assignment I created a composition based on one of the examples from class. I used the random function to play notes that I specified in an array. I remember using numbers instead of note names (for example, just 12 instead of "C2") in my previous simple compositions with Tone.js, and I liked the effect of very low notes. Tone.js reads a bare number as a frequency in Hz, and 12 Hz sits below the audible range, so I think those low notes are only audible when they are put through some sort of filter; in this case, those frequencies were simply not played.
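My sketch is linked above; as a minimal sketch of the idea (not the exact code), a Tone.Loop can pick a random entry from the array on every tick. A bare number in the array is read as a frequency in Hz, which is why something like 12 is effectively silent.

const synth = new Tone.Synth().toDestination();

// Note names mixed with a raw number; 12 means 12 Hz, below hearing range.
const notes = ["C2", "E3", "G3", "A4", 12];

const loop = new Tone.Loop((time) => {
  const pick = notes[Math.floor(Math.random() * notes.length)];
  synth.triggerAttackRelease(pick, "8n", time);
}, "4n").start(0);

Tone.Transport.start();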




Week 2: Rhythm assignment

Here is the sketch with some rhythm explorations. I edited the sketch from class. I'm still a little new to this, so it's a pretty simple iteration, but it's along the lines of the aesthetic I would like to create for the final project.

I'm not sure how to connect the actual sound to the visuals yet. I imagine it's some sort of mapping of the audio to variables that can be used as parameters for shapes. I'll try to do this this week; perhaps it will be clarified in class.

I'd like to learn how to create objects out of small lines that budge a little in reaction to volume and rhythm changes. Some of the lines could belong to and respond to one instrument, other lines could represent another instrument, and so on. If there are three instruments, like in this piece, there would be three groups of lines.
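I haven't built this yet, but a minimal sketch of the mapping I have in mind (names and numbers here are placeholders, not working project code) is to route each instrument through a Tone.Meter and use its level to nudge that instrument's group of lines in draw():

let synth, meter;

function setup() {
  createCanvas(400, 400);
  meter = new Tone.Meter();               // reports the instrument's output level
  synth = new Tone.Synth().toDestination();
  synth.connect(meter);                   // the same signal also feeds the meter
}

function draw() {
  background(255);
  // getValue() returns decibels in recent Tone.js (~ -60 quiet, 0 loud)
  const level = meter.getValue();
  const jitter = map(level, -60, 0, 0, 10, true); // louder = bigger budge
  stroke(0);
  for (let x = 20; x < width; x += 20) {
    line(x, 100 + random(-jitter, jitter), x, 300 + random(-jitter, jitter));
  }
}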

Code of Music - 1

For our first assignment for Luisa Pereira’s Code of Music we created simple sequencers using recorded sounds. I found all of the sounds I wanted to use on Freesound.org.

https://editor.p5js.org/katya/sketches/S12IWdnuQ

I used many sounds from freesound.org

The main one I used was this tuning of string instruments.

https://freesound.org/people/luisvb/sounds/329925/

https://freesound.org/people/Robinhood76/sounds/171446/

https://freesound.org/people/SamsterBirdies/sounds/345043/

This is a nice inspiration, though I wanted something more chaotic.
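For reference, the general shape of a sampled-sound sequencer like the one above (the file names here are placeholders, not my actual samples): load the recordings into Tone.Players and step through a pattern with Tone.Sequence.

const players = new Tone.Players({
  tuning: "tuning.mp3",   // placeholder file names
  knock: "knock.mp3",
  bell: "bell.mp3",
}).toDestination();

const pattern = ["tuning", "knock", null, "bell"]; // null = rest

const seq = new Tone.Sequence((time, name) => {
  if (name) players.player(name).start(time);
}, pattern, "8n").start(0);

Tone.Transport.start();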

Pattern and Sound Series - p5.js and Tone.js

Using p5.js and Tone.js, I created a series of compositions that temporarily reconfigure in response to input from a Kinect motion sensor and generate soothing, energizing synthetic sounds that match the visual elements of the piece. The people in front of the projected visuals and the sensor, in effect, create sonic compositions simply by moving, almost like a conductor in front of an orchestra, but with more leeway for passive interaction (if one simply walks by) or intentional interaction.

The sound is easy to ignore or pay attention to. It is ambient and subtle. 

The next steps are to create more nuanced layers of interactivity that allow for more varied manipulation of the sound and the visuals. Right now I envision the piece becoming a subtle sound orchestra that one can conduct with various hand movements and positions on the screen.

Composition 1: Ode to Leon's OCD

In the editor.

Composition 2: Allison's Use of Vim Editor

Mouseover version for testing on the computer.

In the editor using Kinect.

Composition 4. "Brother"

In the p5js editor.

For this syncopated piece I sampled sounds of tongue clicks and claps found on freesound.org. No Tone.js was used in this piece.
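As an illustration of triggering samples without Tone.js, here is a hypothetical p5.sound version with placeholder file names (not necessarily how my sketch does it): load the recordings and step through a simple four-beat pattern in draw().

let click, clap;
let step = 0;

function preload() {
  click = loadSound("click.wav");  // placeholder sample files
  clap = loadSound("clap.wav");
}

function setup() {
  createCanvas(200, 200);
  frameRate(4);                    // four steps per second
}

function draw() {
  background(220);
  if (step % 4 === 0) clap.play(); // clap on the downbeat, clicks in between
  else click.play();
  step++;
}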

Composition 5. "Free Food (in the Zen Garden)"

For this composition I used brown noise from Tone.js. 

In the p5js editor.

After using mouseOver for prototyping, I switched over to Kinect and placed the composition on a big screen to have the user control the pattern with the movement of the right hand instead of the mouse. 
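A minimal sketch of that swap (assuming the Kinectron callback, not shown here, keeps a normalized rightHandX value between 0 and 1 updated): the hand position simply takes the place of mouseX in whatever mapping drives the sound, here the level of the Tone.js brown noise.

let rightHandX = 0.5;  // updated elsewhere by the Kinect callback (assumed)
const noise = new Tone.Noise("brown").toDestination();

function setup() {
  createCanvas(800, 600);
  noise.start();
}

function draw() {
  background(0);
  // const x = mouseX / width;  // mouseOver prototyping version
  const x = rightHandX;         // Kinect version
  noise.volume.value = map(x, 0, 1, -30, -6); // hand position sets loudness (dB)
  fill(255);
  ellipse(x * width, height / 2, 40, 40);
}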

 

Sound

I'm using Tone.js, written by Yotam Mann.

Visuals

P5js and Adobe Illustrator.

Final Compositions


 

Initial Mockups

 

Sketches of ideas

 

Inspiration

The simple grid-like designs I am creating are inspired by Agnes Martin's paintings. 

Minimalist and sometimes intricate but always geometric and symmetrical, her work has been described as serene and meditative. She believed that abstract designs can elicit abstract emotions from the viewer, such as happiness, love, and freedom.

 

Code

Composition 1: "Ode to Leon's OCD" using mouseover code.

Composition 2: "Allison's Use of Vim Editor" with Kinectron code, using Shawn Van Every and Lisa Jamhoury's Kinectron app and adapting code from one of Aaron Montoya-Moraga's workshops.

Composition 2 with mouseover for testing on computer.

Composition 3: "Isobel" with mouseover.

Composition 4 "Brother" with mouseover.

Composition 5. "Free Food (in the Zen Garden)"

Explorations of enclosures

These are explorations of shapes for a sound object series, the "sound playground" project, that I am working on at ITP. Full project here

The sound playground is a meditative, interactive sculpture garden that encourages play, exploration, and calm. Fascinated with humans' ability to remain playful throughout their lives, I want to create a unique experience that allows people of all ages to create sound compositions by setting kinetic sculptures in motion.

In addition to play, I am examining the concept of aging and changing over time. The sound-emitting kinetic objects absorb bits of the surrounding noise and insert these bits into the existing algorithmic composition, somewhat like new snippets of DNA. A sculpture's program also changes in response to how it was moved. In this way, the experiences of a sculpture are incorporated into an ever-changing song that it plays back to us, changing alongside us.


Bot That Books All The Office Hours

For my Detourning the Web final project at ITP I made a bot that books all the office hours with the instructor, Sam Lavigne.  If I hadn't figured this out, I would have left one office hour booked so I could do this.

Thanks to Aaron Montoya-Moraga for help with this.  

Photo courtesy of Sam Lavigne

Bold & Shy - Using VidPy and FFmpeg

This is a video that I created using VidPy and FFmpeg for Detourning the Web at ITP.

 

 

VidPy is a Python video editing library developed by Sam Lavigne.

Screens, Portals, Men, and Frodo

In this video piece I sampled video footage of Steve Jobs, Star Trek TNG, Lord of the Rings, and a nature video about summer that had poetry text. All of the videos were downloaded from YouTube using youtube-dl, cut into fragments with FFmpeg, and put together with jerky offsets using VidPy, a Python library developed by Sam Lavigne.

Themes explored: men as adventurers, technology as men's realm, legendary and real iconic figures and the grey area between, male as default gender in pop culture.

 

Code for Video 1, A Species Goodbye:

 

Python:

 Code on my github

 

FFmpeg:

ffmpeg -ss 00:06:18 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 6 jobs_640_480_1.mp4

ffmpeg -ss 00:07:43 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 5 jobs_640_480_2.mp4

ffmpeg -ss 00:07:51 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 5 jobs_640_480_3.mp4

ffmpeg -ss 00:11:08 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 3 jobs_640_480_4.mp4

//cutting out a portion from jobs interview that shows the screen

ffmpeg -ss 00:00:06 -i sheliak_636_480.mp4 -c:v copy -c:a copy -t 5 sheliak_636_480_1.mp4

I got this error in the Python file while running singletrack:

Katyas-MBP:screens crashplanuser$ python screens.py

objc[40458]: Class SDLTranslatorResponder is implemented in both /Applications/Shotcut.app/Contents/MacOS/lib/libSDL2-2.0.0.dylib (0x10870af98) and /Applications/Shotcut.app/Contents/MacOS/lib/libSDL-1.2.0.dylib (0x108b6c2d8). One of the two will be used. Which one is undefined.
objc[40459]: Class SDLTranslatorResponder is implemented in both /Applications/Shotcut.app/Contents/MacOS/lib/libSDL2-2.0.0.dylib (0x110617f98) and /Applications/Shotcut.app/Contents/MacOS/lib/libSDL-1.2.0.dylib (0x110a892d8). One of the two will be used. Which one is undefined.

//get alien

ffmpeg -ss 00:00:29 -i sheliak_636_480.mp4 -c:v copy -c:a copy -t 1.5 sheliak_636_480_p.mp4

//get riker

ffmpeg -ss 00:00:38 -i holodeck.mp4 -c:v copy -c:a copy -t 10 holodeck.mp4_1.mp4

//get lotr chunk

ffmpeg -ss 00:00:34 -i lotr.mp4 -c:v copy -c:a copy -t 3 lotr_late.mp4

//couldn’t get this to make a sound

 

Sound sculpture prototype with Arduino and accelerometer

 

Part 1.  10/2/17.

Using tape and plaster is not ideal. My next steps are to make smooth, sandable sculptures using styrofoam and clay, then cast them in a foam resin that will allow sound to travel out from inside the enclosure.

 

 

 

Part 2. Update. 10/9/17.

I made a roly-poly enclosure so that it can move and make music on its own when pushed. I used plaster to create a heavy center of mass on the bottom.

 

I am happy with the fact that the roly-poly effect is working. The rolling could be smoother and the shell more polished. I have yet to try 3D printing and resin casting for these shapes. Questions remain:

1. What's the best way to fasten the Arduino inside the shape?

2. How do I make it so that the Arduino turns off on its own when the shape isn't moving? Do I have to use a button, or can it turn off and on based on movement? Perhaps a touch sensor can act as a button.

I think the easiest way to make an enclosure that can open and close easily is to download a 3d printable container with a screw lid, then modify the top of the lid and body of the jar using Blender. The result would be a shape that opens and closes without a noticeable, distracting latch. 

 

Sound

For the first pass I'm happy with the synth sound. For the next iteration, I'd like to try the Mozzi library to create more variation in the sound and to manipulate the mapping between the accelerometer values and the sound so that it isn't linear. Perhaps some unexpected inverse relationship can be explored.

 

Code

I found a library in the documentation that came with my accelerometer and used it in my own code, which takes the values the accelerometer produces and uses them to set the frequency of the sound synthesized on the speaker.

 

#include <Wire.h>               //Include the Wire library
#include <MMA_7455.h>           //Include the MMA_7455 library

MMA_7455 accel = MMA_7455();    // Make MMA7455 object
int SpeakerOut = 9;
char xVal, yVal, zVal;          // Return value variables


void setup() {
  Serial.begin(9600);           // Use the Serial Monitor window at 9600 baud
  
  // Set the g force sensitivity: 2=2g, 4=4g, 8=8g
  accel.initSensitivity(2);
  
 // Provide offset values so that the sensor displays 0, 0, 63
 //  (or close to it) when positioned on a flat surface, all pins
 //  facing down
 
 // Update the numbers with your own values from the MMA7455_CalibrateOffset sketch.
  accel.calibrateOffset(0, 0, 0);

  pinMode(SpeakerOut, OUTPUT);
}

void loop() {
 
  // Get the X, Y, and Z axis values from the device
  xVal = accel.readAxis('x');   // Read X Axis
  yVal = accel.readAxis('y');   // Read Y Axis
  zVal = accel.readAxis('z');   // Read Z Axis
  
  // Display them in the Serial Monitor window.
  Serial.print("X: = ");
  Serial.print(xVal, DEC);
  Serial.print("   Y: = ");
  Serial.print(yVal, DEC);
  Serial.print("   Z: = ");
  Serial.println(zVal, DEC);

  // Map the Y-axis tilt to an audible pitch and play it on the speaker.
  int frequency2 = map(yVal, -90, 90, 100, 680);
  tone(SpeakerOut, frequency2);

  delay(1000);
}

   
   

Persistence Attempt

For the Dynamic Web Development course at ITP I made servers that take in and output data. I attempted to use MongoDB (a database) and JSON files to store the values users entered into the text fields of the apps I created.

 

Approach one: MongoDB

While trying to run this server, I ran into an issue with the "save" property on line 114, db.mycollection.save({"name": textvalue}, function(err, saved) {. This was confusing, since I only had 123 lines in my code.

I tried other ways to write the code, but to no avail. Every MongoDB tutorial was slightly different: some suggested npm installing mongoose, others instructed me to download Robo 3T, and nothing seemed to work.
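I never got this working, but for reference, here is a minimal sketch of the insert I was attempting, written against the official Node MongoDB driver's insertOne (newer drivers dropped save); the connection string and database name are placeholders.

const { MongoClient } = require('mongodb');

async function saveName(textvalue) {
  const client = await MongoClient.connect('mongodb://localhost:27017'); // placeholder URL
  const db = client.db('mydatabase');                  // placeholder database name
  await db.collection('mycollection').insertOne({ name: textvalue });
  await client.close();
}

saveName('katya').catch(console.error);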

 

Approach two: JSON files

I attempted to save data to a JSON file and simply render that data file in the ejs template. I watched these videos by Coding Rainbow and referenced this Stack Overflow page. However, that didn't seem to work: the names.JSON file in my "public" folder did not seem to receive any of the data and remained unaltered. I have yet to make this work. For now, the data collected lives only in the server's temporary memory, but the app is able to spit some of the input back out as part of a story. For now, that's as dynamic as my web development gets.
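Here is a minimal sketch of what I was trying to do with the JSON file (the path is from my project, the rest is illustrative): read names.JSON, append the new name, and write the whole file back out with Node's fs module.

const fs = require('fs');
const file = 'public/names.JSON';   // my data file

function addName(name) {
  // expects the file to already contain a JSON array, e.g. []
  const names = JSON.parse(fs.readFileSync(file, 'utf8'));
  names.push(name);
  fs.writeFileSync(file, JSON.stringify(names, null, 2));
}

addName('katya');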

 

Next Steps

I'll make more attempts at persistence soon. I need to figure out which way is better and easier: the database or JSON. My hunch is that figuring out MongoDB will pay off, because it's much more elegant and private than simply collecting all the user input and letting it live unsecured on a public server.

The result is something like this

Screen Shot 2018-03-16 at 10.35.06 PM.png

All the code can be found on my Github.