Pattern and Sound Series - p5.js and Tone.js

Using p5.js and Tone.js, I created a series of compositions that reconfigure in real time based on input from a Kinect motion sensor and generate soothing, energizing synthetic sounds that match the visual elements of the piece. People in front of the projected visuals and sensor, in effect, create sonic compositions simply by moving, almost like a conductor in front of an orchestra but with more leeway for passive (if one simply walks by) or intentional interaction.

The sound is easy to ignore or pay attention to. It is ambient and subtle. 

The next steps are to create more nuanced layers of interactivity that allow for more varied manipulation of the sound and the visuals. Right now I envision the piece becoming a subtle sound orchestra that one can conduct with various hand movements and locations on the screen. 

Composition 1: Ode to Leon's OCD

In the editor.

Composition 2: Allison's Use of Vim Editor

Mouseover version for testing on the computer.

In the editor using Kinect.

Composition 4. "Brother"

In the p5js editor.

For this syncopated piece I sampled sounds of tongue clicks and claps found on freesound.org. No Tone.js was used in this piece.  

Composition 5. "Free Food (in the Zen Garden)"

For this composition I used brown noise from Tone.js. 

In the p5js editor.

After using mouseOver for prototyping, I switched over to Kinect and placed the composition on a big screen to have the user control the pattern with the movement of the right hand instead of the mouse. 

 

Sound

I'm using Tone.js, written by Yotam Mann.

Visuals

P5js and Adobe Illustrator.

Final Compositions

[Image: final pieces laid out]

 

Initial Mockups

 

Sketches of ideas

 

Inspiration

The simple grid-like designs I am creating are inspired by Agnes Martin's paintings. 

Minimalist and sometimes intricate but always geometric and symmetrical, her work has been described as serene and meditative. She believed that abstract designs can elicit abstract emotions from the viewer, like happiness, love, and freedom. 

 

Code

Composition 1: "Ode to Leon's OCD" using mouseover code.

Composition 2: "Allison's Use of Vim Editor" with Kinectron code, using Shawn Van Every and Lisa Jamhoury's Kinectron app and adapting code from one of Aaron Montoya-Moraga's workshops.

Composition 2 with mouseover for testing on computer.

Composition 3: "Isobel" with mouseover.

Composition 4 "Brother" with mouseover.

Composition 5. "Free Food (in the Zen Garden)"

Sound Sculpture Playground

I am working on an interactive sonic playground that allows seeing and non-seeing individuals to create - and optionally to collaborate with others on creating - sonic compositions in real time by setting kinetic sculptures into motion, either by touching the sonic objects or by wearing them on their bodies. I began this ongoing project last year at NYU's ITP (Interactive Telecommunications Program) and will work on it again this year.

Last year I learned how to take sensor data from accelerometers and other motion sensors and feed it into Max MSP to create real-time generative sound compositions that incorporate sensor data into their algorithm and allow it to impact how the instrument will sound in the future. These ever-changing and impressionable compositions live inside the objects on a Raspberry Pi and connect to speakers/headphones via Bluetooth. Snippets of the surrounding space are sometimes captured and fed into the algorithm as well, making the sound of the space part of the permanent DNA of the objects, surfacing again in future compositions. In this way, the sonic objects aren't just instruments but entities that change over time along with us and are affected by us.

So far I have simple prototypes that work, and I now have a second year at ITP to create a whole collection of these "instruments", each one with its own "character" - its own behaviors and sonic qualities. I intend to use a variety of materials and textures that are interesting to touch and that are durable - plush, inflatable, conductive string fibers, resin, resin foam, 3D-printed PLA, and even cement.

The collective soundscapes that result from people engaging with the objects and sometimes with each other are portraits of those interactions and spaces, and a testament to the collaboration, creativity, and play that are so important for human development and self-reflection. These qualities separate us from other animals. This sort of tactile, sonic, mindful making in a space that is highly abstracted from familiarity and removed from the bustle of everyday life is also intended to be a place of meditation that the modern human so desperately needs.

An Egg and a Banana but Abstract (Sound Object Series) at ITP Spring Show 2018.

Some recent additions

First iteration. A roly-poly pair.

The sound playground is a group of kinetic sculptures whose sonic behavior changes over time and whose interactions cannot be predicted with certainty. It encourages play, exploration, calm, and the relinquishing of complete control in the creative process. Fascinated by the human ability to remain playful throughout the life cycle, I aim to build a sonic playground that allows people of all ages to create subtle, ever-changing, and sometimes surprising sonic compositions by setting kinetic sculptures into motion.
In addition to play, I am examining human interactions with things that they cannot control and that learn and change over time. The objects absorb bits of the surrounding noise and insert these bits into the existing algorithmic composition, somewhat like new snippets of DNA. An object's program also changes in response to how it was moved. The experiences of an object are incorporated into its future behavior, but it changes alongside us using its own logic. The interactions invite users to examine what it means to them to be in control, especially in control of the creative process. Can we accept this degree of independence and be at ease with it?
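The DNA-splicing idea can be sketched in a few lines of Python. This is an illustrative model only - the real behavior lives in the Max MSP patches and Raspberry Pi code - and the motif names are made up:

```python
import random

def absorb_snippet(composition, snippet, seed=None):
    """Insert a captured ambient snippet at a random point in the
    composition's motif sequence, so the environment becomes part
    of the object's permanent "DNA"."""
    rng = random.Random(seed)
    i = rng.randrange(len(composition) + 1)  # insertion point
    return composition[:i] + [snippet] + composition[i:]

# Example: motifs are just labels here; in practice each would be an audio clip.
motifs = ["drone_a", "pluck_b", "drone_c"]
motifs = absorb_snippet(motifs, "room_noise_snippet", seed=7)
```

Each absorbed snippet stays in the sequence, so later playback always carries traces of earlier rooms.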
 

Parts of the project that are done: 

For the spring show I created objects that emit sound and record sound that changes the composition over time. This is a good start.  See documentation images and video at the top of the page. 

 

To Do: 

1. Use Rhino to model the notched shape, or create it out of styrofoam?  

2. Find a good stopper to close all of the enclosures, including the existing paper mache ones.

3. Finish writing the Python program.

4. Set up a way to modulate the synth sound without Max.

5. Decide whether I want to use a Raspberry Pi and go wireless. 

  

Accelerometer 

Above is an early prototype of an accelerometer modulating audio. 

 

Enclosure shapes

I decided on six shapes, hoping to make three by the Spring Show.

 

1. Roly-poly with a marble inside to lengthen the duration of the movement.

Going for a Brancusi meets Ernesto Neto vibe here. First one is Ernesto:

Another earlier prototype of roly-poly behavior, made of a balloon, tape, plaster, and paint.

Sliced off the top, to be filled with a speaker, accelerometer, and Raspberry Pi. 

A Japanese handsaw at the ITP show did the trick. Next step is using hard cardboard or a wooden plank to create a circular closure with slots that will help let out sound. 

A leaner! :) Kind of nice how it can lean against a wall. 

 

2. A notched, egg-crate-inspired, boat-like shape that rocks back and forth. A marble inside will lengthen the duration of the movement. Right images: inspiration from Amanda Martinez, currently on view at Victori + Mo. She uses hand-carved wood and plaster and sometimes casts these assembled shapes in resin. Her sculptures are solid, but I want mine to be hollow, or at least to have a closable opening big enough for electronics.

 

3. A shape with facets that plays different qualities of sound depending on which facet it rests on.

 
 

Future shapes

 

4. A pendulum (also maybe with a marble or some magnetic disruptor?)

 

5. If possible, a shape pair with visual indicators (matching colored flat surfaces) showing that the pair belongs together. The two units will have proximity sensors (RFduinos) that will modulate the sound as they are moved closer together or farther apart.


Far future

6. Plush shapes and plush sound suits inspired by Ernesto Neto’s Humanoids. 


7. One shape that has a fabric, soft aspect, like a frond-like tassel or mohawk-like tuft. The last three images are by the artist duo Chiaozza.

Below is the first sculpture of the group. 

 

 

8. Shapes that are filled with sand or marbles and make a sound when you turn them over.  

 


 

Color

Some color ideas are below. As you can see in the final prototype, I ended up not going with these palettes this round.

Currently I am creating paper mache prototypes and experimenting with color.

Inspirations

 
 

Electronics

List of things to have:

  • 3 Arduinos, or Raspberry Pis?
  • 3 accelerometers
  • Learn Rhino and make or download the lip of a jar or a screw cap to 3D print, then build on top of them with paper mache. 
  • Extra battery
  • 3 speakers 

The goal of this project is to explore how to modulate synthesized sound emitted from objects set off into motion. At one point I added a layer of complexity - to change the sound not only based on the movements but also based on the proximity of objects. Due to the complexity of the task, I was advised to scale down the project and leave out the interaction between the objects themselves and focus on the user-object interaction for the first step.

The objects are currently 3D-printed and paper mache hollow enclosures of various organic and geometric shapes with weighted bottoms. Each one is equipped with an Arduino Uno, an accelerometer, and an HC-05 Bluetooth module used to communicate the objects' acceleration to Max MSP, the sound-generating software on my laptop. 
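On the laptop side, reading and mapping those acceleration values might look like the sketch below. The "x,y,z" line format and the ~1 g rest magnitude are assumptions for illustration, not the exact protocol the project uses:

```python
import math

def parse_accel_line(line):
    """Parse one serial line of the form "x,y,z" (raw accelerometer
    readings, as the Arduino might send them) into three floats."""
    x, y, z = (float(v) for v in line.strip().split(","))
    return x, y, z

def motion_amount(x, y, z, rest=1.0):
    """Map acceleration magnitude to a 0..1 modulation value,
    treating ~1 g at rest as zero motion."""
    mag = math.sqrt(x * x + y * y + z * z)
    return max(0.0, min(1.0, abs(mag - rest)))
```

A value like this can then drive any single synthesis parameter in Max MSP, e.g. filter cutoff or vibrato depth.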

 

Sounds

Using FFmpeg and a Python script, I am finding ways to insert chopped-up sounds into existing chunks so that the composition of one of the shapes is always evolving, taking abstracted input from the environment. 
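One way to drive FFmpeg from Python is to build command lines and hand them to subprocess. The helpers below are a minimal sketch under that assumption - the filenames and timings are placeholders, not the actual chopping script:

```python
import subprocess  # each command list below would be run with subprocess.run(cmd)

def trim_cmd(src, start, dur, out):
    """FFmpeg command that cuts a dur-second snippet starting at `start`."""
    return ["ffmpeg", "-ss", str(start), "-t", str(dur),
            "-i", src, "-c", "copy", out]

def concat_list(parts):
    """Contents of the list file used by FFmpeg's concat demuxer."""
    return "\n".join("file '{}'".format(p) for p in parts) + "\n"

def concat_cmd(list_file, out):
    """FFmpeg command that splices the listed snippets back together."""
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list_file, "-c", "copy", out]
```

Trimming a captured room-noise snippet and concatenating it between existing chunks is enough to make a composition file that slowly accumulates its environment.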

Sculpture 1:

Abstract organic mashup of environment over ambient noise

Example of very rough current file. Help from Roland Arnold and Leon Eckert. 

Some sounds similar to these in my Sound and Pattern series project

Possibly also these ambient string instruments, which are generated by an algorithm that is altered when movement of the sculpture is detected by the accelerometer.

 

Sculpture 2:

Percussion - different rhythms on different sides/planes of the multifaceted enclosure.

 

Sculpture 3:

Synth or low drone that is modulated with movement of the roly-poly shape.

Possibly also these ambient string instruments, which are generated by an algorithm that is altered when movement of the sculpture is detected by the accelerometer.

 
 

Functionality

Problem & Solution 1

Another simplification I made to the project is that I took the Bluetooth speakers out of the objects. They posed a problem because Max MSP runs on the computer, and the computer can only maintain one Bluetooth connection at a time.  

Problem & Solution 2

Originally I wanted to use an RFID controller and two RFID transponder stickers to measure the proximity of the objects to one another, but I could not find adequate documentation even with the help of a physical computing resident, Aaron Parsekian (he was very helpful on other aspects of the project). Furthermore, I was advised that the transponders would require extremely close proximity to one another, like a key card reader, which posed a problem for my project. I am interested in exploring how objects can change sound at much greater distances - ideally somewhere between 1 and 5 feet. For that I was advised by my instructor Ayodamola Okunseinde and by Tom Igoe (professor at ITP and cofounder of Arduino!) to use an RFduino - a radio-frequency Arduino that could be programmed to work within that range. I decided to leave this for future projects.

I was grateful and delighted to work with Aaron Montoya-Moraga on creating wireless communication between the Arduino and Max MSP. He is a sound artist and resident researcher here at ITP who has extensive experience with Max MSP (I have taken several of his workshops) and who has an engineering background. Our documentation for this project is here.

Sketches of shapes

Potential ways to showcase the sound objects:

In the first iteration, I had a pair, represented below. Because the sound comes from the laptop, it is possible to use headphones in a crowded exhibition, like the ITP show, for example. 


Another way for the sculpture to live is to sit on a bookshelf and emit relaxing drone sounds upon activation.

 

Fabrication

The first iteration was 3D printed. I made a number of paper-mache ones recently, and I plan to use styrofoam and paint to create some of the more complex structures. 


I decided to make a 3-D model of an irregular oblong shape, reminiscent of an egg, using Rhino software.  Chester Dols helped me create these shapes in Rhino and showed me how to use the 3d printer. He's super helpful and I admire his work. 

First, I printed the enclosure at 20% to see if it is structurally sound:


I then printed it full size (about 5 inches in diameter) and had one successful print, followed by 4 failed attempts. The only printer that would print these is the .6 nozzle, which was then never free again. I have yet to schedule it and attempt to print the rest of the enclosures full size. Alternatively, I may try to print smaller enclosures on the .6 nozzle Ultimakers. 


Another variation of these shapes is this:

 

The components


I'm using two accelerometers and RFID controllers to modulate the sound that emanates from the two objects. 

Sound

Option 1

I am using Max/MSP to create synth patches, which will be modulated by the serial input from the two accelerometers (via Arduino 101, which has a wifi shield) in the sculptures. The sculptures are roly-poly toy shapes that wobble when nudged - this is what creates the accelerometer values. Max will send the sound output via Bluetooth to speakers that rest inside the sculptures themselves. Concern: I am a little worried there will be lag when using wifi and Bluetooth to send data back and forth from the computer. 

[Diagram: version 1, with Max]

 

Option 2

Another option, which might produce immediate sound upon touch, would be to use micro SD cards in an Arduino Uno. I am not sure if I could use the serial values of an accelerometer to modulate pre-recorded sounds. 

Another concern is that the Bluetooth speakers aren't loud enough. I will test this out. I may need to get an amplifier. 

[Diagram: version 2, with SD card]

 

 

Code for Accelerometer 1: MMA_7455


 

Code for Accelerometer 2 - ADXL335 - 5V

Fiber Sculpture Series

This work is part of a larger project - a sonic playground. The Sound Playground is a tactile, interactive sonic playground for people of all ages and abilities (varying motor and seeing skills) to create sonic compositions in real time, alone or collectively. 

Probably the front 

 

This is the first of a series of sewn soft sculptures made from fabric. The fabric and tassel covers (and, in the future, faux fur) envelop enclosures with weighted bottoms. These are essentially Weeble wobbles that I may add a sound component to.

Side view  

Party in the back  

The troupe plan

Fresh pile of pool pipe (It's Garbage! Series)



Garbage item 1

 Fresh pile of found pool pipe // 2018 3x3x2’. 4” di, 36’ l.

“The dyadic alterations of leaf and air make the frond shimmer and move, even when it stays still.”
- Elaine Scarry on the beauty of palms in “On Beauty And Being Just”.  

The striations of this pool coil left out with the garbage caught my eye. The mesmerizing black slivers between white rings were like the fronds of a palm that Elaine Scarry describes in “On Beauty and Being Just”. I lugged it home to wash it and twist it into knots. It turned out just like I imagined. The huge knot holds itself in place, but the lines are in constant motion, both in the way the tube bends and in the optical effect of the stripes on the circumference. 

 

Purpose:

I'm experimenting with using found and discarded materials to create objects with interesting textures that can be touched and will eventually all come together in a tactile sound sculpture garden/playground that will serve the general public as well as those who are hard of seeing.

Note:

Upon seeing this documentation, several people told me they thought this was a rendering in a 3D program, but it’s really just an analogue optical trick. And a treat! It’s a delight to look at in person and get lost in making sense of it, following its curves and seeing it appear to animate with just a slight movement of the body.  

 

Garbage item 2

Fruit carton snake

Taken out of context and placed on a wall, the cartons look like an abstract composition, a singular entity. The modularity of this sculpture is what allows the cartons to form one uniform and flexible whole. The viewer is presented with the familiar and unfamiliar at the same time, which creates an interesting dissonance in the mind. 

Data Sonification - Earth's Near Deaths

Project for week 4 of Algorithmic Composition, by Nicolás Escarpentier, Camilla Padgitt-Coles, and Katya Rozanova.

 

Overview

Today our group met up to work on sonifying data using Csound. At first we were planning to build on the work we did on Sunday, where we created a Markov chain to algorithmically randomize the "voice" or lead flute sound from a MIDI file of "Norwegian Wood" over the guitar track using extracted MIDI notes and instruments created in Csound.

Our plan for the data sonification part of the assignment was to also take the comments from a YouTube video of the song and turn them into abstract sounds which would play over the MIDI-fied song according to their timestamp, using sentiment analysis to also change the comments' sounds according to their positive, neutral or negative sentiments. However, upon trying to implement our ideas today we found out that the process of getting sentiment analysis to work is very complicated, and the documentation online consists of many forums and disorganized information on how to do it without clear directives that we could follow.

While we may tackle sentiment analysis later on either together or in our own projects, we decided that for this assignment it would suffice, and also be interesting to us, to use another data set and start from scratch for the second part of our project together. We searched for free data sets and came across a list of asteroids and comets that flew close to earth here (Source: https://github.com/jdorfman/awesome-json-datasets#github-api).

We built 9 instruments and parsed the data to have them play according to their 9 classifications, their dates of discovery, and their locations over a 180-degree angle, with each sound recurring algorithmically at intervals over the piece according to its period of recurrence. We also experimented with layering the result over NASA's "Earth Song" as a way to sonify both the comets and asteroids (algorithmically, through Csound) and the Earth (which they were flying over). The result was cosmic, to say the least (pun intended!)

Here are the two versions below.

 

Python script

By Nicolás Escarpentier, found here.

For each asteroid or comet on the file, we extracted some common characteristics to set the sound parameters. The most important aspect is to portray how often they pass near the Earth, so the representation of the time has to be accurate. We set an equivalence of one month = 5 seconds and a year multiplier of 12 months, in case we wanted to make a longer year to introduce longer periods of silence on the score. The audio file starts on Jan 1, 2010 - the earliest year from the acquired data set. Each rock's discovery date sets its first occurrence on the score, and each occurrence repeats itself according to its period_yr (except for the 'Parabolic Comet', which doesn't have a return period).

import dateparser  # parses the discovery_date strings

month_interval = 5.  # in sec
year_mult = 12       # multiplier (how many months in a year)

for a in aster_data:
    # get raw data
    dt = dateparser.parse(a['discovery_date'])
    yea = dt.year     # starting time
    mon = dt.month
    day = dt.day

    # first occurrence (starting in 2010)
    start = ((yea - 2010) * year_mult + mon + day / 30.) * month_interval

    # each recurrence (recur is the rock's period_yr)
    start += recur * year_mult
For the other parameters, we selected characteristics that gave us some expressive possibilities. The pitch of each rock is based on the orbit's angle (i_deg), the instrument on the orbit_class, and the duration on q_au_1 (which we have no idea what it actually represents). For the scale of this score, we chose B-flat minor, in reference to the sound of a black hole and the "lowest note in the universe".
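The angle-to-pitch mapping can be sketched like this. Our exact mapping lives in the linked Python script; the base octave and two-octave range here are illustrative assumptions:

```python
BFLAT_MINOR = [0, 2, 3, 5, 7, 8, 10]  # semitone offsets of the natural minor scale

def angle_to_midi(i_deg, base_midi=46, octaves=2):
    """Map an orbit angle (0-180 degrees) onto a note of the
    B-flat minor scale spanning two octaves above Bb2 (MIDI 46)."""
    steps = len(BFLAT_MINOR) * octaves
    idx = min(int(i_deg / 180.0 * steps), steps - 1)
    octave, degree = divmod(idx, len(BFLAT_MINOR))
    return base_midi + 12 * octave + BFLAT_MINOR[degree]

def midi_to_hz(m):
    """Equal-temperament conversion, A4 = 440 Hz = MIDI 69."""
    return 440.0 * 2 ** ((m - 69) / 12.0)
```

A near-flat orbit lands on the lowest Bb; the steepest orbits land near the top of the scale.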

Instruments

Camilla, Nicolás, and I created nine instruments using Csound.

The first three corresponded to the three most commonly occurring meteors and asteroids. These are subtle "pluck" sounds. A pluck in Csound produces naturally decaying plucked-string sounds. 

The last six instruments consisted of louder, higher-frequency styles.
Instrument four is a simple oscillator. 
Instruments five, six, and eight are VCOs, analog-modeled oscillators with a sawtooth waveform. 
Instrument seven is a VCO with a square waveform. 
Instrument nine is a VCO with a triangle waveform. 

linseg is a Csound opcode we used to add some vibrato to instruments 6 - 9. It traces a series of line segments between specified points. These units generate control or audio signals whose values can pass through 2 or more specified points.
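To make that behavior concrete, linseg's semantics can be mimicked in Python. This sketch takes a Csound-style flat list of value, duration, value, duration, ..., value:

```python
def linseg(t, points):
    """Evaluate a piecewise-linear envelope at time t (seconds).
    `points` is a flat Csound-style list: value, dur, value, dur, ..., value."""
    vals = points[0::2]   # breakpoint values
    durs = points[1::2]   # segment durations
    elapsed = 0.0
    for i, d in enumerate(durs):
        if t <= elapsed + d:
            frac = (t - elapsed) / d if d > 0 else 1.0
            return vals[i] + (vals[i + 1] - vals[i]) * frac
        elapsed += d
    return vals[-1]  # hold the last value after the final segment

# A 0 -> 1 -> 0 vibrato-depth envelope over two seconds:
# linseg(0.5, [0, 1, 1, 1, 0]) rises halfway to 0.5
```

In the orchestra, the same breakpoint list shapes the vibrato depth of instruments 6 - 9 over each note.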

Each instrument's a-rate takes variables p4, p5, and p6 (which we set to frequency, amplitude, and pan) that correspond to values found in the JSON file under each instance of a meteor/asteroid near Earth. The result is a series of plucking sounds with intermittent louder, higher-frequency sounds with some vibrato. The former represent the more common smaller meteors and asteroids; the latter represent the rare asteroid and meteor types. 

Meteor art by SANTTU MUSTONEN, which I manipulated using Photoshop. Accidental coding poetry by Nicolás Pena-Escarpentier. Photo by me.

Description of our code by Nicolás E. ~ See the full project on GitHub here


AI-generated voice reading every comment on a 2-hour Moonlight Sonata video on YouTube + reading Amazon reviews of Capital Vol. 1

Code is on GitHub.

It was a pleasure performing this piece at Frequency Sweep #2 at Babycastles in New York along with fellow ITP students and alumni (see video below).

How I did it

In this project for Detourning the Web at ITP, I scraped YouTube comments from a 2-hour looped version of the first part of the Moonlight Sonata and placed them into a JSON file. I then used Selenium, a browser-automation library for Python, to write a script that uploads the comments from the JSON file into Lyrebird, which reads each comment out loud in my own AI-generated voice. I had previously trained Lyrebird to sound like me, which adds to the unsettling nature of the project. I based my Selenium code on the code that Aaron Montoya-Moraga wrote for his automated emails project.

 

The concept

The work explores themes of loneliness, banality, and anonymity on the web. The comments read out loud give voice to those who comment on this video. The resulting portrait ranges from banal to uncomfortable to extremely personal to comical and even endearing. Online communities have been an interesting place to observe humanity. Often, it’s where people say things they refrain from discussing in the open.

The piece is meant to be performed live. The screen recording below shows what the experience is like. 

 

___

 

CAPITAL VOL 1 Reading

This is a separate but similar project that also uses Selenium.

For Capital Volume 1, I had Lyrebird simply read its Amazon reviews one by one. I'm interested in exploring online communities and how they use products, art, or music as a jumping-off point for further discussion and as forums for expressing their feelings and views. Often people say things online that they cannot say anywhere else, and it's an interesting way to examine how people view themselves and their environment. 

The piece is meant to be performed live. The screen recording below shows what the experience is like. 

Joan Lamenting a Species Goodbye

 

This is a video that I created using VidPy and FFmpeg for Detourning the Web at ITP.

Maria Falconetti is sampled here from her role as Joan of Arc in Carl Theodor Dreyer's 1928 silent film, La Passion de Jeanne d'Arc, and represents our generation saying goodbye to the polar bear species in the wild. We see ourselves in Maria Falconetti's anguish as we reflect on how negligent capitalist practices have set environmental decline into motion. The last scene shows a polar bear waving its paw at the viewer, but in a clearly artificial, jerky way that VidPy allows for. The viewer understands that the polar bear is not actually waving, but the effect is comically heart-wrenching.  

VidPy is a Python video editing library developed by Sam Lavigne.

Other videos used:
 Boston Dynamics BigDog on the beach from YouTube,
 fastest man to run on all fours from YouTube,
and polar bear footage from YouTube.

Frenemies in the Greek Gallery/Comoediae Agni

Museum Frenemies/Ram Bearer's Comeuppance/Comoediae Agni

 

Scene 1. "brevis victoria"

The original sketch was for scene 1: Lamb Bearer's Comeuppance. The title became "Brief Victory" when I added the final part of the scene, which dethrones the lamb from the man's body. 


Additional end of scene 1, where the lamb realizes its victory was short.

[Sketch for panning, 4th scenario]

 

Scene 2. "speranza. giocare"

The lamb grabs the man's head and shakes it with satisfaction in its teeth, like a dog that caught a squirrel or a dog toy. It wags its tail with glee. This continues. 

Scene 3. "exhalationem spiritus"

The lamb blows at the man's head ("TBTBTBB") playfully. The head simply falls off. It's possible that the bearer sculpture was never alive like the lamb sculpture. Did the lamb ever have a friend or foe? It vomits in despair, or possibly lets out its spirit. At the end of the scene the wings and the trumpet are uplifting. 

 

[Sketch for panning, 3 scenarios]

 

I am thinking of adding another few frames dedicated to the lamb vomiting up a strong stream of some sort of life force (kind of like Lynch's Garmonbozia, pain and suffering). It will come out in the form of mist that isn't affected by gravity and sort of floats out, before or after the lamb opens its eyes wide.


References and source materials

In making the video I used the following images: lamb bearer - Kriophoros, Nike of Samothrace (wing), and Deux chiens de Chantilly by Fanfareau and Brillador (tail).

 

Sound Effects

The sound will be slapstick and unsubtle, like in Terry Gilliam's animations. 

 

Inspiration

Terry Gilliam's animations for Monty Python. 

Cyriak's "Baa"

And another animation that I have yet to find, shown to us in class. 

Where are we? What's happening?

A mother and child looking onto the world as if to ask, "What is happening? Where are we? Are we dead or alive?" Digital illustration. Inspired by a poster I saw in Slovenia last summer; I still need to find the artist's name. Also inspired by Polish poster design and Roman Klonek's illustrations. 

Ram Bearer in P5JS (with sound)

I had mocked up this ram-bearer and ram combo quickly in Photoshop, using its timeline function, just to get a feeling for placement. I then created the animation above using the PNG files with transparent backgrounds and the translate function.

Here is the gif animation I had created initially:

The sketch for the idea came out of an animation timeline I created for Animation class:


 

Why buttons

Initially I wanted to drag and drop the images, but I had trouble with the code for that, so I opted to make buttons. My thought was to make them transparent and just have the user click the heads to move them, but I realized I prefer the Web 2.0 aesthetic of the buttons floating right over the image. It breaks the seriousness of the emotions that could be evoked by the sculpture (uneven power dynamics, helplessness, etc.) and creates a feeling of agency and ease, as if to say, "Wouldn't it be nice if changing power dynamics could be as easy as pressing a button!"

 

Sound

Another use for the buttons is the sound I tied to them. For the lamb button I used the Rhinoceros sound and for the man button I used the Screaming Alien sound. Both are from FreeSound.org.

 

Code

var imgBase; // Declare variable 'imgBase'.
var imgLambHeadUpright;
var imgLambDownSleeping;
var imgLambDownOpenEyes;
var imgLambLookingRight;
var imgLambLookingAtMan;
var imgBlank;
var imgHeadMan;
var i;
var lambState = [];
var x;
var y;
var xLamb;
var yLamb;
//var buttonMan;
//var buttonLamb;
var rhino;


function setup() {
  createCanvas(540, 730);
  rhino = loadSound("350424__benjaminharveydesign__rhinoceros-trumpeting-angry.wav");
  blabber = loadSound("188592__alex-audio__screamalien-norberd (1).wav");
  x = 0;
  xLamb = 0;
  y = height / 8;
  yLamb = height / 8;
  imgBase = loadImage("bodies_noheads_base.png"); // Load the image
  imgLambHeadUpright = loadImage("lamb_head_upright.png");
  imgLambDownSleeping = loadImage("lamb_head_sleeping.png");
  imgLambDownOpenEyes = loadImage("lamb_head_open_eyes.png");
  imgLambLookingRight = loadImage("lamb_head_looking_to_right_softer.png");
  imgLambLookingAtMan = loadImage("lamb_head_looking_at_man.png");
  imgBlank = loadImage("blank.png");
  imgHeadMan = loadImage("head_man.png");

  var i;

  // note: i is never assigned, so this stores the array at lambState[undefined]
  // rather than at a numeric index (see "Struggles" below)
  lambState[i] = [imgLambHeadUpright, imgLambHeadUpright, imgLambDownOpenEyes, imgLambLookingRight,
    imgLambLookingAtMan, imgBlank];


  buttonLamb = createButton('Lamb');
  buttonLamb.position(420, 220, 4, 4);
  buttonLamb.mousePressed(nextLambState);

   buttonMan = createButton('Man');
  buttonMan.position(180, 230, 4, 4);
  buttonMan.mousePressed(nextManState);


}


function nextLambState() {
 

 //moving on the horizontal axis
  xLamb = xLamb -30;
  // Moving up at a constant speed
  yLamb = yLamb -1;
  if (rhino.isPlaying() == false) {
      console.log("play!");
      rhino.play();
    }
  }


function nextManState() {
 
 
  // moving on the horizontal axis
  x = x -30;
  // Moving up at a constant speed
  y = y - 14;
  if (blabber.isPlaying() == false) {
      console.log("play!");
      blabber.play();
    }
  
  
  }

function draw() {
  
   
 background(10, 245, 95);
// background(245);
//   background(224, 254, 75);

  // Displays the image at its actual size at point (0,0)
  //image(imgBase, 0, 0);
  // Displays the image at point (0, height/2) at half size
  image(imgBase, 0, height / 8, imgBase.width / 1.7, imgBase.height / 1.7);
  image(imgLambLookingRight, xLamb, yLamb, imgLambLookingRight.width / 1.7, imgLambLookingRight.height / 1.7);
  image(imgHeadMan, x, y, imgHeadMan.width / 1.7, imgHeadMan.height / 1.7);

  //invisible button for man head options
  // noStroke();
 // fill(255, 255, 255, 0);
 // ellipse(260, 200, 160, 160);


  //invisible button for lamb head options

  // noStroke();
  //fill(255, 255, 255, 0);
  //ellipse(420, 220, 160, 160);
}
 
 

Struggles and Potential Changes

Struggle  1

I like how this experience turned out but would like to, in addition, if time allows, explore my initial idea. 

Initially I wanted to create buttons that switched the lamb and man head positions (there are several PNGs, some with the head positioned a certain way, some with eyes closed). I think it would be fun to make a mix-and-match situation. Perhaps the user wants the sculpture to have no heads at all, or maybe they fancy that both heads are of the lamb, or both of the man. I tried using the display and hide functions, but to no avail. I will look into it for future projects. 

I also tried using arrays in order to change the state of the elements on mouse clicks, but I could not conjure up the images when I called on these arrays. The only way I managed to have images show up was using the image command, which is not dynamic. I tried printing to see if the array is even "taking" the image I assigned to it, but it looks like maybe the array is empty:

 

 

Here are some of the permutations, some with eyes open and some closed that I would have liked to appear for the mix-and-match version of this work:

 

Struggle 2.

I wanted to add psychedelic colorful strips behind the PNG images, but when I did, they made the PNG images disappear. I could add the boxes on top, however, which looked cool but was not what I wanted. 


 

Inspiration and sources

I used a photograph of the Greek Kriophoros - Lamb Bearer - statue. There are many similar ones across cultures, but this one appealed to me because it's the most classic for the Western world, and I feel more comfortable defacing it. This is probably why Terry Gilliam used classic statues from antiquity also. Terry Gilliam was the inspiration here - the slapstick, irreverent, over-the-top aesthetic is borrowed from him. 

It's possible that Jodorowsky's themes of the slave becoming the master, or the son killing the father to take his place, have also played a role. Both Jodorowsky and Gilliam are pretty playful, and this is why I like their work. 

Fangs using Giphy Cam

Probably the best art I've ever made. Thank you giphy cam! #giphycam 
