On Opacity

Sitting between video essay and exhibition, this work explores the various ways people hide themselves and objects, and the ends those concealments serve.

Opacity can be unsettling or frightening, whether it is something sinister, like soldiers in military camouflage waiting in ambush, or merely eerie, like a mysterious object wrapped in an opaque covering. On the other hand, opacity is highly sought after for safety.

To complicate things, people's attempts to hide, merging with bushes and the like with varying degrees of success, can also be quite (often darkly) humorous. Monty Python's "How Not to Be Seen" (Flying Circus, season 2, 1970) and Hito Steyerl's HOW NOT TO BE SEEN: A FUCKING DIDACTIC EDUCATIONAL .MOV FILE (2013) are inspirations.

Some of the questions the work attempts to ask: who suffers being watched (monitored, targeted), and, conversely, who gets to be seen, in full complexity and safety? Who decides what is hidden and what isn't? Playing on the desire to camouflage, I nod toward every being's desire for, and right to, opacity, inspired by Édouard Glissant's ontology of relation ("On Opacity", Poetics of Relation, 1998), which is where the title comes from, and by other artists who work with both hiding and revealing identity (Nick Cave's Soundsuits, for example) as a way to put forth a more complex view of one another.

The video uses free 3D objects found in Blender resources.

I created the sounds using samples:

Granular synthesis of the Barcarolle from The Tales of Hoffmann (Cuent - Kathryn Meisle and Marie Tiffany), from the Internet Archive

Granular synthesis of hippo recordings found on Freesound.org

Triptych

In progress. A video collage using Paris protest footage from @Decolonizethisplace, cows in a Slovenian village (Velika Planina), and Pussy Riot's invasion of the World Cup field in 2018, all superimposed onto a 3D scan of a medieval triptych. Originally inspired by the triple-window fire in the Paris protest footage.


Worlds

I used Cinema 4D to create an abstracted world of opacity, new identities, and other posthuman possibilities. This is part of a world-making project that speculates about futures in figurative (and sometimes prefigurative) ways.

Inspired by Maurizio Cattelan's gilded toilet (America, 2016) at the Guggenheim Museum in New York. Made in Cinema 4D.

Another part of the world-making video project.

AI-generated voice that learned to sound like me reading every comment on 2-hours of Moonlight Sonata on YouTube + reading Amazon reviews of Capital, Vol. 1

It was a pleasure performing this piece, AI-generated voice that learned to sound like me reading every comment on 2-hours of Moonlight Sonata on YouTube, at Babycastles in New York in 2017.


In this project I scraped the YouTube comments from a two-hour looped version of the first movement of the Moonlight Sonata and saved them in a JSON file. I then used Selenium, a Python browser-automation library, to write a script that uploads the comments from the JSON file into Lyrebird, which reads each comment out loud in my own AI-generated voice. I had previously trained Lyrebird to sound like me, which adds to the unsettling nature of the project. I based my Selenium code on the code Aaron Montoya-Moraga wrote for his automated-emails project.
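The pipeline described above can be sketched roughly as follows. This is a minimal sketch, not the original script: the JSON filename, the URL, and the CSS selectors are all placeholders, since Lyrebird's actual interface isn't reproduced here.

```python
# Sketch of the comment-reading pipeline: load scraped YouTube comments
# from a JSON file, then type each one into a web-based voice tool with
# Selenium. Selectors and URL below are hypothetical placeholders.
import json
import time


def load_comments(path):
    """Read the scraped comments (a JSON list of strings) from disk."""
    with open(path) as f:
        return json.load(f)


def read_aloud(comments, url="https://example.com/voice"):
    # Import lazily so the JSON-loading half works without Selenium installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    driver.get(url)
    for comment in comments:
        box = driver.find_element(By.CSS_SELECTOR, "textarea")  # placeholder selector
        box.clear()
        box.send_keys(comment)
        driver.find_element(By.CSS_SELECTOR, "button.play").click()  # placeholder selector
        time.sleep(len(comment) * 0.08)  # rough pause while the voice reads


if __name__ == "__main__":
    read_aloud(load_comments("comments.json"))
```

The same load-then-drive structure works for the Capital reviews piece below; only the JSON source and the target page change.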

The code for this project can be found on GitHub.

AI-generated voice that learned to sound like me reading every comment on 2-hours of Moonlight Sonata on YouTube explores themes of loneliness, banality, and anonymity on the web. The comments, read out loud, give voice to those who comment on the video. The resulting portrait ranges from the banal to the uncomfortable to the extremely personal.

The piece is meant to be performed live.

 

___

 

Capital Vol. 1 Reading

This is a separate but similar project that also uses Selenium.

For Capital, Volume 1, I had Lyrebird simply read its Amazon reviews one by one. I'm interested in exploring how online communities use products, art, or music as a jumping-off point for discussion and as forums for expressing their feelings and views. People often say things online that they cannot say anywhere else, and this offers an interesting way to examine how people view themselves and their environment.

The piece is also meant to be performed live.

Bold & Shy - Using VidPy and FFmpeg

This is a video I created using VidPy and FFmpeg.

 

 

VidPy is a Python video-editing library developed by Sam Lavigne.

Screens, Portals, Men, and Frodo

In this video piece I sampled footage of Steve Jobs, Star Trek: The Next Generation, The Lord of the Rings, and a nature video about summer that included poetry as text. All of the videos were downloaded from YouTube using youtube-dl, fragmented with FFmpeg, and assembled with jerky offsets using VidPy, a Python library developed by Sam Lavigne.

Themes explored: men as adventurers, technology as men's realm, legendary and real iconic figures and the grey area between them, and the male as the default gender in pop culture.
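The fragment-and-assemble step can be sketched like this, assuming VidPy's Clip and Composition API; the filenames and cut points are placeholders, not the ones used in the piece (VidPy also needs the `melt` binary, which ships with Shotcut, on the PATH).

```python
# Sketch of the VidPy assembly: cut a staggered chunk from each source
# and play the chunks back to back on a single track.
# Filenames and cut lengths are hypothetical placeholders.

def plan_cuts(sources, length=3):
    """Pair each source with a (start, end) cut of `length` seconds,
    staggering the start points to create the jerky offsets."""
    return [(src, i * length, i * length + length)
            for i, src in enumerate(sources)]


def render(cuts, outfile="collage.mp4"):
    # Imported lazily so the planning logic runs without vidpy/melt installed.
    from vidpy import Clip, Composition

    clips = []
    for src, start, end in cuts:
        clip = Clip(src)
        clip.cut(start=start, end=end)
        clips.append(clip)
    # singletrack=True plays the clips sequentially instead of layering them
    Composition(clips, singletrack=True).save(outfile)


if __name__ == "__main__":
    render(plan_cuts(["jobs.mp4", "holodeck.mp4", "lotr.mp4"]))
```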

 

Code for Video 1, A Species Goodbye:

 

Python:

The code is on my GitHub.

 

FFmpeg:

ffmpeg -ss 00:06:18 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 6 jobs_640_480_1.mp4

ffmpeg -ss 00:07:43 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 5 jobs_640_480_2.mp4

ffmpeg -ss 00:07:51 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 5 jobs_640_480_3.mp4

ffmpeg -ss 00:11:08 -i jobs_640_480.mp4 -c:v copy -c:a copy -t 3 jobs_640_480_4.mp4

//cutting out a portion from jobs interview that shows the screen

ffmpeg -ss 00:00:06 -i sheliak_636_480.mp4 -c:v copy -c:a copy -t 5 sheliak_636_480_1.mp4

I got an error in the Python file while running singletrack:

Katyas-MBP:screens crashplanuser$ python screens.py

objc[40458]: Class SDLTranslatorResponder is implemented in both /Applications/Shotcut.app/Contents/MacOS/lib/libSDL2-2.0.0.dylib (0x10870af98) and /Applications/Shotcut.app/Contents/MacOS/lib/libSDL-1.2.0.dylib (0x108b6c2d8). One of the two will be used. Which one is undefined.

objc[40459]: Class SDLTranslatorResponder is implemented in both /Applications/Shotcut.app/Contents/MacOS/lib/libSDL2-2.0.0.dylib (0x110617f98) and /Applications/Shotcut.app/Contents/MacOS/lib/libSDL-1.2.0.dylib (0x110a892d8). One of the two will be used. Which one is undefined.

//get alien

ffmpeg -ss 00:00:29 -i sheliak_636_480.mp4 -c:v copy -c:a copy -t 1.5 sheliak_636_480_p.mp4

//get riker

ffmpeg -ss 00:00:38 -i holodeck.mp4 -c:v copy -c:a copy -t 10 holodeck.mp4_1.mp4

//get lotr chunk

ffmpeg -ss 00:00:34 -i lotr.mp4 -c:v copy -c:a copy -t 3 lotr_late.mp4

//couldn’t get this to make a sound
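The trims above differ only in input file, seek time, and duration, so they can be generated programmatically. This small helper is hypothetical, not part of the original workflow; it just rebuilds the same stream-copy commands.

```python
# Build ffmpeg stream-copy trim commands like the ones above:
# -ss seeks to the start time, -t limits the duration, and
# -c:v copy -c:a copy avoids re-encoding.
def trim_cmd(infile, start, duration, outfile):
    """Return an ffmpeg argv list that copies `duration` seconds of
    `infile` starting at timestamp `start`."""
    return ["ffmpeg", "-ss", str(start), "-i", infile,
            "-c:v", "copy", "-c:a", "copy",
            "-t", str(duration), outfile]

# e.g. the first jobs cut, runnable with subprocess.run(trim_cmd(...)):
# trim_cmd("jobs_640_480.mp4", "00:06:18", 6, "jobs_640_480_1.mp4")
```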