Wireshark Capture


I took a deeper look into my network traffic using the network analysis tools Herbivore and Wireshark. At first glance a Wireshark capture looks a little intimidating and unfamiliar, so my first task was exporting the capture data as a CSV file and examining some of the variables in JavaScript. That didn't tell me much, except that the majority of my packets were using TCP. Herbivore gave a more user-friendly view of the routes my internet activity takes. Looking into unfamiliar domains that popped up in my normal internet use, I found a few examples of cookies, a mobile Safari ad-tracking service called Kochava, and, most interestingly, a URL that pointed to a webpage with a single haiku. (Looking up this poem returns a website, Addthis.com, that wants you to download some files to help with your "security"...)
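The CSV poking-around can be boiled down to a tiny script. This is a sketch of the idea, not my exact code: it assumes Wireshark's default CSV export (which includes a "Protocol" column) and naively splits on commas, which holds as long as the columns before Protocol contain none.

```javascript
// Tally protocols from a Wireshark CSV export
// ("File > Export Packet Dissections > As CSV").
// Assumes no commas appear in columns before "Protocol".
function tallyProtocols(csv) {
  const lines = csv.trim().split("\n");
  const header = lines[0].split(",").map(h => h.replace(/"/g, ""));
  const protoIndex = header.indexOf("Protocol");
  const counts = {};
  for (const line of lines.slice(1)) {
    const proto = line.split(",")[protoIndex].replace(/"/g, "");
    counts[proto] = (counts[proto] || 0) + 1;
  }
  return counts;
}
```

Logging the result of `tallyProtocols` on my export is what showed TCP dominating the capture.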




Gradients + Fog


For my first experiment with Unreal Engine, I aimed to create a simple immersive experience based on the works of time/space artists like James Turrell and Robert Irwin, and to get a handle on working with sound in a game engine. On a reflective surface, I positioned different colored lights to mimic a CMYK color wheel and added steam particle systems to give the light more body.


Caustic Waves

link for mobile

(best on mobile with headphones)


Going a little deeper into 360 video, I wanted to make an immersive dreamscape based on an experience that has stuck with me over the years. To get the effects I needed for the transitions, I combined a few After Effects compositions made with found footage and Ricoh Theta videos I shot. I edited these scenes together in Premiere, with a score made in Reaper. (If you're interested: in my dream, the glowing light at the end represented all of time and space condensed into a single, floating bar of light.)


(if you prefer YouTube).

Voyager 360 Spatial Audio with Video

link for mobile

(best on mobile with headphones)


To become more familiar with Facebook's Spatial Workstation for 360 audio, I chose to make a sound piece on the Voyager mission (with many opportunities for panning). As I researched the mission, I came across a RadioLab interview with Ann Druyan, Carl Sagan's wife, and a hidden love story slowly emerged. Ann was directing the content for the Golden Record, which represents our world and culture, when she and Carl got engaged. Days afterward, her vitals were recorded while she thought about Carl, and those recordings were included on the record aboard the Voyager spacecraft. The sound clips I used in editing this piece were all sampled from that same Golden Record. For the video, I recorded 360 footage with a Theta from inside a globe/bar I found on the sidewalk, then edited it in After Effects to create the illusion of looking at the stars from inside the globe (minus some masking/keying issues).


Network Hops



live map

As an exercise to better understand the traceroute terminal command, I used Mapbox and an IP lookup API to map the locations of the routers and servers behind my frequently visited websites. Unfortunately Facebook is among them (marked with the death icon), as well as Gmail (rocket ship icon) and my personal website (marked with a sleeping icon, since I need sleep). In the JavaScript console, I log the complete JSON returned by the IP API, including the company name, the server's city, and the hop order.
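The first step of the pipeline is pulling each hop's IP out of traceroute's output so it can be handed to the lookup API. A minimal sketch of that parsing, assuming the macOS/Linux traceroute line format (`hop  hostname (ip)  time ms`); timed-out hops (`* * *`) are skipped:

```javascript
// Extract { hop, ip } pairs from raw traceroute output.
// Lines without a parenthesized IPv4 address (e.g. "* * *") are ignored.
function parseHops(tracerouteOutput) {
  const hops = [];
  for (const line of tracerouteOutput.split("\n")) {
    const match = line.match(/^\s*(\d+)\s+\S+\s+\((\d{1,3}(?:\.\d{1,3}){3})\)/);
    if (match) hops.push({ hop: Number(match[1]), ip: match[2] });
  }
  return hops;
}
```

Each extracted IP then goes to the geolocation API, and the returned latitude/longitude pair becomes a Mapbox marker.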


Regex Image


As a supplement to my experiment The Last Question, I set out to generate an image out of regex characters and symbols. I kept the short story generation from the first iteration and added a function that *attempts* to redraw an uploaded file, selecting a character for each pixel based on its value.
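The pixel-to-character step amounts to mapping brightness onto a ramp of glyphs. A sketch of that mapping, where the particular regex-flavored character ramp is my own stand-in for the one the piece uses, ordered dense to sparse:

```javascript
// A ramp of regex-ish symbols, from visually dense (dark pixels)
// to sparse (light pixels). This particular ordering is illustrative.
const RAMP = "@#$&%?*+^|/\\~=-:,. ";

// Pick a character for one pixel from its RGB components.
function charForPixel(r, g, b) {
  const brightness = (r + g + b) / 3;              // 0 (black) .. 255 (white)
  const i = Math.floor((brightness / 256) * RAMP.length);
  return RAMP[Math.min(i, RAMP.length - 1)];
}
```

Run over every pixel (or every nth pixel, to keep the output legible), this redraws the image in text.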


Voyager AR with BlippAR


In this straightforward exercise with BlippAR, I used the stereo version of a 360 sound piece I had recently created in the style of a RadioLab episode. While researching the Voyager mission, I stumbled upon the love story between Carl Sagan and Ann Druyan and decided that would be a more interesting focus. After some testing with BlippAR, I chose an image of the Golden Record (containing recordings of what humans are all about) as the marker for a 3D model of Voyager and the sound piece.


Arcade Joystick Controller


With the requirement to create a controller for Left, Right, Up, Down, and Quit controls, I set up a SparkFun joystick with an Arduino and NeoPixels that highlight the selected direction. The housing was a departure from the typical "ITP box" (a bamboo box from the Container Store with a laser-etched acrylic lid); instead, I aimed to reuse materials from the junk shelf.


The Last Question


live website

This website was an experiment in accessing DOM elements in p5 and generating text with Rita.js. Once an image is uploaded, each pixel's RGB value determines the generated text, sourced from Isaac Asimov's short story "The Last Question". I mapped the combined RGB value of each pixel to the order of words as they appear in the story: the darker the pixel, the further into the text the word lives.
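The mapping described above can be sketched as a small function. This is an illustration of the scheme, not the site's actual code; `storyWords` stands in for the tokenized text of the story.

```javascript
// Map one pixel's combined RGB value (0..765) to a word in the story:
// darker pixels index further into the text, as the piece describes.
function wordForPixel(r, g, b, storyWords) {
  const combined = r + g + b;                      // 0 (black) .. 765 (white)
  const t = 1 - combined / 765;                    // darker -> larger t
  const i = Math.min(Math.floor(t * storyWords.length), storyWords.length - 1);
  return storyWords[i];
}
```

A pure white pixel returns the story's first word; a pure black pixel returns its last.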


Submarine Sound Piece


To get started in 360 sound design, we first took a look at Reaper and used it to construct a narrative with audio only. I wanted to play around with creating different spaces with different background ambiences, and a submarine seemed like the perfect setting: I could travel from a very small metallic space to an underwater scene. The story itself is a bit morbid, but thanks to a Paulstretched Brian Wilson, the ending is a bit happier.
