ITP Thesis Presentation. May 10th, 2018.

About

Customizing Reality: Immersive Painting explores speculative environments in AR through the lens of digital, immersive painting. I used software like Quill and Tilt Brush to prototype possible future scenarios in AR/MR that answer the question:

How would you customize your reality? In a world where mixed reality eyewear is integrated into our lives, how would you augment the space around you and why would you want to?  

For my ITP thesis research, I dedicated the first half of the project to developing a workflow that let me rapidly prototype AR environments, both as handheld applications and as fully immersive VR environments. In the second stage, I narrowed my focus to three categories of customization: environment, people, and narrative.

Process

Quill's animation tool was at the center of my research, and much of the actual work was learning how to animate paintings with it. The most tedious method is frame-by-frame animation: I start with a baseline painting, duplicate that frame, and make a small adjustment to the copy. Slowly, sometimes over the course of hundreds of frames, these small changes add up to simple movements.

A more powerful and forgiving method is to loop a short sequence of frames and slowly add to it, building up more complicated compositions. I usually start with a base layer of about 10 to 20 frames; every new brushstroke is recorded and played back within the same loop.

I also relied on Quill's other animation features, like the Nudge tool, which let me push, pull, and distort brushstrokes in any direction. For this sunflower, I duplicated the first frame about 20 times and nudged each copy slightly; when the sequence loops, the Nudge tool's effect plays back as part of the animation.

Workflow

To bring these animations and other virtual painting assets into mixed reality, I developed a workflow built around a game engine (Unity or Unreal) and Apple's ARKit. With this pipeline I built mobile applications that overlay the paintings onto the real world.
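
At its core, the placement step is simple: detect a surface, then anchor a painted animation to it. Below is a minimal sketch of that interaction written against Unity's AR Foundation API purely for illustration; the component and prefab names are placeholders, not code from my actual builds.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: a tap on the phone screen places a painted-animation
// prefab on whatever surface ARKit has detected at that point.
public class PaintingPlacer : MonoBehaviour
{
    [SerializeField] GameObject paintingPrefab;       // exported Quill asset (assumed prefab)
    [SerializeField] ARRaycastManager raycastManager;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast the touch point against detected planes and drop the painting
        // at the first hit, aligned to the surface.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            Instantiate(paintingPrefab, pose.position, pose.rotation);
        }
    }
}
```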

Environment

I chose a well-known location as my example and reimagined Washington Square Park as an impressionist painting, inspired by Van Gogh and Seurat. Continuing my methodology of painting over photographs, I started from a 360 image of the park and slowly reinterpreted the sky, every tree, and every leaf in that style.

I then created a VR scene where users can toggle the background photo on and off, experiencing the augmented park immersively, much as they would in a future with ubiquitous mixed reality eyewear.
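
The toggle itself is trivial in Unity. Here is a minimal sketch, assuming the 360 photo is mapped onto an inverted sphere around the viewer; the object and key binding are placeholders rather than my exact setup.

```csharp
using UnityEngine;

// Hypothetical sketch: switches the 360 photo sphere on and off so the viewer
// can compare the real park with the fully painted version surrounding it.
public class BackgroundToggle : MonoBehaviour
{
    [SerializeField] GameObject photoSphere;   // inverted sphere textured with the 360 photo (assumed setup)

    void Update()
    {
        // In a VR build this would map to a controller button; a key stands in for it here.
        if (Input.GetKeyDown(KeyCode.Space))
            photoSphere.SetActive(!photoSphere.activeSelf);
    }
}
```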

People

Inspired by recent runway collections, I created personal animations intended to track and overlay onto a person as a speculative digital accessory and a future form of self-expression. Using several designers' pieces as reference, I reinterpreted each one on a 3D model of a standing figure.

To illustrate these animations in use, I rigged them to 3D characters with walking animations and placed them in the VR scene of Washington Square Park, where they wander through the park in a never-ending loop.
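
The endless walk is nothing more than moving each character between waypoints and wrapping back to the start. A rough sketch of that behaviour (the waypoint setup and speed are assumptions):

```csharp
using UnityEngine;

// Hypothetical sketch: moves a rigged character along a loop of waypoints
// so it walks through the park indefinitely.
public class LoopingWalker : MonoBehaviour
{
    [SerializeField] Transform[] waypoints;   // path through the park, assumed to be laid out in the editor
    [SerializeField] float speed = 1.2f;      // metres per second

    int current;

    void Update()
    {
        if (waypoints.Length == 0) return;

        Transform target = waypoints[current];
        transform.position = Vector3.MoveTowards(
            transform.position, target.position, speed * Time.deltaTime);
        transform.LookAt(target);

        // Close enough? Move on to the next waypoint, wrapping back to the start.
        if (Vector3.Distance(transform.position, target.position) < 0.05f)
            current = (current + 1) % waypoints.Length;
    }
}
```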

Narrative

I built a database of illustrative paintings, each depicting a specific action, character, environment, or other story element that can be combined with the rest to form a narrative. Quill's animation exports (Alembic files) were not compatible with iOS devices at the time of this project, so as a workaround I recorded each animation and converted it to a transparent video file in After Effects.
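
Each of those videos can then be played back on a quad and treated like a flat painting card in the scene. Below is a rough sketch using Unity's VideoPlayer; whether the alpha channel actually survives depends on the codec and import settings, so treat the transparency handling here as an assumption rather than a recipe.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: plays a looping animation clip (exported from After Effects)
// on a quad so it can be dropped into the mixed reality scene as a painting card.
[RequireComponent(typeof(VideoPlayer), typeof(MeshRenderer))]
public class PaintingCard : MonoBehaviour
{
    [SerializeField] VideoClip clip;   // one painting from the narrative database (assumed asset)

    void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.clip = clip;
        player.isLooping = true;

        // Render the video into the quad's material so it appears in the scene.
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<MeshRenderer>();
        player.targetMaterialProperty = "_MainTex";
        player.Play();
    }
}
```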

In a mobile application, users can add these paintings to a mixed reality scene and rearrange them to create their own narrative. I plan to continue developing this application into a tool that makes mixed reality more accessible for creators to engage with the medium.
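
Rearranging comes down to letting the user drag a placed painting across the scene. A minimal sketch of that drag interaction, assuming each painting card has a collider (the component name and touch handling are illustrative, not final):

```csharp
using UnityEngine;

// Hypothetical sketch: lets the user drag an already-placed painting card by
// touching it and sliding a finger. Movement is constrained to a horizontal
// plane at the card's current height so it stays roughly where it was placed.
// Assumes the card has a Collider so it can be hit by a raycast.
public class DraggablePainting : MonoBehaviour
{
    Camera cam;
    bool dragging;

    void Start() => cam = Camera.main;

    void Update()
    {
        if (Input.touchCount == 0) { dragging = false; return; }

        Touch touch = Input.GetTouch(0);
        Ray ray = cam.ScreenPointToRay(touch.position);

        // Start dragging if the first touch lands on this painting.
        if (touch.phase == TouchPhase.Began &&
            Physics.Raycast(ray, out RaycastHit hit) && hit.transform == transform)
        {
            dragging = true;
        }

        if (dragging && touch.phase == TouchPhase.Moved)
        {
            // Project the touch onto a horizontal plane at the painting's height.
            Plane ground = new Plane(Vector3.up, transform.position);
            if (ground.Raycast(ray, out float distance))
                transform.position = ray.GetPoint(distance);
        }
    }
}
```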