- Role: VFX and Technical Artist;
- Tools: Snap Lens Studio, Amplify Shader Editor / Unity Shader Graph, Google Depth API;
- Duration: 2 Weeks
- How can visual effects and motion graphics create a more cohesive feedback loop between the physical and digital worlds? The goal is to use real-time depth information from the AR camera to establish a sense of presence for the audience.
- How can we use this as a template for future 3D installation art pieces? Applying AR projection lighting to real-time depth data gives artists and creators a reusable foundation for future projection lighting installations.
My inspiration comes from projection light installations such as teamLab's work and the immersive Van Gogh exhibitions.
I created a series of fluid-like procedural 3D textures in Snap's Material Editor, overlapping several noise textures with different alpha offsets to produce a natural, organic result. These textures are inspired by similar 3D procedural textures built in Unity's Amplify Shader Editor.
The texture node workflow draws heavily on tools such as Substance Designer and the broader Substance material ecosystem, which share a similar procedural texturing approach.
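The layering idea behind these materials can be sketched outside of any node editor. The following is a minimal Python sketch (not Lens Studio or Amplify code) of overlapping noise octaves, each scrolled with its own time/phase offset; the specific frequencies, weights, and offsets are illustrative assumptions, not values from the project.

```python
import math

def value_noise(x: float, y: float, seed: int = 0) -> float:
    """Hash-based value noise in [0, 1], bilinearly interpolated between lattice points."""
    def hash01(ix: int, iy: int) -> float:
        h = (ix * 374761393 + iy * 668265263 + seed * 144665) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF

    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    # Smoothstep fade so the interpolation has continuous derivatives
    fx, fy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    top = hash01(x0, y0) * (1 - fx) + hash01(x0 + 1, y0) * fx
    bot = hash01(x0, y0 + 1) * (1 - fx) + hash01(x0 + 1, y0 + 1) * fx
    return top * (1 - fy) + bot * fy

def fluid_texture(u: float, v: float, t: float) -> float:
    """Overlap three noise octaves, each animated with a different phase offset."""
    layers = [
        (4.0, 0.5, 0.00),   # (frequency, weight, phase offset) -- illustrative values
        (8.0, 0.3, 0.33),
        (16.0, 0.2, 0.66),
    ]
    total = 0.0
    for freq, weight, phase in layers:
        total += weight * value_noise(u * freq + t + phase, v * freq - t * 0.5 + phase)
    return total  # stays in [0, 1] because the weights sum to 1
```

Scrolling each octave at a different speed is what produces the fluid, non-repeating motion; in a node-based editor the same structure appears as several noise nodes blended with offset time inputs.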
Next, we retrieve the depth information needed to drive the AR projection lighting:
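Whatever the platform-specific depth API looks like, the core step is standard pinhole unprojection: turning a pixel coordinate plus its depth sample into a camera-space 3D point the lighting can be projected onto. This Python sketch assumes known camera intrinsics; the parameter names (`fx`, `fy`, `cx`, `cy`) are conventional, not taken from the project.

```python
def unproject(px: float, py: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Convert a pixel (px, py) with depth depth_m (meters) into a
    camera-space 3D point, using pinhole intrinsics:
    focal lengths fx, fy and principal point cx, cy (in pixels)."""
    x = (px - cx) / fx * depth_m
    y = (py - cy) / fy * depth_m
    return (x, y, depth_m)
```

A pixel at the principal point unprojects straight down the optical axis, e.g. `unproject(320, 240, 2.0, 600, 600, 320, 240)` gives `(0.0, 0.0, 2.0)`.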
Artists and designers can use the inspector on the right to change the noise intensity, color, and alpha animation offsets, making the AR projection lighting easy to customize.
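Conceptually, those inspector parameters combine per pixel roughly like the sketch below. The function and parameter names are hypothetical stand-ins for the material's exposed inputs, not the actual Lens Studio bindings.

```python
def shade(noise_value: float, intensity: float,
          color: tuple, alpha_offset: float) -> tuple:
    """Tint a scalar noise sample by the artist-chosen color, scale it by
    intensity, and fade it by an alpha offset. Returns premultiplied RGBA."""
    # Clamp so the inspector values can never push alpha outside [0, 1]
    a = max(0.0, min(1.0, noise_value * intensity - alpha_offset))
    r, g, b = color
    return (r * a, g * a, b * a, a)
```

Raising `alpha_offset` thins the projected pattern out, while `intensity` brightens it; because alpha is clamped, extreme inspector values degrade gracefully instead of blowing out the blend.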
The final outcome is an AR projection lighting piece: motion graphics projected across real-world surfaces using the real-time depth information the AR camera reads.