Driven by the urgency I felt for a BBC Planet Earth-quality film about this treasured but little-known mountain range and river network, I embarked on the filming of KLAMATH in 2012.

I performed the cinematography, drone work, video editing, script writing, voice narration, music selection & licensing, website design, and print graphics.

The Klamath-Siskiyou Mountains – known in scientific communities as the "Galapagos of North America" – are a center of rare biological diversity that evolved to survive unusual climatic and geological contrasts.

Setting out to produce a documentary requires a certain unawareness of the innumerable pieces and sub-pieces that compose a film. I had expected the project to take six months; four years later, I was ready to publish KLAMATH. Because of impending threats to important wildlife migration corridors within the region, I released the complete documentary online, free for public viewing.

The complete documentary is linked below:

GPU Computation and Rendering

Imag(in)ing reality, with lasers

Touch and drag on the visualization below to interact. On a mobile device, tilt to look up and down. Click the left and right arrows to transition between scenes.

Stout Grove Redwoods
Jedediah Smith State Park, California
LiDAR scan © Carlos G. Velazquez, Aaron Moffatt
Martin Luther King Jr. Memorial
Washington, D.C.
LiDAR scan by Carlos G. Velazquez, EpicScan
Oregon Caves National Monument, Oregon
LiDAR scan by Carlos G. Velazquez, EpicScan
Coit Tower
San Francisco, California
LiDAR scan by Carlos G. Velazquez, EpicScan

These scenes began as hundreds of millions of points – 3-D pixels – captured by a LiDAR scanner, a specialized camera that uses lasers to see depth as well as color. As founder of AiR VR Studio, I developed a framework for the Unity3D game engine to render, edit, and interact with these point clouds in VR.

Based on that code, I wrote this lightweight JavaScript/WebGL engine to share point clouds on the web – a glimpse of what walking through a point cloud in VR might feel like, without having to give you a VR headset.

The magic of this engine is the interactivity it supports between user and point world. A custom physics engine runs on every point – even on smartphones. I reworked the engine computations to fit within the constraints of iOS and Android WebGL platforms. Since WebGL doesn't support compute shaders, the rasterizer pipeline has to be hacked and slightly abused to run the physics computations as well as render the point data. In addition, the state vectors have to be encoded across multiple image buffers; on Android, given the render texture formats it supports, this was particularly painful.
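To give a flavor of the workaround, here is a minimal JavaScript sketch (illustrative only, not the engine's actual code) of the classic packing trick: when only 8-bit RGBA render targets are available, a fragment shader must spread each normalized state value across a pixel's four color channels, so a full position vector ends up spanning several pixels or buffers.

```javascript
// Pack a value in [0, 1) into four 8-bit channels, mimicking what a
// fragment shader must do when float render targets are unavailable.
function packUnitFloatToRGBA(v) {
  const bytes = [];
  let r = v;
  for (let i = 0; i < 4; i++) {
    r *= 256;
    const b = Math.floor(r);
    bytes.push(Math.min(b, 255)); // one 8-bit channel
    r -= b;                       // carry the remainder to the next channel
  }
  return bytes; // [R, G, B, A]
}

// Reassemble the float from the four channels on the next pass.
function unpackRGBAToUnitFloat([r, g, b, a]) {
  return r / 256 + g / 256 ** 2 + b / 256 ** 3 + a / 256 ** 4;
}
```

The round trip loses at most 1/256⁴ of precision per component, which is why a single point's position, velocity, and color state must be striped across multiple such textures, ping-ponged between passes.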

In the VR version, you can fly through these spaces, gaining perspectives from treetops or beneath the ground. You can touch pixels with your hands, changing their shape and size – painting, bending, and twisting the virtual world around you.


The challenge of transporting viewers to places as unreachable as the Klamath-Siskiyou led me to immersive media, and into a new form of journalism.

These are clips I posted to social media on November 20th, 2016, within hours of arriving at Standing Rock, ND, on my second trip there. That night saw one of the most violent encounters between the #WaterProtectors and law enforcement: well over 300 individuals were treated for hypothermia, having been drenched by the continually running water cannon as temperatures dropped to 25°F. Tear gas, rubber bullets, and concussion grenades were employed throughout the encounter as well.

In this 360 video, click and drag on the display to look around
(on mobile devices you may have to open this video in the YouTube app):

I operated a documentary video camera, as well as a portable 360-degree VR camera, on behalf of Shifting POV and YES! Magazine. Working with Jeris Huntington of Shifting POV, we received sponsorship for our VR work from Oculus Launch Pad and the Google Jump Program.

As a journalistic tool, 360 video has the distinct advantage over traditional media of seeing in all directions: in volatile conflict situations, a traditional camera is rarely pointing in the right direction at the instant an important event occurs.
The all-seeing nature of the 360 camera allowed us to return to the footage to study events as they unfolded – even those we did not ourselves see, because they occurred behind us.


My other great love is music. My violin studies commenced at age 11, inspired by a film of Itzhak Perlman. The picture and sound on my VHS tape degraded through repeated use, but the tape still carried the magic of that original performance, and I discovered new nuances each time I watched it. Those pictures were a window into another world, one which inspired my pursuit of music.

At age 16, I began a series of annual violin concerts, each about 90 minutes long and performed from memory. The first concert – featuring ‘Winter’ from Vivaldi’s Four Seasons, the Mendelssohn Violin Concerto in E minor, and the Brahms Violin Concerto in D major – was the same program Perlman had played on that VHS. Over the next four years, I played a number of the great violin concertos, including works by Dvorak, Mozart, Paganini, Khachaturian, Vaughan Williams, and Ravel.

Khachaturian Violin Concerto, Mov. 3
(recording 2007)
Serenade in D for Flute, Violin, and Viola by Beethoven
(recording 2012)
"Summer" from The Four Seasons by Vivaldi
Performed on baroque instruments and gut strings (recording 2014)

Interactive Touch-less Displays

For large LED displays and projection, I developed an interactivity module that draws input from a Kinect depth sensor, allowing users to “touch” the 3-D world within the display without actually touching the screen.
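The core of that mapping is standard pinhole-camera math: each depth pixel is unprojected into a 3-D point in the sensor's camera space, which the display's virtual world can then react to. A rough sketch (the intrinsic values below are illustrative placeholders, not calibration data from this project):

```javascript
// Illustrative pinhole intrinsics for a 640x480 depth sensor; real
// values come from per-device calibration.
const FX = 525, FY = 525;       // focal lengths in pixels (assumed)
const CX = 319.5, CY = 239.5;   // principal point (image center)

// Unproject a depth pixel (u, v, depth in meters) into a 3-D point
// in camera space, so a hand can "touch" virtual geometry.
function depthPixelToCameraSpace(u, v, depthMeters) {
  return {
    x: ((u - CX) * depthMeters) / FX,
    y: ((v - CY) * depthMeters) / FY,
    z: depthMeters,
  };
}
```

A point at the image center unprojects straight down the optical axis; pixels toward the edges fan outward in proportion to their depth.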

The video below shows people interacting with these displays for the first time. Both of the visuals pictured were created with the AiRStudio Unity3D plugin I developed.

Sound Visualization | AiR Listen

I first developed AiR Listen as a composition tool for myself (I wanted to see a violin’s overtone series in relation to the chords and melodies of a song), and later released it to the iOS App Store based on interest from others. Most frequency-domain graphs lay out the amplitudes of the audio spectrum along a straight line, but our brains don't comprehend sound that way. This novel helical visualization maps more naturally onto our own experience of sound, portraying note relationships across octaves.

In AiR Listen, each octave is represented by one loop of the helix, and as a result, the tone spectrum is visually organized by note letter: A, B, C, D, E, F, G (plus semitones). All tones within the same note letter (or "pitch class") become vertically stacked.

For example, if you look straight down on the helix (seeing a circle) while multiple octaves of the note "A" are being played, those A's will all visually line up with each other, appearing as one. When you look at the helical model from the side however, the octave of each tone is revealed. This visualization works well on a phone, but it becomes even more intuitive viewed in 3-dimensional space via virtual reality headset or hologram.
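The geometry behind this can be sketched in a few lines (hypothetical function names, not the app's source): one full turn of the helix per octave, so the angle around the axis encodes pitch class and the height encodes octave.

```javascript
const REF_FREQ = 27.5; // A0 in Hz; any A serves as the angular reference

// Map a frequency to a point on the pitch helix: one turn per octave,
// so every octave of a note lands at the same angle.
function frequencyToHelix(freqHz, radius = 1) {
  const octaves = Math.log2(freqHz / REF_FREQ);        // height along the axis
  const turn = octaves - Math.floor(octaves);          // fractional octave = pitch class
  const angle = 2 * Math.PI * turn;
  return {
    x: radius * Math.cos(angle),
    y: octaves,                                        // octave sets the height
    z: radius * Math.sin(angle),
  };
}
```

Viewed from directly above, A4 (440 Hz) and A5 (880 Hz) project onto the same spot; viewed from the side, they sit exactly one loop apart.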

With time, each of the glowing particles created by a tone drifts outward. As the app listens to the sounds around you, it builds a 3-D record of what you are hearing and have heard. The circular, spiral resemblance to a traditional music record is only a coincidence – the two 'records' are of very different formats – but the similarity is fascinating nonetheless.

Video screen capture of app while listening to music:

From a technical perspective, AiR Listen leverages the same high-performance GPU point engine I developed for LiDAR point cloud rendering, but with an entirely different data structure and behavior set. This is part of the power of the AiRStudio Unity plugin: the base point system can be extended for widely different applications while maintaining very high performance.
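The actual plugin is written in C# for Unity; this hypothetical JavaScript sketch only illustrates the extension pattern described above – a shared point engine whose per-point behavior is supplied by each application:

```javascript
// Hypothetical sketch of the pattern, not the AiRStudio API: a base
// engine iterates all points; subclasses define per-point behavior.
class PointSystem {
  constructor(points) { this.points = points; }   // [{x, y, z}, ...]
  updatePoint(point, dt) {}                       // override per application
  step(dt) { for (const p of this.points) this.updatePoint(p, dt); }
}

// LiDAR scenes: points are static; rendering alone does the work.
class LidarPointSystem extends PointSystem {}

// AiR Listen-style behavior: tone particles drift outward over time.
class ListenPointSystem extends PointSystem {
  updatePoint(p, dt) {
    p.x += p.x * dt * 0.1;  // radial drift away from the helix axis
    p.z += p.z * dt * 0.1;
  }
}
```

The same rendering and memory layout serve both subclasses; only the behavior hook differs, which is the kind of reuse the plugin's design aims for.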

Visit on iOS App Store


“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”

Through technology, music, and film, my work revolves around sharing glimpses of our world from perspectives that we couldn't reach before.

My motivation to document and better understand the natural world is captured in Mark Weiser's words above: the biological mechanisms that keep us alive have developed over 3.8 billion years of evolution. They are woven into the fabric of everything about and around us, yet they are invisible to the naked eye.

Please reach me by email at a@airxr.studio

© 2018 Aaron Moffatt