3D visualization is a powerful, newly accessible technology that can captivate audiences with an immersive, hands-on experience. The Augmented Reality Sandbox combines a Microsoft Kinect 3D camera, a digital projector, and open-source software to accurately model terrain elevation and fluid physics, transforming a plain kids' sandbox into a dynamic, surreal landscape responsive to touch and hand gestures. Vibrantly colored mountains, canyons, islands, flowing rivers, lakes, and ocean waves appear nearly instantly at your fingertips, and you can summon rain or drought, locally or across the entire sandbox, with the push of a button. It can also realistically simulate natural disasters such as coastal sea-level rise, flooding from levee or dam breaks, and even volcanoes and lava flows.
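The core idea is simple: the Kinect measures the distance to the sand surface, the software converts that depth into elevation, and a color ramp is projected back onto the sand. A minimal sketch of that elevation-to-color step is below; the thresholds, colors, and depth range are illustrative assumptions, not the actual values used by the sandbox software.

```python
import numpy as np

# Illustrative hypsometric color ramp: normalized elevation stops (0 = lowest
# sand, 1 = highest) paired with RGB tints. These values are hypothetical.
ELEVATION_STOPS = [0.0, 0.3, 0.5, 0.8, 1.0]
COLORS = np.array([
    [0,     0, 128],   # deep water: dark blue
    [64,  160, 255],   # shallow water: light blue
    [34,  139,  34],   # lowlands: green
    [139,  90,  43],   # mountains: brown
    [255, 255, 255],   # peaks: white
], dtype=float)

def elevation_to_rgb(depth_mm, near_mm=800.0, far_mm=1200.0):
    """Convert a grid of depth-camera readings (mm) to an RGB image.

    Smaller depth means the surface is closer to the overhead camera,
    i.e. higher sand, so elevation is the inverted, normalized depth.
    Colors are linearly interpolated between the stops above.
    """
    elev = np.clip((far_mm - depth_mm) / (far_mm - near_mm), 0.0, 1.0)
    rgb = np.empty(elev.shape + (3,))
    for channel in range(3):
        rgb[..., channel] = np.interp(elev, ELEVATION_STOPS, COLORS[:, channel])
    return rgb.astype(np.uint8)

# Example: a tiny 2x2 depth frame, from a tall sand pile (800 mm from the
# camera) down to the bare sandbox floor (1200 mm).
frame = np.array([[800.0, 1000.0],
                  [1100.0, 1200.0]])
img = elevation_to_rgb(frame)
```

In the real system this mapping runs per-frame over the full depth image, so as visitors pile up or dig away sand, the projected colors update almost instantly.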
I first experienced this inspiring device at the Fall 2014 American Geophysical Union meeting in San Francisco, the largest international conference for Earth and space scientists, and knew we must have one at UCLA! Thanks to a mini-grant from the UCLA Office for Instructional Development, our sandbox was completed in spring 2015 by Gary Glesener of the Modeling and Educational Demonstration Lab-Curriculum Materials Center (MEDL-CMC) and his team of three undergraduates in the UCLA Department of Earth, Planetary, and Space Sciences. It is based on technology developed by UC Davis researchers with NSF funding for informal science education.
Though it was intended as a teaching tool to augment classroom lectures on topographic mapping and basic geological and hydrological concepts, it has immense appeal for science outreach to general audiences young and old; thousands of people have seen it in person at UCLA and many more on YouTube and social media. It is even currently showcased at the Cantor Fine Art's #PleaseTouchtheArt as an interactive tactile exhibit. It's amazing to see kids jump right in and learn to interpret the land and water features intuitively, without even realizing it. Even with visually impaired and blind audiences, when we explained how the colors and water reacted as they sculpted the sand, their faces lit up and their imaginations took off! It's truly a transformative experience for both the audience and us, the presenters, giving people the power to sculpt and interact with their own worlds.
Since we study planetary science, our eventual goal is to expand the software to model exotic environments, transporting people to distant moons and planets with vastly different gravity and surface features. Imagine sculpting methane lakes on Saturn’s moon Titan or even ice volcanoes on Pluto!
Emmanuel V. Masongsong is a Project Specialist in UCLA's Department of Earth, Planetary, and Space Sciences with the Experimental Space Physics research group, which studies the solar wind's interaction with Earth's magnetic field, known as space weather. Besides conducting outreach activities on electromagnetism, the aurora, and related geospace sciences, he is currently a staff mentor with the ELFIN CubeSat project, teaching students how to machine the metal parts that will make up UCLA's first home-built satellite, slated to launch in 2017.