Augmented Reality Sandtable


The Augmented Reality Sandtable (ARES) is an interactive, digital sand table that uses augmented reality technology to create a 3D battlespace map. It was developed by the Human Research and Engineering Directorate (HRED) at the Army Research Laboratory (ARL) to combine the positive aspects of traditional military sand tables with the latest digital technologies, better supporting soldier training and offering new possibilities for learning. It uses a projector to display a topographical map onto the sand in an ordinary sandbox, along with a motion sensor that tracks changes in the layout of the sand and adjusts the computer-generated terrain display accordingly.
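The published descriptions do not include the ARES source code, but the sensing-to-projection loop described above can be illustrated with a short, hypothetical sketch. The Python example below converts a simulated downward-facing depth frame into sand elevations and quantizes them into contour bands that a projector could color; the sensor height, frame size, and contour interval are illustrative assumptions, not ARES parameters.

```python
# Minimal sketch (not ARL's actual software) of the core sand-table loop:
# turn a depth-camera frame into an elevation map and assign contour bands
# that a projector could render over the sand.
import numpy as np

SENSOR_HEIGHT_MM = 1200      # assumed distance from sensor to the sandbox floor
CONTOUR_INTERVAL_MM = 20     # assumed elevation step between contour bands

def depth_to_elevation(depth_frame_mm):
    """Convert raw depth readings (distance to the sensor) into sand height."""
    elevation = SENSOR_HEIGHT_MM - depth_frame_mm
    return np.clip(elevation, 0, None)

def elevation_to_bands(elevation_mm):
    """Quantize elevation into integer contour bands for color mapping."""
    return (elevation_mm // CONTOUR_INTERVAL_MM).astype(np.int32)

if __name__ == "__main__":
    # Synthetic 480x640 depth frame containing a single "hill" of sand.
    yy, xx = np.mgrid[0:480, 0:640]
    hill = 150 * np.exp(-(((xx - 320) ** 2 + (yy - 240) ** 2) / (2 * 80 ** 2)))
    depth_frame = SENSOR_HEIGHT_MM - hill          # the hill reads as closer to the sensor
    bands = elevation_to_bands(depth_to_elevation(depth_frame))
    print("distinct contour bands:", np.unique(bands).size)
```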
An ARL study conducted in 2017 with 52 active-duty military personnel found that participants who used ARES spent less time setting up the table than participants who used a traditional sand table. ARES also produced a lower perceived-workload score, as measured by the NASA Task Load Index, than the traditional sand table. However, there was no significant difference between the two groups in post-test knowledge scores for recreating the visual map.

Development

The ARES project was one of 25 ARL initiatives in development from 1995 to 2015 that focused on visualizing spatial data on virtual or sand table interfaces. It was developed by HRED's Simulation and Training Technology Center with Charles Amburn as the principal investigator. Collaborators on ARES included Dignitas Technologies, Design Interactive, the University of Central Florida's Institute for Simulation and Training, and the U.S. Military Academy at West Point.
ARES was largely designed as a tangible user interface, in which digital information is manipulated through physical objects such as a person's hand. It was constructed from commercial off-the-shelf components, including a projector, a laptop, an LCD monitor, and Microsoft's Xbox Kinect sensor, combined with government-developed ARES software. Both the projector and the Kinect sensor face down over the sandbox: the projector casts a digital overlay onto the sand, while the Kinect sensor scans the surface of the map to detect user gestures within the boundaries of the sandbox.
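How the downward-facing depth data might be used to detect a gesture over the sand can likewise be sketched in hypothetical terms. In the Python example below, any pixel inside an assumed sandbox region that reads markedly closer to the sensor than the most recent terrain baseline is treated as a candidate hand, and its centroid is reported; the region bounds and height threshold are illustrative assumptions rather than values from the ARES software.

```python
# Minimal sketch (not the government-developed ARES software) of gesture
# detection with a downward-facing depth sensor: anything well above the
# last-known sand surface inside the sandbox region counts as a gesture.
import numpy as np

SANDBOX_ROI = (slice(60, 420), slice(80, 560))   # assumed rows/cols covering the sandbox
HAND_THRESHOLD_MM = 60                           # assumed minimum height of a hand above the sand

def detect_gesture(depth_frame_mm, terrain_baseline_mm):
    """Return (gesture_present, (row, col) centroid of the region above the sand)."""
    roi_depth = depth_frame_mm[SANDBOX_ROI]
    roi_base = terrain_baseline_mm[SANDBOX_ROI]
    # Pixels closer to the sensor than the sand baseline by more than the threshold.
    above_sand = (roi_base - roi_depth) > HAND_THRESHOLD_MM
    if not above_sand.any():
        return False, None
    rows, cols = np.nonzero(above_sand)
    centroid = (int(rows.mean()) + SANDBOX_ROI[0].start,
                int(cols.mean()) + SANDBOX_ROI[1].start)
    return True, centroid
```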
During development, researchers explored incorporating features such as multi-touch surfaces, 3D holographic displays, and virtual environments, but budget restrictions limited their implementation.
ARL researchers showcased ARES publicly for the first time in September 2014 at the Modern Day Marine exhibition in Quantico, Virginia.

Uses

According to a 2015 technical report by ARL scientists, ARES has the following capabilities.