About Our Work

Sculpting Visualizations grew out of the possibilities that advances in 3D printing, augmented reality (AR), and virtual reality (VR) create for computing tools that integrate with our physical environment and take advantage of our physical abilities. We study how these technologies can support 3D data visualization, an increasingly important and common scientific activity. The project is funded by a grant from the National Science Foundation’s Information and Intelligent Systems Division.

The project’s interdisciplinary team includes computer scientists, artists, neuroscientists, geologists, and oceanographers. Daniel Keefe and Francesca Samsel are the Principal Investigators. Learn more about the team here.

Projected Research and Outcomes

  • Use 3D printing to develop physical representations of 3D data (“physical data forms”) that match the needs of specific analysis domains and tasks, and through this work develop general principles for designing 3D visualizations.
  • Design hybrid spaces that combine AR visual representations with physical data forms, exploring how to leverage the strengths of each and developing interaction techniques that work through both the virtual and physical forms.
  • Create tools that leverage these design principles and interaction techniques to allow scientists to create new physical data forms and hybrid visualizations to address outstanding data analysis challenges in brain imaging, geology, and, ultimately, many scientific fields.
  • Support interdisciplinary courses at the intersection of art, science, computing, and data visualization at the PIs’ institutions. Students will also be trained in research methods and will work with the research team to develop public science museum exhibits that raise awareness of both the technology and the science involved.

Methodology

To leverage the possibilities of rapid, creative, artistic iteration and exploration of physical form, the team will develop interfaces and algorithms for capturing and extracting properties of physical forms; tools for exploring mappings between these properties and 3D data; a design theory and taxonomy; and a library catalog for using physical inputs in visualization processes. Learn more about the library here.
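
As a rough illustration of what such a property-to-data mapping might look like computationally, the sketch below pairs a property extracted from a scanned physical form with a variable in a 3D dataset. The class names, value ranges, and the simple linear mapping are illustrative assumptions only; they are not part of the project's actual tools or library.

    # Hypothetical sketch (not project code): one way a mapping between
    # captured physical-form properties and 3D data variables might be
    # represented and explored. All names and ranges are assumptions.

    from dataclasses import dataclass

    @dataclass
    class FormProperty:
        """A measurable property extracted from a scanned physical form."""
        name: str          # e.g., "surface curvature"
        min_value: float   # lower bound of the captured range
        max_value: float   # upper bound of the captured range

    @dataclass
    class DataVariable:
        """A variable in the 3D scientific dataset being visualized."""
        name: str          # e.g., "ocean temperature (degrees C)"
        min_value: float
        max_value: float

    def map_value(prop: FormProperty, var: DataVariable, data_value: float) -> float:
        """Linearly map a data value onto the captured range of a form property."""
        t = (data_value - var.min_value) / (var.max_value - var.min_value)
        return prop.min_value + t * (prop.max_value - prop.min_value)

    # Example: drive a sculpted ridge's curvature with ocean temperature.
    curvature = FormProperty("surface curvature", 0.1, 2.5)
    temperature = DataVariable("ocean temperature (degrees C)", -2.0, 30.0)
    print(map_value(curvature, temperature, 14.0))  # curvature value for 14 degrees C

In practice, the project's catalog and design tools would let artists and scientists explore many such pairings; the linear mapping above is only the simplest possible choice, shown to make the idea concrete.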

These physical elements will be augmented with co-located, head-tracked stereoscopic displays that incorporate the printed objects directly into the AR experience, along with touch-based interaction techniques such as touch-sensitive input surfaces or physical widgets built into the printed objects. These visualizations will be evaluated through user studies based on existing methodologies for comparing 3D vector field visualization methods. The team will then develop exploratory visualization tools that use streamlined versions of the catalog and the visualizations developed earlier to help manage the complexity of creating new visualizations while teaching visual design processes to scientists. By recasting the scientific task of data exploration as a creative process of visualization design, these tools aim to support learning, engagement, and effective analysis. They will be iteratively developed by teams of art and computer science students in conjunction with domain scientists and used to facilitate data exploration and discovery, as well as to bring science more directly into the public sphere through interactive experiences such as science museum exhibits.

NSF Awards 1704604 and 1704904

September 1, 2017 – August 31, 2021 (Estimated)

PI (UMN): Professor Daniel Keefe

PI (UT-Austin): Francesca Samsel