Physical rendering processes for more graspable extended reality data visualizations

Abstract

This dissertation examines new approaches to extended reality visualization that use physical rendering processes to make data more graspable: both more comprehensible and more embodied. Current extended reality visualizations enable new data insights and better comprehension of spatial data, but they lack the physical embodiment and tactility that have been crucial to past scientific breakthroughs. Current data physicalizations have this tactility yet lack the interactivity required for today's scientific sensemaking processes. In this dissertation, I explore new ways to combine the embodiment of data physicalizations and the comprehensibility of extended reality visualizations to make data more graspable using physical rendering processes, which make it possible to create physical objects from digital data and to incorporate physical elements from the real world into digital visualizations. The dissertation presents four main contributions supported by physical rendering processes that span the extended reality continuum. To begin, I present a systematic design exploration of the first 3D-printed spatial data physicalizations encoding scalar data as glyphs on a 3D surface using traditional, forward physical rendering processes. Then, I introduce a new, inverse physical rendering process: a software architecture and user interface that enable the design of multivariate 3D spatial data visualizations with handcrafted physical media. Third, combining the forward and inverse physical rendering processes from the first two contributions, I present a new approach to querying 3D spatial data using multi-touch input directly on a data physicalization. Finally, I present the design and results of the first empirical study on the effectiveness of data physicalizations for spatial data analysis tasks when compared with state-of-the-art virtual reality and 2D geographic information science visualizations. Multiple conclusions are drawn from these explorations of spatial data physicalization and extended reality visualization. First, data physicalizations have the potential to be more comprehensible than digital visualizations when completing spatial data analysis tasks, and such benefits are most likely tied to the act of physically touching the data. Second, inverse physical rendering processes that encode digital data with elements from the physical world bring a new level of embodiment to digital extended reality visualizations, and they show many of the benefits of physicalization, potentially through imagined touch. Lastly, new approaches that tightly integrate physicalizations with digital interactivity lead to new opportunities for collaboration and engagement in scientific sensemaking processes.

Description

University of Minnesota Ph.D. dissertation. December 2023. Major: Computer Science. Advisor: Daniel Keefe. 1 computer file (PDF); viii, 153 pages.

Suggested citation

Herman, Bridger. (2023). Physical rendering processes for more graspable extended reality data visualizations. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/271358.
