Computer Graphics

University of California, Berkeley

Inferring Private Personal Attributes of Virtual Reality Users from Ecologically Valid Head and Hand Motion Data


Abstract

Motion tracking “telemetry” data lies at the core of nearly all modern virtual reality (VR) and metaverse experiences. While generally presumed innocuous, recent studies have demonstrated that motion data actually has the potential to uniquely identify VR users. In this study, we go a step further, showing that a variety of private user information can be inferred just by analyzing motion data recorded from VR devices. We conducted a large-scale survey of VR users (N=1,006) with dozens of questions ranging from background and demographics to behavioral patterns and health information. We then obtained VR motion samples of each user playing the game “Beat Saber,” and attempted to infer their survey responses using just their head and hand motion patterns. Using simple machine learning models, over 40 personal attributes could be accurately and consistently inferred from VR motion data alone. Despite this significant observed leakage, there remains limited awareness of the privacy implications of VR motion data, highlighting the pressing need for privacy-preserving mechanisms in multi-user VR applications.
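To make the kind of inference described above concrete, the sketch below shows how motion-derived summary features and a simple classifier could be wired together with scikit-learn. The feature choices, synthetic data, and model are illustrative assumptions for exposition only, not the study's actual features, dataset, or models.

```python
# Illustrative sketch only: synthetic telemetry, hypothetical features,
# and a generic scikit-learn classifier; not the authors' pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def featurize(session):
    """Summarize one motion session (T x 9: head/left-hand/right-hand xyz)."""
    head, left, right = session[:, 0:3], session[:, 3:6], session[:, 6:9]
    hand_speed = np.linalg.norm(np.diff(np.vstack([left, right]), axis=0), axis=1)
    return np.array([
        head[:, 1].max(),    # peak head height (a rough stature proxy)
        head[:, 1].std(),    # vertical head-movement variability
        hand_speed.mean(),   # average hand speed
        hand_speed.max(),    # peak hand speed
    ])

# Synthetic stand-in data: 200 users, 300 frames each, one binary attribute.
sessions = rng.normal(size=(200, 300, 9))
labels = rng.integers(0, 2, size=200)

X = np.array([featurize(s) for s in sessions])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real telemetry in place of the synthetic arrays, the same pattern (per-session feature vector, off-the-shelf classifier, held-out evaluation) is enough to probe whether a given survey attribute is predictable from head and hand motion alone.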

Citation

Vivek Nair, Christian Rack, Wenbo Guo, Rui Wang, Shuixian Li, Brandon Huang, Atticus Cull, James F. O'Brien, Marc Latoschik, Louis Rosenberg, and Dawn Song. "Inferring Private Personal Attributes of Virtual Reality Users from Ecologically Valid Head and Hand Motion Data". In IEEE Conference on Virtual Reality and 3D User Interfaces, pages 477–484. IEEE Computer Society, March 2024.

Supplemental Material

Code

Source code for Deep Motion Masking