An open-source, end-to-end toolkit for conducting behavioral XR experiments on Meta standalone head-mounted displays
Addressing technical barriers and the lack of standardized data formats in XR research
Widespread adoption of XR in behavioral research is hindered by high technical barriers and the absence of standardized data formats. Creating immersive experiments demands specialized programming skills, and researchers lack accessible templates designed for scientific applications.
ResXR provides a Unity-based experiment template for multimodal data capture, alongside a Python processing pipeline that automates validation, preprocessing, and quality reporting. The pipeline's design is inspired by established neuroimaging tools such as fMRIPrep.
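The validate-preprocess-report flow described above can be sketched as follows. This is a minimal illustration, not the ResXR API: the function names (`validate`, `preprocess`, `quality_report`) and the gap-filling strategy are assumptions chosen for the example.

```python
import statistics

def validate(samples, rate_hz):
    """Collect basic integrity issues before any processing (illustrative checks)."""
    issues = []
    if rate_hz <= 0:
        issues.append("non-positive sampling rate")
    missing = sum(s is None for s in samples)
    if missing:
        issues.append(f"{missing} missing sample(s)")
    return issues

def preprocess(samples):
    """Toy preprocessing: fill missing samples by carrying the last value forward."""
    filled, last = [], 0.0
    for s in samples:
        last = last if s is None else s
        filled.append(last)
    return filled

def quality_report(samples, rate_hz, issues):
    """Summarize the recording for an automated quality report."""
    return {
        "n_samples": len(samples),
        "duration_s": len(samples) / rate_hz,
        "mean": statistics.fmean(samples),
        "issues": issues,
    }

raw = [0.1, 0.2, None, 0.4]           # a tiny synthetic signal with one dropout
issues = validate(raw, rate_hz=2.0)
clean = preprocess(raw)
report = quality_report(clean, 2.0, issues)
```

The point of the pattern, as in fMRIPrep, is that validation findings travel with the data into the final report rather than being silently discarded.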
Meta standalone headsets (Quest 2, Quest Pro, Quest 3)
Motion-BIDS compatible data for reproducible research
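A Motion-BIDS recording is a header-less `_motion.tsv` data file whose columns are described by a `_channels.tsv` file, plus a `_motion.json` metadata sidecar. The sketch below writes such a triplet; the filenames follow the BIDS motion specification, while the concrete task and tracking-system labels, the writer function, and the 72 Hz rate are illustrative assumptions, not ResXR defaults.

```python
import csv
import json
import tempfile
from pathlib import Path

def write_motion_bids(root, sub, task, tracksys, channels, rows, rate_hz):
    """Write one Motion-BIDS-style recording (hypothetical helper, not the ResXR API)."""
    out = Path(root) / f"sub-{sub}" / "motion"
    out.mkdir(parents=True, exist_ok=True)
    stem = f"sub-{sub}_task-{task}_tracksys-{tracksys}"
    # _motion.tsv: one row per sample, one column per channel, no header row
    with open(out / f"{stem}_motion.tsv", "w", newline="") as f:
        csv.writer(f, delimiter="\t").writerows(rows)
    # _channels.tsv: describes each column of the motion file
    with open(out / f"{stem}_channels.tsv", "w", newline="") as f:
        w = csv.writer(f, delimiter="\t")
        w.writerow(["name", "component", "type", "tracked_point", "units"])
        w.writerows(channels)
    # _motion.json: sidecar metadata
    (out / f"{stem}_motion.json").write_text(
        json.dumps({"SamplingFrequency": rate_hz, "TaskName": task}, indent=2)
    )
    return stem

# Demo: a head-position stream with x/y/z channels (labels are examples).
demo_channels = [
    ["head_x", "x", "POS", "Head", "m"],
    ["head_y", "y", "POS", "Head", "m"],
    ["head_z", "z", "POS", "Head", "m"],
]
demo_rows = [[0.0, 1.6, 0.0], [0.01, 1.61, 0.02]]
with tempfile.TemporaryDirectory() as tmp:
    stem = write_motion_bids(tmp, "01", "navigation", "HMD",
                             demo_channels, demo_rows, rate_hz=72.0)
    produced = sorted(p.name for p in (Path(tmp) / "sub-01" / "motion").iterdir())
```

Keeping exports in this layout is what makes recordings interoperable with generic BIDS tooling downstream.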
ResXR comprises four hierarchical stages spanning Unity-based data collection and Python-based post-processing
Unity-based experiment template for multimodal data capture
Quality checks and validation
Preprocessing pipeline
Reports and export
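The four stages above can be expressed as an ordered pipeline. Only the stage order mirrors ResXR; the driver, the `Context` dictionary, and the stage bodies are placeholder assumptions for illustration.

```python
from typing import Callable

Context = dict  # shared state threaded through the stages

def collect(ctx: Context) -> Context:
    """Stage 1: ingest raw multimodal captures from the headset."""
    ctx.setdefault("log", []).append("collect")
    return ctx

def quality_check(ctx: Context) -> Context:
    """Stage 2: run quality checks and validation on the raw data."""
    ctx["log"].append("quality_check")
    return ctx

def preprocess(ctx: Context) -> Context:
    """Stage 3: apply the preprocessing pipeline to validated streams."""
    ctx["log"].append("preprocess")
    return ctx

def report(ctx: Context) -> Context:
    """Stage 4: render quality reports and export standardized data."""
    ctx["log"].append("report")
    return ctx

PIPELINE: list[Callable[[Context], Context]] = [
    collect, quality_check, preprocess, report,
]

def run(ctx: Context) -> Context:
    """Drive the stages in order, passing the context along."""
    for stage in PIPELINE:
        ctx = stage(ctx)
    return ctx

result = run({})
```

Structuring the stages as a fixed ordered list keeps each step independently testable while guaranteeing that export never runs on unvalidated data.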