Vsens Toolkit: AR-based Open-ended System for Virtual Sensors
Fengzhou Liang, Tian Min, Chengshuo Xia, and Yuta Sugiura
In Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 2025
The emergence of virtual sensors in recent years has opened up new possibilities for the development of human activity recognition (HAR) systems. For instance, cross-modal approaches can synthesize virtual sensor data, such as accelerometer readings, from widely available online multimedia resources, compensating for scarce sensor datasets. However, existing solutions for virtual sensors primarily focus on batch pipelines, relying heavily on lengthy processing workflows and sophisticated computer vision techniques, which often lack the interactivity and flexibility needed for customized, small-scale scenarios. In this work, we present the Vsens Toolkit, an AR-based open-ended system for virtual sensors, which serves as a preliminary exploration of the user interface for virtual sensors. It integrates functionalities such as scene construction, data collection, data augmentation, and visualization. In this interactive demonstration, we showcase exemplar scenarios including wearable accelerometers, capacitive sensing, wrist-worn sensor tracking, and a sandbox for free exploration (Figure 1).
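The abstract does not specify how virtual accelerometer data is synthesized; a minimal sketch of one common cross-modal approach, deriving virtual accelerometer samples from a tracked 3D position trajectory (e.g., a wrist joint from video-based pose estimation) by second-order finite differencing, might look like this (the function name and the omission of gravity and sensor orientation are illustrative assumptions, not the toolkit's actual method):

```python
import numpy as np

def virtual_accelerometer(positions, dt):
    """Approximate accelerometer readings from a sampled 3D position
    trajectory using second-order central finite differences.

    positions: (N, 3) array of tracked positions in meters
    dt: sampling interval in seconds
    returns: (N-2, 3) array of accelerations in m/s^2

    Note: a real synthesis pipeline would also add the gravity vector
    and rotate into the sensor's local frame; both are omitted here.
    """
    p = np.asarray(positions, dtype=float)
    # Central second difference: a[i] = (p[i+1] - 2*p[i] + p[i-1]) / dt^2
    return (p[2:] - 2 * p[1:-1] + p[:-2]) / dt**2

# Example: a trajectory with constant 2 m/s^2 acceleration along x
dt = 0.01
t = np.arange(0.0, 1.0, dt)
traj = np.stack([t**2, np.zeros_like(t), np.zeros_like(t)], axis=1)
acc = virtual_accelerometer(traj, dt)  # x-component is ~2.0 throughout
```

The same differencing idea applies to any tracked point in an AR scene, which is what makes position-tracked virtual objects usable as stand-ins for inertial sensors.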