Organized a tutorial called Solving the Sensor-based Activity Recognition Problem (SOAR) at Ubicomp 2023!
We organized a tutorial called Solving the Sensor-based Activity Recognition Problem (SOAR) – Self-supervised, Multi-modal Recognition of Activities from Wearable Sensors at Ubicomp 2023 in Cancun, Mexico.
Overview
Feature extraction lies at the core of Human Activity Recognition (HAR): the automated inference of what activity is being performed. Traditionally, the HAR community used statistical metrics and distribution-based representations to summarize the movement present in windows of sensor data into feature vectors. More recently, learned representations have been used successfully in lieu of such handcrafted, manually engineered features. In particular, the community has shown substantial interest in self-supervised methods, which leverage large-scale unlabeled data to first learn useful representations that are subsequently fine-tuned to the target applications. In this tutorial, we focus on representations for single-sensor and multi-modal setups, and go beyond the current de facto standard of learning representations from scratch. We also discuss the economical use of existing representations, specifically via transfer learning and domain adaptation. The tutorial introduces state-of-the-art methods for representation learning in HAR, and provides a forum for researchers from mobile and ubiquitous computing to not only discuss the current state of the field but also to chart its future directions, including answering what it would take to finally solve the activity recognition problem.
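To make the pretrain-then-fine-tune recipe mentioned above concrete, here is a minimal sketch in PyTorch: a small 1D-CNN encoder is first pretrained on unlabeled accelerometer windows with a SimCLR-style contrastive (NT-Xent) objective, and then fine-tuned with a linear classifier on a labeled target set. The encoder architecture, augmentations, window size, number of classes, and hyperparameters are illustrative assumptions for this sketch, not the tutorial's reference implementation, and the data tensors are synthetic stand-ins.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """1D-CNN mapping a (batch, 3, 128) tri-axial sensor window to a feature vector."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(3, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

def augment(x):
    """Simple stochastic augmentations: additive jitter and random channel scaling."""
    noise = 0.05 * torch.randn_like(x)
    scale = 1.0 + 0.1 * torch.randn(x.size(0), x.size(1), 1, device=x.device)
    return (x + noise) * scale

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over two augmented views of the same batch."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d)
    sim = z @ z.t() / temperature                         # pairwise similarities
    sim.fill_diagonal_(float('-inf'))                     # mask self-similarity
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Stage 1: self-supervised pretraining on (synthetic) unlabeled sensor windows.
encoder = Encoder()
projector = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 32))
opt = torch.optim.Adam(list(encoder.parameters()) + list(projector.parameters()), lr=1e-3)
unlabeled = torch.randn(256, 3, 128)                      # stand-in for unlabeled data
for step in range(10):
    z1 = projector(encoder(augment(unlabeled)))
    z2 = projector(encoder(augment(unlabeled)))
    loss = nt_xent(z1, z2)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: fine-tune the pretrained encoder plus a linear head on a small labeled set.
classifier = nn.Linear(64, 6)                             # 6 activity classes (assumed)
ft_opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-4)
labeled_x, labeled_y = torch.randn(64, 3, 128), torch.randint(0, 6, (64,))
for step in range(10):
    logits = classifier(encoder(labeled_x))
    loss = F.cross_entropy(logits, labeled_y)
    ft_opt.zero_grad(); loss.backward(); ft_opt.step()
```

The same two-stage structure also underlies the transfer-learning setting touched on above: a representation pretrained on one (large, unlabeled) corpus is reused, with only the lightweight head and possibly the encoder fine-tuned on the smaller target dataset.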