# Surveys & Benchmark Papers
Curated list of survey papers, benchmark overviews, and meta-analyses that contextualize human activity understanding across modalities.
## General Human Activity Recognition
- Aggarwal, J.K. and Ryoo, M.S. "Human Activity Analysis: A Review." ACM Computing Surveys, 2011.
- Lara, O.D. and Labrador, M.A. "A Survey on Human Activity Recognition using Wearable Sensors." IEEE Communications Surveys & Tutorials, 2013.
- Bulling, A., Blanke, U., Schiele, B. "A Tutorial on Human Activity Recognition Using Body-Worn Inertial Sensors." ACM Computing Surveys, 2014.
- Li, X. et al. "A Systematic Survey on Deep Learning for Human Activity Recognition." ACM Computing Surveys, 2022.
- Aviles-Cruz, C. et al. "Human Activity Recognition Using Deep Learning: A Review." Applied Sciences, 2023.
- Yadav, S.K. et al. "A Review of Multimodal Human Activity Recognition with Special Emphasis on Classification, Applications, Challenges and Future Directions." Knowledge-Based Systems, 2024.
- Mim, T.R. et al. "GAN-Based Data Augmentation for Human Activity Recognition: A Comprehensive Survey." IEEE Access, 2024.
## Video-based Action Recognition
- Herath, S. et al. "Going Deeper into Action Recognition: A Survey." Image and Vision Computing, 2017.
- Wu, Z. et al. "Compressed Video Action Recognition." CVPR, 2018. (Discussion of efficiency benchmarks.)
- Zhang, H. et al. "Revisiting Video Saliency: A Large-scale Benchmark and a Novel Model." CVPR, 2018.
- Liu, B. et al. "Deep Learning for Skeleton-based Human Action Recognition: A Survey." IEEE TPAMI, 2022.
## Skeleton & Pose-based Understanding
- Presti, L.L. and La Cascia, M. "3D Skeleton-based Human Action Classification: A Survey." Pattern Recognition, 2016.
- Chen, C. et al. "A Survey on Deep Learning for Skeleton-based Human Activity Recognition." IEEE TCSVT, 2020.
- Zheng, C. et al. "Deep Learning-Based Human Pose Estimation: A Survey." ACM Computing Surveys, 2023.
## Wearable & Multimodal Sensing
- Preece, S. et al. "Activity Identification Using Body-Mounted Sensors: A Review of Classification Techniques." Physiological Measurement, 2009.
- Wang, J. et al. "Deep Learning for Sensor-based Activity Recognition: A Survey." Pattern Recognition Letters, 2019.
- Guan, Y. and Plötz, T. "Ensembles of Deep LSTM Learners for Activity Recognition using Wearables." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2017. (Benchmark insights.)
- Hammerla, N.Y. et al. "Deep, Convolutional, and Recurrent Models for Human Activity Recognition Using Wearables." IJCAI, 2016.
## Egocentric & Multimodal Interaction
- Chaudhary, S. et al. "A Comprehensive Survey on Egocentric Vision for Human-computer Interaction." IEEE TPAMI, 2021.
- Grauman, K. et al. "Ego4D: Around the World in 3,000 Hours of Egocentric Video." CVPR, 2022.
- Damen, D. et al. "Scaling Egocentric Vision: The EPIC-KITCHENS Dataset." ECCV, 2018; extended as EPIC-KITCHENS-100, 2022.
## Datasets & Leaderboard Aggregators
- "Papers with Code: Human Activity Recognition." (Live SOTA tracking across multiple benchmarks.)
- "Awesome Human Action Recognition." (Community-maintained list; useful for cross-referencing.)
- "Monash Ubiquitous Sensing Lab HAR Compendium." (Wearable HAR dataset catalog.)
## How to Use
- Reference these surveys when motivating dataset gaps or research directions.
- Include relevant papers in dataset cards under "Benchmarks & Baselines" when they establish evaluation standards.
- Suggest new survey entries by opening a documentation PR referencing DOI/arXiv links.
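To make the dataset-card convention above concrete, here is a minimal sketch of what a "Benchmarks & Baselines" section might look like. The dataset name, metrics, and link targets are illustrative placeholders, not entries from an actual card in this repository; the cited surveys are drawn from the lists above.

```markdown
## Benchmarks & Baselines

<!-- Hypothetical card section: dataset name, protocol, and links are placeholders. -->
- **Evaluation standard:** leave-one-subject-out cross-validation with mean F1,
  following the protocol discussion in Hammerla et al. (IJCAI, 2016).
- **Survey context:** Wang et al., "Deep Learning for Sensor-based Activity
  Recognition: A Survey" (Pattern Recognition Letters, 2019).
- **Live leaderboard:** Papers with Code page for this dataset (link DOI/arXiv
  entries when opening the documentation PR).
```

A fragment like this keeps the card self-documenting: a reader can trace each baseline number back to the survey or benchmark paper that established the evaluation standard.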