HAPT (Human Activities and Postural Transitions)

Summary

HAPT augments the classic UCI HAR dataset with additional postural transitions captured using smartphone sensors fixed to the waist. It enables models that discriminate subtle transitions (e.g., sitting to standing) and remains a staple for benchmarking lightweight sequence models.

Reference Paper

  • Davide Anguita et al. "A Public Domain Dataset for Human Activity Recognition Using Smartphones." ESANN, 2013.

Benchmarks & Baselines

  • SVM with hand-crafted features - Accuracy: 96.3%; Anguita et al., 2013.
  • DeepConvLSTM - Accuracy: 94.6% (10-fold CV); widely reproduced baseline.
  • Evaluation typically uses subject-wise 70/30 splits (no subject appears in both partitions); ensure transition windows are not leaked between sets.
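A minimal sketch of such a subject-wise split, holding out whole subjects rather than random windows (the function and array names, and the toy data, are illustrative):

```python
import numpy as np

def subject_wise_split(X, y, subjects, test_subjects):
    """Split samples so that no subject appears in both train and test sets."""
    subjects = np.asarray(subjects)
    test_mask = np.isin(subjects, test_subjects)
    return (X[~test_mask], y[~test_mask]), (X[test_mask], y[test_mask])

# Toy example: 6 windows from 3 subjects; hold out subject 3 entirely.
X = np.arange(12).reshape(6, 2)
y = np.array([0, 0, 1, 1, 2, 2])
subjects = np.array([1, 1, 2, 2, 3, 3])
(X_tr, y_tr), (X_te, y_te) = subject_wise_split(X, y, subjects, test_subjects=[3])
```

Splitting by subject, rather than shuffling windows, is what prevents near-duplicate adjacent windows from the same recording leaking across the train/test boundary.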

Known Challenges

  • Device placement is fixed (waist), limiting generalization to free-placement scenarios.
  • Postural transitions are shorter than activities; windowing strategy affects performance.
  • Sensor noise requires filtering (Butterworth) prior to feature extraction for classical pipelines.
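The classical preprocessing pipeline (Butterworth low-pass filtering, then fixed-length overlapping windows) can be sketched as below. The 50 Hz sampling rate and 2.56 s windows with 50% overlap match the dataset's published setup; the 20 Hz cutoff, filter order, and helper names are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0  # HAPT accelerometer/gyroscope sampling rate (Hz)

def lowpass(signal, cutoff_hz=20.0, order=3):
    """Zero-phase Butterworth low-pass filter for sensor-noise removal.

    cutoff_hz and order are illustrative choices, not dataset-mandated values.
    """
    b, a = butter(order, cutoff_hz / (FS / 2.0), btype="low")
    return filtfilt(b, a, signal)

def sliding_windows(signal, win_s=2.56, overlap=0.5):
    """Segment a 1-D signal into fixed-length, overlapping windows."""
    win = int(win_s * FS)            # 128 samples at 50 Hz
    step = int(win * (1 - overlap))  # 64-sample hop at 50% overlap
    return np.array([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

raw = np.random.default_rng(0).normal(size=1000)  # stand-in for one sensor axis
windows = sliding_windows(lowpass(raw))
```

Short postural transitions may span only one or two such windows, which is why the windowing choice (length and overlap) has an outsized effect on transition-class performance.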

Cite

@inproceedings{anguita2013hapt,
  title     = {A Public Domain Dataset for Human Activity Recognition Using Smartphones},
  author    = {Anguita, Davide and Ghio, Alessandro and Oneto, Luca and others},
  booktitle = {Proceedings of the 21st European Symposium on Artificial Neural Networks (ESANN)},
  year      = {2013}
}