# Centaur: Robust Multimodal Fusion for Human Activity Recognition


This repository contains the implementation of the paper "Centaur: Robust Multimodal Fusion for Human Activity Recognition".

## Directories

## Instructions

- To train Centaur's data cleaning module, use `DE-train-PAMAP2.ipynb` for the PAMAP2 dataset. Centaur can similarly be trained on the Opportunity and HHAR datasets by choosing the corresponding notebooks in the `Centaur` directory.
- To train Centaur's self-attention CNN module for HAR, use `Eval-PAMAP2-ConvAttn.ipynb`.
- To evaluate Centaur's end-to-end robust multimodal fusion performance on the PAMAP2 dataset, use `DE-test-PAMAP2.ipynb`. You will need to insert the paths generated after training the data cleaning and self-attention modules before evaluating the model (a sketch of running the notebooks non-interactively is shown after this list).
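
The notebooks can also be executed non-interactively, which is convenient on a remote or headless machine. The following is only a minimal sketch, assuming `nbformat` and `nbconvert` are installed in the same environment as the other dependencies; the output filename is arbitrary.

```python
# Minimal sketch: execute one of the training notebooks headlessly and save
# the executed copy. Run this from the directory that contains the notebook.
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

NOTEBOOK = "DE-train-PAMAP2.ipynb"          # or Eval-PAMAP2-ConvAttn.ipynb, DE-test-PAMAP2.ipynb
OUTPUT = "DE-train-PAMAP2.executed.ipynb"   # arbitrary output name, not part of the repository

# Load the notebook in the current (v4) format.
nb = nbformat.read(NOTEBOOK, as_version=4)

# Run every cell with the python3 kernel; disable the per-cell timeout,
# since training cells can take a long time.
ep = ExecutePreprocessor(timeout=None, kernel_name="python3")
ep.preprocess(nb, {"metadata": {"path": "."}})

# Keep the executed notebook (with outputs) for inspection.
nbformat.write(nb, OUTPUT)
```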

## Datasets

Scripts for preprocessing the data can be found in the corresponding directories. The original (unprocessed) datasets are available at the following links:

## Dependencies

| Package      | Version |
| ------------ | ------- |
| Python3      | 3.8.13  |
| PyTorch      | 1.10.2  |
| TensorFlow   | 2.8.0   |
| scikit-learn | 1.1.2   |
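
As a quick sanity check that your environment matches the versions above, you can print the installed versions; this snippet is only a convenience sketch, not part of the repository.

```python
# Print installed versions to compare against the dependency table above.
import sys
import sklearn
import tensorflow as tf
import torch

print("Python       :", sys.version.split()[0])  # expected 3.8.13
print("PyTorch      :", torch.__version__)       # expected 1.10.2
print("TensorFlow   :", tf.__version__)          # expected 2.8.0
print("scikit-learn :", sklearn.__version__)     # expected 1.1.2
```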

## License

Refer to the file `LICENCE`.

## Citation

Sanju Xaviar, Xin Yang, and Omid Ardakanian. 2023. Robust Multimodal Fusion for Human Activity Recognition. arXiv preprint arXiv:2303.04636.

```bibtex
@misc{https://doi.org/10.48550/arxiv.2303.04636,
  doi       = {10.48550/ARXIV.2303.04636},
  url       = {https://arxiv.org/abs/2303.04636},
  author    = {Xaviar, Sanju and Yang, Xin and Ardakanian, Omid},
  keywords  = {Machine Learning (cs.LG), Signal Processing (eess.SP), FOS: Computer and information sciences, FOS: Electrical engineering, electronic engineering, information engineering},
  title     = {Robust Multimodal Fusion for Human Activity Recognition},
  publisher = {arXiv},
  year      = {2023},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```