
MultiPhysio‑HRC: Multimodal Physiological Signals Dataset for Industrial Human‑Robot Collaboration

Official companion repository for the paper “MultiPhysio‑HRC: Multimodal Physiological Signals Dataset for Industrial Human‑Robot Collaboration.” This repo hosts code, docs, and assets to reproduce our preprocessing, feature extraction, and baseline models.



✨ TL;DR


Access

Dataset: https://zenodo.org/records/17225571

Paper (PDF): https://arxiv.org/abs/2510.00703

GitHub (code, preprocessing, baselines): https://github.com/automation-robotics-machines/MultiPhysio-HRC


Overview

MultiPhysio‑HRC is a multimodal dataset and toolkit for mental‑state estimation in industrial Human‑Robot Collaboration (HRC). It combines physiological signals (EEG, ECG, EDA, EMG, respiration) with voice and facial action units, collected during realistic HRC scenarios, manual tasks, VR‑based stressors, cognitive tasks, and rest.

Use it to study stress, cognitive load, and valence/arousal/dominance, and to build human‑aware, adaptive robotic systems.
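As a toy illustration of the kind of feature extraction such studies rely on (not the repository's actual pipeline; function and variable names here are hypothetical), the sketch below computes mean heart rate from ECG R‑peak timestamps, a common input to stress and cognitive‑load models:

```python
import numpy as np

def heart_rate_from_rpeaks(rpeak_times_s: np.ndarray) -> float:
    """Mean heart rate in BPM from R-peak timestamps given in seconds."""
    rr_intervals = np.diff(rpeak_times_s)   # RR intervals between beats (s)
    return 60.0 / rr_intervals.mean()       # average beats per minute

# Synthetic R-peaks spaced 0.8 s apart (i.e., a steady 75 BPM rhythm)
peaks = np.cumsum(np.full(10, 0.8))
print(round(heart_rate_from_rpeaks(peaks)))  # → 75
```

Real ECG would first require R‑peak detection and artifact rejection; dedicated toolkits (e.g., NeuroKit2) handle those steps more robustly.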


Highlights


Media

Figures and demo videos will be added here.


Dataset at a Glance

Participants

Sessions & Tasks

Modalities

Ground truth

Ethics


Citation

If you use MultiPhysio‑HRC or this code, please cite the paper:

@misc{bussolan2025multiphysiohrcmultimodalphysiologicalsignals,
      title={MultiPhysio-HRC: Multimodal Physiological Signals Dataset for Industrial Human-Robot Collaboration},
      author={Andrea Bussolan and Stefano Baraldo and Oliver Avram and Pablo Urcola and Luis Montesano and Luca Maria Gambardella and Anna Valente},
      year={2025},
      eprint={2510.00703},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2510.00703}, 
}

Acknowledgments & Funding


Ethics & License

This dataset was collected under institutional ethical approval (SUPSI), with informed consent from all participants.


Contact


Maintainers: Andrea Bussolan, Stefano Baraldo.