Open Source Software
Koc-TUM Physical Human-X Interaction Repository
This repository contains real (i.e. not simulated) haptic interaction data collected from both human-human and human-robot dyads in joint object manipulation scenarios. Both scenarios were realized at the ITR and supported in part by the DFG excellence initiative research cluster "Cognition for Technical Systems - CoTeSys" (www.cotesys.org). In each scenario, two agents carry a large table on ball casters in a laboratory setting, and force and position data are recorded during the interaction.
The dataset is collected as a result of joint research between Technische Universität München (ITR) and Koc University (Robotics and Mechatronics Laboratory and Intelligent User Interfaces Laboratory). The copyright of the data remains with these institutions.
Please note that this dataset is available for research purposes only. If you use the dataset, please cite the following paper:
A. Mörtl, M. Lawitzky, A. Kucukyilmaz, M. Sezgin, C. Basdogan, S. Hirche, "The Role of Roles: Physical Cooperation between Humans and Robots," International Journal of Robotics Research (IJRR), vol. 31, no. 13, pp. 1657-1675, 2012.
Koc-TUM pHRI (physical Human-Robot Interaction) Dataset
This set consists of data collected from 18 human-robot dyads. For each dyad, the data is recorded at 1 kHz, smoothed by low-pass filtering with a 15 Hz cutoff frequency, and stored as a Matlab struct.
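The smoothing step can be illustrated with a minimal sketch. The dataset only states the cutoff (15 Hz) and sampling rate (1 kHz), not the filter order or type, so the first-order IIR filter below is an assumption for illustration, not the filter actually used:

```python
import math

def lowpass(samples, fs=1000.0, fc=15.0):
    """First-order IIR low-pass filter: 15 Hz cutoff at 1 kHz sampling.

    Illustrative sketch only -- the dataset specifies the cutoff
    frequency but not the exact filter design.
    """
    # Smoothing coefficient derived from the RC time constant
    # that corresponds to the cutoff frequency.
    rc = 1.0 / (2.0 * math.pi * fc)
    dt = 1.0 / fs
    alpha = dt / (rc + dt)
    out = []
    y = samples[0]
    for x in samples:
        y = y + alpha * (x - y)   # exponential smoothing step
        out.append(y)
    return out

# A 200 Hz component sampled at 1 kHz is strongly attenuated,
# while slow force/position trends pass through.
noisy = [math.sin(2 * math.pi * 200 * i / 1000.0) for i in range(2000)]
smooth = lowpass(noisy)
```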
- Readme.txt (2.10 KB)
- Interaction data
- KocTUM_HR_data.zip (2.54 GB)
- KocTUM_HR_videos.zip (1.43 GB)
- Source code for generating video files
- simulateTrialsHRI.zip (7.73 KB)
Koc-TUM pHHI (physical Human-Human Interaction) Dataset
This set consists of data collected from 6 human-human dyads. For each dyad, the data is low-pass filtered, downsampled to 25 Hz, and aligned. Motion is estimated by a Kalman filter fusing gyroscope and visual tracking data.
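The gyro/vision fusion can be sketched as a minimal one-dimensional Kalman filter running at the 25 Hz rate: the rate measurement drives the prediction step, and the visual position fix drives the correction step. The state vector, noise parameters, and dimensionality below are illustrative assumptions; the actual filter used for the dataset is not described in detail here:

```python
def kalman_fuse(rates, positions, dt=0.04, q=1e-3, r=1e-2):
    """Minimal 1-D Kalman filter fusing a measured rate (gyro-style,
    prediction step) with position fixes (visual tracking, correction
    step) at 25 Hz (dt = 0.04 s).

    q and r are assumed process and measurement noise variances,
    chosen for illustration only.
    """
    x = positions[0]   # state: position estimate
    p = 1.0            # estimate variance
    out = []
    for rate, z in zip(rates, positions):
        # Predict: integrate the measured rate over one time step.
        x = x + rate * dt
        p = p + q
        # Correct: blend in the visual position measurement.
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        out.append(x)
    return out
```

With consistent rate and position inputs the estimate tracks the true trajectory while the gain settles to a steady-state blend of the two sources.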