Koc-TUM Physical Human-X Interaction Repository

Human-Robot Interaction Setup

This repository contains real (i.e. not simulated) haptic interaction data collected from both Human-Human and Human-Robot dyads in joint object manipulation scenarios. Both scenarios were realized at the ITR and supported in part by the DFG excellence initiative research cluster "Cognition for Technical Systems - CoTeSys". The scenarios involve two agents carrying a large table on ball casters in a laboratory setting. Force and position information is recorded during the interaction.


The dataset is collected as a result of joint research between Technische Universität München (ITR) and Koc University (Robotics and Mechatronics Laboratory and Intelligent User Interfaces Laboratory). The copyright of the data remains with these institutions.

Please note that this dataset is available for research purposes only. If you are interested in using the dataset, please cite the following paper:


A. Mörtl, M. Lawitzky, A. Kucukyilmaz, M. Sezgin, C. Basdogan, S. Hirche, "The Role of Roles: Physical Cooperation between Humans and Robots," International Journal of Robotics Research (IJRR), vol. 31, no. 13, 2012, pp. 1657-1675.


The experimental procedure and the details of how the data was collected can be found in the aforementioned paper. Should you have any queries, please direct them to Ayse Kucukyilmaz via e-mail.

Koc-TUM pHRI (physical Human-Robot Interaction) Dataset


This set consists of data collected from 18 human-robot dyads. For each dyad, the data is recorded at 1 kHz, smoothed with a low-pass filter with a 15 Hz cutoff frequency, and stored as a MATLAB struct.
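To illustrate the kind of smoothing described above, the sketch below applies a low-pass filter with a 15 Hz cutoff to a 1 kHz signal. The paper specifies the sampling rate and cutoff but not the filter type, so the first-order IIR filter here (and the synthetic force trace) are purely illustrative assumptions, not the authors' processing pipeline.

```python
import numpy as np

FS = 1000.0   # sampling rate of the pHRI recordings, Hz (from the dataset description)
FC = 15.0     # low-pass cutoff frequency, Hz (from the dataset description)

def lowpass(x, fs=FS, fc=FC):
    """Illustrative first-order IIR low-pass filter; the dataset's
    actual filter design is not specified in this summary."""
    alpha = 1.0 - np.exp(-2.0 * np.pi * fc / fs)  # smoothing coefficient
    y = np.empty_like(x, dtype=float)
    acc = float(x[0])
    for i, v in enumerate(x):
        acc += alpha * (v - acc)  # exponential smoothing step
        y[i] = acc
    return y

# Demo: smooth one second of a noisy synthetic "force" trace
np.random.seed(0)
t = np.arange(0.0, 1.0, 1.0 / FS)
force = np.sin(2 * np.pi * 2 * t) + 0.3 * np.random.randn(t.size)
smooth = lowpass(force)
```

Since the 2 Hz component lies well below the 15 Hz cutoff, the filter passes the underlying motion while suppressing the high-frequency noise.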



Koc-TUM pHHI (physical Human-Human Interaction) Dataset


This set consists of data collected from 6 human-human dyads. For each dyad, the data is low-pass filtered, downsampled to 25 Hz, and aligned. Motion is estimated by a Kalman filter fusing gyroscope and visual tracking data.
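The gyro-plus-vision fusion mentioned above can be sketched with a minimal one-dimensional Kalman filter: the gyro rate drives the prediction step and the visual tracker's position serves as the measurement. This is a hypothetical stand-in for the paper's estimator; the state dimension, noise variances `q` and `r`, and the demo trajectory are all assumptions for illustration only.

```python
import numpy as np

def fuse(gyro_rates, visual_pos, dt=1.0 / 25.0, q=1e-3, r=1e-2):
    """Minimal 1-D Kalman filter (illustrative, not the paper's estimator):
    gyro rate -> prediction, visual position -> measurement update.
    dt matches the dataset's 25 Hz rate; q and r are assumed variances."""
    x = float(visual_pos[0])  # state: position
    p = 1.0                   # state covariance
    est = []
    for w, z in zip(gyro_rates, visual_pos):
        # Predict: integrate the gyro rate over one sample period
        x += w * dt
        p += q
        # Update: correct with the visual tracker's position
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)
        p *= (1.0 - k)
        est.append(x)
    return np.array(est)

# Demo: recover a sinusoidal trajectory from an accurate rate signal
# and noisy visual position samples at 25 Hz
rng = np.random.default_rng(0)
t = np.arange(0.0, 4.0, 1.0 / 25.0)
true_pos = np.sin(t)
gyro = np.cos(t)                                   # true rate of change
visual = true_pos + 0.1 * rng.standard_normal(t.size)
est = fuse(gyro, visual)
```

Because the rate signal is nearly drift-free over this window, the fused estimate tracks the true trajectory more closely than the raw visual measurements alone.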