Department of Rehabilitation Sciences
Multi-Modal Shared Control

Comparative Evaluation of Visual, Auditory and Vibrotactile Cues for Adaptive DoF Mapping in Assistive Robotics

Motivation

People with severe physical disabilities often depend on robotic assistance to carry out everyday tasks such as eating or drinking. Assistive robotic arms offer significant potential for fostering autonomy and social participation, but controlling these systems remains a key challenge, particularly when users must manually operate multiple degrees of freedom (DoF). Existing shared control systems aim to reduce this burden by suggesting context-relevant movements. Yet these suggestions are typically conveyed visually, which poses significant accessibility barriers for users with limited visual perception.

Goals and Approach

This project explores how adaptive robot suggestions can be made more accessible and comprehensible through multi-modal feedback. It systematically compares visual, auditory, and vibrotactile cues within a shared control framework. The technical foundation is the modular AdaptiX XR platform, which will be extended with a novel vibrotactile interface based on a patented method (DE102022122173B4) that delivers spatial information directly via the skin. In a mixed-reality-based user study, the project will evaluate each modality’s impact on task performance, cognitive load, and user acceptance using both quantitative and qualitative methods.

Innovations and Perspectives

The project introduces a novel method for non-visual, spatial communication of robotic suggestions and offers the first direct comparison of visual, auditory, and vibrotactile modalities in this context. The results will inform inclusive, adaptive interaction design in human-robot collaboration, support the development of accessible assistive technologies, and shed light on adaptive, explainable, and multimodal shared control systems. In the long term, the extended AdaptiX framework will be made available as an open research platform to support systematic investigations in inclusive HRI.

Duration

10/2025 - 12/2027

The project is funded by the Young Academy of TU Dortmund University.

© Schulte-Linnemann / TU Dortmund
