User Interaction Layer for Mixed Reality Robot Programming

Context

This project addresses the fundamental challenge of programming and controlling robot swarms through intuitive interfaces. We operate in the domain of Mixed Reality robotics interfaces, spatial computing, and multi-robot coordination systems. Traditional robot programming requires complex coding or abstract 2D interfaces that disconnect users from the spatial nature of robotic coordination. This work contributes to the broader SwarmOps research initiative by developing the human-machine interface components necessary for effective human oversight of collaborative cyber-physical systems.

Motivation

Programming robot swarms currently demands expertise in robotics-specific programming languages and considerable abstract thinking to translate spatial coordination into code. This creates a barrier that prevents domain experts (who understand the tasks robots should perform) from directly programming robot behaviors. Existing graphical interfaces remain rooted in 2D paradigms that poorly represent the 3D spatial relationships crucial for multi-robot coordination. Mixed Reality technology offers the potential for direct spatial manipulation of robot behaviors, but the corresponding interaction methods remain underdeveloped. Without intuitive programming interfaces, the deployment of robot swarms remains limited to specialists rather than expanding to the domain experts who could benefit most from robotic automation.

Goal

The student will create a user interaction layer for natural programming of robot systems in Mixed Reality environments. The system will comprise:

- a gesture recognition system supporting intuitive hand-based robot control commands;
- voice command processing for natural-language robot instructions;
- spatial manipulation interfaces for direct 3D positioning and path planning;
- a visual programming language adapted for 3D spatial interaction, replacing traditional 2D paradigms;
- multi-modal feedback combining visual, auditory, and haptic responses;
- formation design tools based on drag-and-drop interaction in 3D space;
- user experience optimization addressing ergonomic factors and learning curves;
- user studies comparing MR programming efficiency with traditional interfaces.
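To make the intended interaction pipeline concrete, the sketch below shows one possible way a recognized gesture could be turned into per-robot commands: a hypothetical gesture event (name plus 3D hand position) translates a set of formation slots, and a greedy nearest-robot assignment produces "goto" commands. All names (`GestureEvent`, `pinch_drag`, the command tuples) are illustrative assumptions, not part of the project specification; a real system would use the chosen MR toolkit's gesture API and an optimal assignment algorithm.

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class GestureEvent:
    """Output of a hypothetical MR gesture recognizer."""
    name: str                             # e.g. "point", "pinch_drag"
    position: tuple[float, float, float]  # hand position in the world frame


def assign_formation(robots, slots):
    """Greedily pair each formation slot with the nearest still-free robot.

    robots: dict robot_id -> (x, y, z); slots: list of (x, y, z) targets.
    Returns dict robot_id -> assigned target position.
    """
    free = dict(robots)
    assignment = {}
    for slot in slots:
        rid = min(free, key=lambda r: math.dist(free[r], slot))
        assignment[rid] = slot
        del free[rid]
    return assignment


def gesture_to_commands(event, robots, slots):
    """Map a gesture event to a list of ('goto', robot_id, target) commands."""
    if event.name == "pinch_drag":
        # Translate all formation slots so that the formation centroid
        # follows the user's hand position.
        cx = tuple(sum(s[i] for s in slots) / len(slots) for i in range(3))
        offset = tuple(event.position[i] - cx[i] for i in range(3))
        slots = [tuple(s[i] + offset[i] for i in range(3)) for s in slots]
    targets = assign_formation(robots, slots)
    return [("goto", rid, pos) for rid, pos in targets.items()]
```

The greedy assignment is deliberately simple; swapping in an optimal assignment (e.g. the Hungarian algorithm) would avoid crossing paths in dense formations, and the same dispatch structure could route voice-command events alongside gestures.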

Requirements

Pointers

Contact