Head and gaze control of a telepresence robot with an HMD

John Paulin Hansen, Zhongyu Wang, Alexandre Alapetite, Katsumi Minakata, Martin Thomsen, Guangtao Zhang

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Gaze interaction with telerobots is a new opportunity for wheelchair users with severe motor disabilities. We present a video showing how head-mounted displays (HMDs) with gaze tracking can be used to monitor a robot that carries a 360° video camera and a microphone. Our interface supports autonomous driving via waypoints on a map, along with gaze-controlled steering and gaze typing. It is implemented with Unity, which communicates with the Robot Operating System (ROS).
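The abstract does not detail how the Unity front end talks to ROS or how gaze input is turned into motion. As a minimal sketch only, the snippet below assumes a common pattern in which a normalized horizontal gaze offset is mapped to an angular velocity published on a /cmd_vel topic with rospy; the topic name, gains, and update rate are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (assumptions, not the authors' implementation): map a
# normalized horizontal gaze offset to a differential-drive steering command.
import rospy
from geometry_msgs.msg import Twist

LINEAR_SPEED = 0.3   # m/s, assumed constant forward speed while driving
ANGULAR_GAIN = 1.2   # rad/s per unit of horizontal gaze offset (assumed)

def gaze_to_cmd_vel(gaze_x):
    """gaze_x in [-1, 1]: -1 = far left of the HMD view, +1 = far right."""
    cmd = Twist()
    cmd.linear.x = LINEAR_SPEED
    cmd.angular.z = -ANGULAR_GAIN * gaze_x  # looking right yields a right turn
    return cmd

if __name__ == "__main__":
    rospy.init_node("gaze_steering_sketch")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(20)        # 20 Hz control loop (assumed)
    while not rospy.is_shutdown():
        gaze_x = 0.0             # placeholder; the HMD/Unity side would stream this value
        pub.publish(gaze_to_cmd_vel(gaze_x))
        rate.sleep()
```

In practice the Unity side would forward the gaze estimate over a bridge (for example a websocket-based rosbridge connection) rather than computing it in the ROS node itself; that choice is likewise an assumption here.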

Original language: English
Title of host publication: Proceedings - ETRA 2018: 2018 ACM Symposium on Eye Tracking Research and Applications
Number of pages: 3
Volume: Part F137344
Publisher: Association for Computing Machinery
Publication date: 2018
Article number: a82
ISBN (Electronic): 9781450357067
DOIs
Publication status: Published - 2018
Event: 10th ACM Symposium on Eye Tracking Research & Applications (ETRA 2018), Warsaw, Poland
Duration: 14 Jun 2018 – 17 Jun 2018

Conference

Conference: 10th ACM Symposium on Eye Tracking Research & Applications (ETRA 2018)
Country/Territory: Poland
City: Warsaw
Period: 14/06/2018 – 17/06/2018

Keywords

  • Accessibility
  • Assistive technology
  • Experience prototyping
  • Gaze interaction
  • Head-mounted displays
  • Human-robot interaction
  • Telepresence
  • Telerobot
  • Virtual reality
