
Multimodal user interface for remote object exploration with sparse sensory data

Reggiani, Monica
2002

Abstract

This paper describes the design and current state of development of a multimodal user interface enabling exploration of remote environments by means of a teleoperated robot. Human-robot interaction consists of visual and tactile feedback to the user and gesture-based teleoperation of the robot. The user interface includes surface reconstruction of the remote environment, integrating data sampled by tactile and proximity sensors; cutaneous perception of distance information through the vibrotactile actuators integrated in a virtual reality glove (the CyberTouch, from Immersion Corp.); and optional streaming of video for bandwidth-rich settings. Preliminary results show that the VR glove enables the operator to drive the exploration of the remote environment in a natural fashion and to react promptly to the sensed information.
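The abstract mentions conveying distance information to the operator through the glove's vibrotactile actuators. A minimal sketch of one plausible way to do this is shown below, assuming a simple linear mapping from sensed proximity to vibration amplitude; the function name, thresholds, and the linear law are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: map a distance sensed by a proximity sensor to a
# vibrotactile amplitude in [0, 1] for a fingertip actuator.
# d_min, d_max and the linear falloff are assumptions for illustration.

def distance_to_amplitude(distance_m, d_min=0.02, d_max=0.50):
    """Return a vibration amplitude in [0, 1] for a sensed distance (metres).

    Below d_min the actuator saturates at full strength; beyond d_max
    no vibration is produced; in between the amplitude falls linearly.
    """
    if distance_m <= d_min:
        return 1.0
    if distance_m >= d_max:
        return 0.0
    return (d_max - distance_m) / (d_max - d_min)
```

Under this kind of mapping, the operator feels stronger vibration as the robot's sensors approach an obstacle, supporting the prompt reaction to sensed information reported in the preliminary results.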
IEEE Int. Workshop on Robot and Human Interactive Communication (ROMAN 2002)
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/188441
Citations
  • PMC: ND
  • Scopus: 3
  • ISI: ND