Omni-directional non-visual perception for human interactions with service robots
PAGELLO, ENRICO
2006
Abstract
The development of suitable tools for human-robot interaction is very important in the field of service robotics. To this aim, this paper presents a system that allows mobile robots to interact with human beings through non-visual perception. Our approach lets a human monitor the behaviors of a group of robots over the acoustic channel: humans can understand the robots' messages simply by listening to them. Three points of this work are worth remarking. First, acoustic localization is used as the basis for the non-visual interaction. Second, non-visual perception is also used for multi-robot coordination and navigation. Third, human-robot interaction is restricted to non-visual monitoring. The location of acoustic sources is estimated by a neural network fed by a circular microphone array installed on each robot. This localization information is used to avoid collisions while the robots move. However, the localization is perturbed by uncertainties; for this reason, fuzzy rules are used to find collision-free paths for the mobile robots.
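To illustrate the kind of fuzzy-rule collision avoidance the abstract describes, the sketch below turns a (noisy) bearing and distance to an acoustic source into a steering correction. This is a minimal illustrative example, not the authors' implementation: the membership functions, rule set, and function names are all assumptions.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def avoid_steering(bearing_deg, distance_m):
    """Return a steering correction in degrees (positive = turn left),
    given the bearing and distance of an acoustic source estimated by
    the localization stage. Illustrative rules (not from the paper):
      IF source NEAR and on the RIGHT -> turn LEFT
      IF source NEAR and on the LEFT  -> turn RIGHT
      IF source FAR                   -> keep heading
    """
    near = tri(distance_m, -0.5, 0.0, 2.0)        # strong below ~2 m
    far = 1.0 - near
    right = tri(bearing_deg, -180.0, -90.0, 0.0)  # source on the right
    left = tri(bearing_deg, 0.0, 90.0, 180.0)     # source on the left

    turn_left = min(near, right)
    turn_right = min(near, left)
    keep = far

    # Weighted-average defuzzification over singleton outputs (+-45 deg).
    w = turn_left + turn_right + keep
    return (turn_left * 45.0 - turn_right * 45.0) / w if w else 0.0
```

A rule base of this shape degrades gracefully with localization uncertainty: a poorly localized (distant or ambiguous) source produces a small or zero correction rather than an abrupt maneuver.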
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.