
Reliable Features Matching for Humanoid Robots

Alberto Pretto; Emanuele Menegatti; Enrico Pagello
2007

Abstract

This paper describes a visual feature detector and descriptor scheme designed to address the specific problems of humanoid robots in the tasks of visual odometry, localization, and SLAM (Simultaneous Localization And Mapping). During walking, turning, and squatting movements, the camera of a humanoid robot moves in a jerky and sometimes unpredictable way. This introduces motion blur into the images grabbed by the robot's camera, which degrades the performance of image processing algorithms. Indeed, the classical feature detection and description techniques that proved to work so well for wheeled robots do not perform as reliably on humanoid robots. This paper presents a method to detect image interest points (invariant to scale transformations and rotations) that is robust to the motion blur introduced by the camera motion. Our approach is based on a preprocessing step that estimates the point spread function (PSF) of the motion blur. The PSF is used to deconvolve the image, reducing the blur. We then apply a feature detector inspired by the SURF approach and the feature descriptor from SIFT. Experiments performed on standard datasets corrupted with motion blur, and on images taken by a camera mounted on a small humanoid robot, show the effectiveness of the proposed technique. Our approach achieves higher performance and higher reliability in matching features across the images of a sequence affected by motion blur.
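The pipeline summarized in the abstract (estimate the motion-blur PSF, deconvolve the image, then detect features on the restored image) can be sketched as below. This is a minimal illustration, not the authors' implementation: it assumes the PSF is already known (a simple linear motion kernel, whereas estimating the PSF is part of the paper's contribution) and uses Wiener filtering as a standard non-blind deconvolution step. The names `motion_psf`, `pad_psf`, and `wiener_deconvolve` are illustrative.

```python
import numpy as np

def motion_psf(length, angle_deg, size):
    """Linear motion-blur PSF: a line segment of given length and angle,
    centered in a size x size kernel and normalized to sum to 1."""
    psf = np.zeros((size, size))
    c = size // 2
    a = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2.0, length / 2.0, 4 * length):
        x = int(round(c + t * np.cos(a)))
        y = int(round(c + t * np.sin(a)))
        psf[y, x] = 1.0
    return psf / psf.sum()

def pad_psf(psf, shape):
    """Embed the small kernel in a full-size array and roll its center to
    the origin, so FFT-based convolution introduces no spatial shift."""
    big = np.zeros(shape)
    big[:psf.shape[0], :psf.shape[1]] = psf
    return np.roll(big, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Wiener filter with a known PSF; k regularizes frequencies where the
    blur transfer function H is close to zero (noise-to-signal ratio)."""
    H = np.fft.fft2(pad_psf(psf, blurred.shape))
    G = np.fft.fft2(blurred)
    F = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F))

if __name__ == "__main__":
    # Synthetic sharp image, blurred with a known horizontal motion PSF,
    # then restored; feature detection would run on the restored image.
    img = (np.indices((64, 64)).sum(0) % 2).astype(float)
    psf = motion_psf(7, 0.0, 15)
    H = np.fft.fft2(pad_psf(psf, img.shape))
    blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
    restored = wiener_deconvolve(blurred, psf)
    print(np.abs(blurred - img).mean(), np.abs(restored - img).mean())
```

With the PSF known exactly, the restored image is much closer to the original than the blurred one; in the paper's setting the PSF must first be estimated from the blurred frame, and the deconvolved output then feeds the SURF-inspired detector and SIFT descriptor.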
Proc. of IEEE-RAS International Conference on Humanoid Robots (Humanoids 2007)
ISBN: 9781424418619
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/11577/1780202
Citations
  • PMC: ND
  • Scopus: 9
  • Web of Science (ISI): 7