
Towards automatic modeling for cultural heritage applications

GUARNIERI, ALBERTO;VETTORE, ANTONIO
2003

Abstract

Recent advances in laser scanning technology, along with the availability of more powerful computing resources, have fostered a growing interest among surveyors, architects and archaeologists in laser scanners as a very promising tool for cultural heritage surveying. Thousands of points can be acquired in a few seconds with an accuracy adequate to build 3D models of single objects as well as of whole environments. At present, the resulting 3D digital models offer an invaluable means for the documentation, archiving, structural analysis and restoration of the large number of objects belonging to our historical and cultural heritage. Usually, the end products of the whole workflow (survey and modeling) are VR representations (VRML, Flash), movies (AVI, DivX, MPEG), Digital Surface Models (DSMs) and orthophotos. The creation of a 3D model requires a large amount of data about the object surface or volume, which then has to be aggregated, regardless of the data format and the acquisition device used. In most cases, the data registration step is based on ICP, which iteratively finds the mutual orientation between two range maps, starting from an initial guess provided by an operator. This approach is often time-consuming, increases the final cost of the 3D model and represents the main obstacle to the widespread adoption of real-object models. In this paper an overview of our automatic range data registration system is presented, focusing on the integration between its two main blocks. In the first block, overlapping areas between range image pairs are detected by means of spin-images and an initial approximate alignment between image pairs is computed. Then, in the second block, this estimate is refined by a cascade of two registration algorithms: the frequency-domain method and ICP.
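As a rough illustration of the refinement stage described above, the following Python sketch shows a minimal point-to-point ICP that iteratively refines an initial rigid alignment between two range maps. It is not the authors' implementation (the record gives no code details); the function names and the use of NumPy/SciPy are assumptions made only for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/Horn)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(source, target, init_R=np.eye(3), init_t=np.zeros(3),
        max_iter=50, tol=1e-6):
    """Refine an initial guess (init_R, init_t) aligning `source` onto `target`."""
    R, t = init_R, init_t
    src = (R @ source.T).T + t
    tree = cKDTree(target)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)                    # closest-point correspondences
        R_step, t_step = best_fit_transform(src, target[idx])
        src = (R_step @ src.T).T + t_step              # apply incremental transform
        R, t = R_step @ R, R_step @ t + t_step         # accumulate total transform
        err = dist.mean()
        if abs(prev_err - err) < tol:                  # stop when the mean error stalls
            break
        prev_err = err
    return R, t
```

In the pipeline sketched in the abstract, the initial guess passed to such a routine would come from the spin-image-based coarse alignment and the frequency-domain step, rather than from a manually supplied estimate.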
Use this identifier to cite or link to this document: https://hdl.handle.net/11577/1374607