Edge-preserving interpolation of depth data exploiting color information

Dal Mutto, Carlo; Zanuttigh, Pietro; Cortelazzo, Guido Maria
2013

Abstract

The extraction of depth information associated with dynamic scenes is an intriguing topic, because of its prospective role in many applications, including free-viewpoint and 3D video systems. Time-of-flight (ToF) range cameras allow the acquisition of depth maps at video rate, but they are characterized by a limited resolution, especially when compared with standard color cameras. This paper presents a super-resolution method for depth maps that exploits side information from a standard color camera: the proposed method uses a segmented version of the high-resolution color image acquired by the color camera to identify the main objects in the scene, and a novel surface prediction scheme to interpolate the depth samples provided by the ToF camera. Effective solutions are provided for critical issues such as the joint calibration of the two devices and the unreliability of the acquired data. Experimental results on both synthetic and real-world scenes show that the proposed method yields more accurate interpolation than standard interpolation approaches and state-of-the-art joint depth and color interpolation schemes.
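
To make the idea behind the abstract concrete, below is a minimal Python sketch of segmentation-guided depth upsampling: sparse ToF depth samples are interpolated only across high-resolution pixels that the color segmentation assigns to the same object, so depth edges follow object boundaries. This is not the paper's surface prediction scheme; the function name `upsample_depth_with_segments`, the same-segment averaging rule, and the fixed search window are illustrative assumptions, and the code presumes the low-resolution depth map is already registered to the high-resolution segmentation (the joint calibration issue the paper addresses).

```python
import numpy as np


def upsample_depth_with_segments(depth_lr, labels_hr, scale):
    """Edge-preserving upsampling of a low-resolution depth map (illustrative sketch).

    depth_lr  : (h, w) float array of depth samples (e.g. from a ToF camera).
    labels_hr : (h*scale, w*scale) int array of segment labels computed on the
                high-resolution color image, assumed registered to the depth map.
    scale     : integer upsampling factor.

    Each high-resolution pixel receives the average of the nearby low-resolution
    depth samples that fall in the same color segment, so interpolated depth
    edges follow the segment (object) boundaries instead of being blurred.
    """
    h, w = depth_lr.shape
    H, W = labels_hr.shape
    assert H == h * scale and W == w * scale, "label map must be depth_lr scaled"

    # High-resolution position and segment label of every low-resolution sample.
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    sample_labels = labels_hr[ys * scale + scale // 2, xs * scale + scale // 2]

    depth_hr = np.zeros((H, W), dtype=np.float64)
    r = 2  # neighbourhood radius, in low-resolution samples
    for y in range(H):
        for x in range(W):
            # Low-resolution window around the pixel's nearest depth sample.
            cy, cx = y // scale, x // scale
            y0, y1 = max(0, cy - r), min(h, cy + r + 1)
            x0, x1 = max(0, cx - r), min(w, cx + r + 1)
            d = depth_lr[y0:y1, x0:x1]
            same = sample_labels[y0:y1, x0:x1] == labels_hr[y, x]
            # Average only same-segment samples; fall back to all samples if none.
            depth_hr[y, x] = d[same].mean() if same.any() else d.mean()
    return depth_hr


if __name__ == "__main__":
    # Tiny synthetic example: two planar regions separated by a depth edge.
    labels_hr = np.zeros((32, 32), dtype=int)
    labels_hr[:, 16:] = 1                      # segmentation of the "color" image
    depth_lr = np.where(np.arange(8)[None, :] < 4, 1.0, 3.0) * np.ones((8, 1))
    depth_hr = upsample_depth_with_segments(depth_lr, labels_hr, scale=4)
    print(depth_hr[0, 14:18])                  # sharp transition at the segment edge
```

In practice the label map would come from any color segmentation (for example mean-shift or superpixels); the sketch only illustrates how segment membership can gate the interpolation, not how the paper predicts surface shape within each segment.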
Files in this product:
No files are associated with this product.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/2686682
Citations
  • PMC: not available
  • Scopus: 7
  • Web of Science: 5