ORACLE: Occlusion-Resilient and Self-Calibrating mmWave Radar Network for People Tracking

Canil, Marco (Investigation); Pegoraro, Jacopo (Investigation); Casari, Paolo (Investigation); Rossi, Michele (Investigation)
2024

Abstract

Millimeter-wave (mmWave) radar sensors are emerging as viable alternatives to cameras for pervasive, contactless monitoring of people in indoor spaces. However, commercial mmWave radars feature a limited range (up to 6–8 m) and are subject to occlusion, which may constitute a significant drawback in large, crowded rooms characterized by a challenging multipath environment. Covering large indoor spaces therefore requires multiple radars with known relative position and orientation, along with algorithms to combine their outputs. In this work, we present ORACLE, an autonomous system that: 1) automatically estimates the relative position and orientation of multiple radar devices by exploiting the trajectories of people moving freely in the radars’ common fields of view and 2) fuses the tracking information from the individual radars to obtain a unified tracking output across all sensors. Our implementation and experimental evaluation of ORACLE yield median errors of 0.12 m and 0.03° for the radar location and orientation estimates, respectively. Fused tracking improves the mean target tracking accuracy by 27%, with a mean tracking error of 23 cm in the most challenging case of three moving targets. Finally, ORACLE shows no significant performance degradation when the fusion rate is reduced down to 1/5 of the frame rate of the individual radar sensors, making it amenable to a lightweight implementation on a resource-constrained fusion center (FC).
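
To make the two steps described in the abstract concrete, the following is a minimal sketch that assumes, purely for illustration, that relative pose estimation is cast as a 2-D rigid alignment (Kabsch/Procrustes fit) of time-matched trajectory points seen by two radars, and that track fusion is a simple average in the common frame. This is not ORACLE's actual algorithm; all function names, the naive fusion step, and the synthetic example are hypothetical.

# Hypothetical sketch (not the authors' implementation): estimate the relative
# 2-D pose (rotation + translation) between two radars from matched trajectory
# points of the same person, then map one radar's track into the other's frame.
import numpy as np

def estimate_relative_pose(traj_a: np.ndarray, traj_b: np.ndarray):
    """Least-squares rigid transform mapping radar B's frame onto radar A's.

    traj_a, traj_b: (N, 2) arrays of time-aligned (x, y) positions of the same
    target, in radar A's and radar B's local coordinates respectively.
    Returns (R, t) such that traj_a is approximately traj_b @ R.T + t.
    """
    mu_a = traj_a.mean(axis=0)
    mu_b = traj_b.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (traj_b - mu_b).T @ (traj_a - mu_a)
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (det = +1), guarding against reflections.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_a - R @ mu_b
    return R, t

def fuse_tracks(track_a: np.ndarray, track_b: np.ndarray, R: np.ndarray, t: np.ndarray):
    """Naive fusion: transform radar B's track into radar A's frame and average."""
    track_b_in_a = track_b @ R.T + t
    return 0.5 * (track_a + track_b_in_a)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic ground truth: radar B rotated by 30 deg and offset by (4, 1) m.
    theta = np.deg2rad(30.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    t_true = np.array([4.0, 1.0])
    traj_a = rng.uniform(0.0, 6.0, size=(200, 2))          # trajectory in A's frame
    traj_b = (traj_a - t_true) @ R_true                     # same trajectory in B's frame
    traj_b += rng.normal(scale=0.05, size=traj_b.shape)     # measurement noise
    R_est, t_est = estimate_relative_pose(traj_a, traj_b)
    print("estimated rotation (deg):", np.rad2deg(np.arctan2(R_est[1, 0], R_est[0, 0])))
    print("estimated translation (m):", t_est)
    fused = fuse_tracks(traj_a, traj_b, R_est, t_est)
    print("mean fusion residual (m):", np.linalg.norm(fused - traj_a, axis=1).mean())

In practice, a system like the one described above would also need to associate detections to targets across radars and over time before any such alignment; the sketch simply assumes the correspondence is already known.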


Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3508712