
Machine Learning for Classification of an Eroding Scarp Surface Using Terrestrial Photogrammetry with NIR and RGB Imagery

Pirotti F.
Supervision
2020

Abstract

Increasingly advanced and affordable close-range sensing techniques are employed by an ever-broadening range of users with varying competence and experience. In this context, a method was tested that uses photogrammetry and classification by machine learning to divide a point cloud into different surface type classes. The study site is a peat scarp 20 metres long in the actively eroding river bank of the Rotmoos valley near Obergurgl, Austria. Imagery from near-infrared (NIR) and conventional (RGB) sensors, georeferenced with coordinates of targets surveyed with a total station, was used to create a point cloud using structure from motion and dense image matching. NIR and RGB information were merged into a single point cloud, and 18 geometric features were extracted at three different radii (0.02 m, 0.05 m and 0.1 m), totalling 58 variables on which to apply the machine learning classification. Segments representing six classes (dry grass, green grass, peat, rock, snow and target) were extracted from the point cloud and split into a training set and a testing set. A Random Forest model was trained using machine learning packages from CRAN in the R environment. The overall classification accuracy and Kappa Index were 98% and 97% respectively. Rock, snow and target classes had the highest producer's and user's accuracies. Dry and green grass had the highest omission errors (1.9% and 5.6% respectively) and commission errors (3.3% and 3.4% respectively). Analysis of feature importance revealed that the spectral descriptors (NIR, R, G, B) were by far the most important determinants, followed by verticality at 0.1 m radius.
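The abstract reports overall accuracy, the Kappa Index, producer's/user's accuracies and their complementary omission/commission errors, all of which derive from a single confusion matrix. The sketch below (pure Python, not the paper's R code) shows how these quantities relate; the 3-class confusion matrix and its values are illustrative assumptions, not the paper's results.

```python
def classification_metrics(cm, labels):
    """Accuracy metrics from a confusion matrix where cm[i][j] is the
    number of points of true class i predicted as class j."""
    k = len(labels)
    n = sum(sum(row) for row in cm)
    diag = sum(cm[i][i] for i in range(k))
    overall = diag / n  # overall classification accuracy

    # Cohen's Kappa: agreement corrected for chance, where chance
    # agreement pe comes from the row and column marginals.
    row_tot = [sum(row) for row in cm]
    col_tot = [sum(cm[i][j] for i in range(k)) for j in range(k)]
    pe = sum(r * c for r, c in zip(row_tot, col_tot)) / n ** 2
    kappa = (overall - pe) / (1 - pe)

    per_class = {}
    for i, lab in enumerate(labels):
        producer = cm[i][i] / row_tot[i]  # producer's accuracy
        user = cm[i][i] / col_tot[i]      # user's accuracy
        per_class[lab] = {
            "producer": producer, "omission": 1 - producer,
            "user": user, "commission": 1 - user,
        }
    return overall, kappa, per_class

# Toy example with three of the paper's six classes (made-up counts).
labels = ["peat", "rock", "snow"]
cm = [[95, 3, 2],
      [4, 90, 6],
      [1, 2, 97]]
overall, kappa, per_class = classification_metrics(cm, labels)
print(f"overall accuracy = {overall:.2f}, kappa = {kappa:.2f}")
# -> overall accuracy = 0.94, kappa = 0.91
```

Note that the omission error of a class is simply one minus its producer's accuracy, and the commission error is one minus its user's accuracy, which is how the paired percentages in the abstract should be read.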
ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Files in this record:
Brožová et al. - Unknown - MACHINE LEARNING FOR CLASSIFICATION OF AN ERODING SCARP SURFACE USING TERRESTRIAL PHOTOGRAMMETRY WITH NIR AND.pdf (Adobe PDF, 377.42 kB)
Access: open access
Type: Published (publisher's version)
License: Creative Commons

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3378485