Fusion of optical, radar and waveform LiDAR observations for land cover classification

Huiran Jin, Giorgos Mountrakis

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

Land cover is an integral component for characterizing anthropogenic activity and promoting sustainable land use. Mapping the distribution and coverage of land cover at broad spatiotemporal scales largely relies on classification of remotely sensed data. Although multi-source data fusion has recently been playing an increasingly active role in land cover classification, our intensive review of current studies shows that the integration of optical, synthetic aperture radar (SAR) and light detection and ranging (LiDAR) observations has not been thoroughly evaluated. In this research, we bridged this gap by i) summarizing related fusion studies and assessing their reported accuracy improvements, and ii) conducting our own case study where, for the first time, fusion of optical, radar and waveform LiDAR observations and the associated improvements in classification accuracy are assessed using data collected by spaceborne platforms or, in the LiDAR case, an appropriately simulated platform. Multitemporal Landsat-5/Thematic Mapper (TM) and Advanced Land Observing Satellite-1/Phased Array type L-band SAR (ALOS-1/PALSAR) imagery acquired over the Central New York (CNY) region close to the collection of airborne waveform LVIS (Land, Vegetation, and Ice Sensor) data were examined. Classification was conducted using a random forest algorithm, with different feature sets in terms of sensor and seasonality as input variables. Results indicate that the combined spectral, scattering and vertical structural information provided the maximum discriminative capability among different land cover types, giving rise to the highest overall accuracy of 83% (2–19% and 9–35% superior to the two-sensor and single-sensor scenarios, with overall accuracies of 64–81% and 48–74%, respectively). Greater improvement was achieved when combining multitemporal Landsat images with LVIS-derived canopy height metrics as opposed to PALSAR features, suggesting that LVIS contributed more useful thematic information complementary to spectral data and beneficial to the classification task, especially for vegetation classes. With the Global Ecosystem Dynamics Investigation (GEDI), a recently launched LiDAR instrument of similar properties to the LVIS sensor now operating onboard the International Space Station (ISS), it is our hope that this research will act as a literature summary and offer guidelines for further applications of multi-date and multi-type remotely sensed data fusion for improved land cover classification.
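The abstract describes the fusion workflow at a high level: per-pixel features from multitemporal Landsat/TM spectral bands, PALSAR backscatter and LVIS canopy height metrics are stacked and classified with a random forest, and single-sensor and two-sensor feature subsets are compared against the full three-sensor stack. The sketch below illustrates that comparison on synthetic data; the feature group names, class counts and scikit-learn usage are illustrative assumptions, not the authors' actual implementation or preprocessing chain.

```python
# Minimal sketch of the multi-sensor fusion comparison outlined in the abstract.
# All feature names, class labels and data are synthetic placeholders; the paper's
# real preprocessing (atmospheric correction, speckle filtering, LVIS metric
# extraction, reference sampling) is not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_pixels = 3000

# Hypothetical per-pixel feature groups for each sensor.
features = {
    "optical": rng.normal(size=(n_pixels, 12)),  # e.g., multitemporal TM bands
    "sar":     rng.normal(size=(n_pixels, 4)),   # e.g., PALSAR backscatter channels
    "lidar":   rng.normal(size=(n_pixels, 6)),   # e.g., LVIS canopy height metrics
}
labels = rng.integers(0, 6, size=n_pixels)       # e.g., six land cover classes

def evaluate(feature_names):
    """Train a random forest on the stacked feature groups and return overall accuracy."""
    X = np.hstack([features[name] for name in feature_names])
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.3, random_state=0, stratify=labels)
    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    rf.fit(X_train, y_train)
    return accuracy_score(y_test, rf.predict(X_test))

# Single-sensor, two-sensor and three-sensor scenarios, mirroring the comparison
# reported in the abstract (on real data, accuracy is expected to rise with fusion;
# the synthetic data here will not reproduce that pattern).
for scenario in [("optical",), ("sar",), ("lidar",),
                 ("optical", "sar"), ("optical", "lidar"), ("sar", "lidar"),
                 ("optical", "sar", "lidar")]:
    print("+".join(scenario), "overall accuracy:", round(evaluate(scenario), 3))
```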

Original language: English (US)
Pages (from-to): 171-190
Number of pages: 20
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Volume: 187
DOIs
State: Published - May 2022

All Science Journal Classification (ASJC) codes

  • Atomic and Molecular Physics, and Optics
  • Engineering (miscellaneous)
  • Computer Science Applications
  • Computers in Earth Sciences

Keywords

  • Accuracy
  • Fusion
  • Land cover classification
  • Optical
  • SAR
  • Waveform LiDAR
