Support vector machines and object-based classification for obtaining land-use/cover cartography from Hyperion hyperspectral imagery

George Petropoulos, C. Kalaitzidis, Krishna P. Vadrevu

Research output: Contribution to journal › Article › peer-review

205 Citations (Scopus)

Abstract

The Hyperion hyperspectral sensor offers high spectral resolution, acquiring spectral information on Earth's surface objects in 242 spectral bands at a spatial resolution of 30 m. In this study, we evaluate the performance of the Hyperion sensor in conjunction with two different classification algorithms for delineating land use/cover in a typical Mediterranean setting: pixel-based support vector machines (SVMs) and an object-based classification algorithm. The land-use/cover maps derived from the two algorithms were validated through error matrix statistics, using validation points obtained from very high resolution QuickBird imagery. Results suggest that both classifiers are highly useful for mapping land use/cover in the study region, with the object-based approach slightly outperforming the SVM classification in overall classification accuracy and Kappa statistics. Statistical significance testing with McNemar's chi-square test confirmed the superiority of the object-based approach over the SVM. The relative strengths and weaknesses of the two classification algorithms for land-use/cover mapping studies are highlighted. Overall, our results underline the potential of hyperspectral remote sensing data combined with an object-based classification approach for mapping land use/cover in Mediterranean regions.
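Though the article itself provides no code, a minimal sketch of the evaluation pipeline the abstract describes might look as follows: an SVM trained on per-pixel spectra, error-matrix statistics (overall accuracy and Kappa), and McNemar's chi-square test comparing two classifiers' per-pixel outcomes. All data, sample sizes, and SVM parameters here are hypothetical stand-ins, not the authors' actual configuration; only the band count (242) matches Hyperion.

```python
# Illustrative sketch only: synthetic data in place of real Hyperion spectra
# and reference labels; pred_obj stands in for an object-based result.
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 1000, 242, 5      # hypothetical sample sizes
X = rng.random((n_pixels, n_bands))              # stand-in pixel spectra
y = rng.integers(0, n_classes, n_pixels)         # stand-in land-cover labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Pixel-based SVM classifier (an RBF kernel is a common choice for spectra).
svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
pred_svm = svm.predict(X_test)

# Error-matrix statistics of the kind used to validate the derived maps.
print(confusion_matrix(y_test, pred_svm))
print("overall accuracy:", accuracy_score(y_test, pred_svm))
print("Kappa:", cohen_kappa_score(y_test, pred_svm))

# McNemar's chi-square test on the two classifiers' per-pixel agreement
# with the reference labels, with the standard continuity correction.
pred_obj = rng.integers(0, n_classes, len(y_test))  # hypothetical second map
svm_ok, obj_ok = pred_svm == y_test, pred_obj == y_test
b = np.sum(svm_ok & ~obj_ok)   # SVM correct, object-based wrong
c = np.sum(~svm_ok & obj_ok)   # object-based correct, SVM wrong
stat = (abs(b - c) - 1) ** 2 / (b + c)
print("McNemar chi-square:", stat, "p-value:", chi2.sf(stat, df=1))
```

In this setup, a small p-value would indicate that the two classifiers' disagreements are systematic rather than due to chance, which is the sense in which the study reports the object-based approach as significantly superior.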
Original language: English
Pages (from-to): 99–107
Number of pages: 8
Journal: Computers and Geosciences
Volume: 41
DOIs:
Publication status: Published - Apr 2012
