Object-based land cover classification using airborne LiDAR

A. S. Antonarakis, James Brasington, Keith S. Richards

Research output: Contribution to journal › Article › peer-review



Light Detection and Ranging (LiDAR) provides high resolution horizontal and vertical spatial point cloud data, and is increasingly being used in a number of applications and disciplines, which have concentrated on the exploitation and manipulation of the data using mainly its three-dimensional nature. The potential of LiDAR information is even greater, however, when its intensity values are also considered. Elevation and intensity airborne LiDAR data are used in this study to classify forest and ground types quickly and efficiently, without the need for manipulating multispectral image files, using a supervised object-oriented approach. LiDAR has the advantage of being able to create elevation surfaces in 3D while also recording intensity values, making it both a spatial and a spectral segmentation tool. This classification method also uses point distribution frequency criteria to differentiate between land cover types. Classifications were performed using two methods: one that included the influence of the ground in heavily vegetated areas, and one that eliminated the ground points before classification. The classification of three meanders of the Garonne and Allier rivers in France demonstrated overall classification accuracies of 95% and 94% for the methods including and excluding the ground influence, respectively. Five types of riparian forest were classified with accuracies between 66% and 98%. These forest types included planted and natural forest stands of different ages. Classifications of short vegetation and bare earth also produced high accuracies, averaging above 90%.
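The core idea described above can be illustrated with a minimal sketch: aggregate LiDAR returns (x, y, height above ground, intensity) into grid objects, derive per-object elevation and intensity features, and assign land-cover classes. The class names echo those in the study, but the 5 m cell size, the decision thresholds, and the rule-based classifier standing in for the supervised approach are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def cell_features(points, cell_size=5.0):
    """Group (x, y, height, intensity) points into grid cells and return
    {cell: (mean_height, mean_intensity)}. Cell size is an assumption."""
    cells = {}
    for x, y, z, i in points:
        key = (int(x // cell_size), int(y // cell_size))
        cells.setdefault(key, []).append((z, i))
    return {k: (np.mean([p[0] for p in v]), np.mean([p[1] for p in v]))
            for k, v in cells.items()}

def classify(mean_height, mean_intensity):
    """Toy threshold rules standing in for the supervised classifier.
    All thresholds here are hypothetical."""
    if mean_height > 5.0:
        return "forest"
    if mean_height > 0.3:
        return "short vegetation"
    if mean_intensity > 40.0:
        return "bare earth"
    return "water"

# Synthetic returns: tall canopy, grassy ground, bright bare earth.
points = [
    (1.0, 1.0, 12.0, 25.0),
    (2.0, 3.0, 10.0, 30.0),
    (7.0, 1.0, 0.6, 35.0),
    (12.0, 2.0, 0.0, 55.0),
]
labels = {k: classify(*f) for k, f in cell_features(points).items()}
```

A real workflow would replace the threshold rules with a trained classifier and add the point distribution frequency features the study mentions, but the object-level aggregation step would look much the same.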
Original language: English
Pages (from-to): 2988-2998
Number of pages: 11
Journal: Remote Sensing of Environment
Publication status: Published - 16 Jun 2008


