Eetu Puttonen, Anttoni Jaakkola, Paula Litkey, Juha Hyyppä.
Abstract
Mobile laser scanning data were collected simultaneously with hyperspectral data using the Finnish Geodetic Institute Sensei system, and the data were tested for tree species classification. The test area was an urban garden in the city of Espoo, Finland. Point clouds representing 168 individual tree specimens of 23 tree species were determined manually. The trees were classified first using only the spatial data from the point clouds, then using only the spectral data obtained with the spectrometer, and finally using the combined spatial and hyperspectral data from both sensors. Two classification tests were performed: the separation of coniferous and deciduous trees, and the identification of individual tree species. All determined tree specimens were used in distinguishing coniferous and deciduous trees, while a subset of 133 trees and 10 tree species was used in the tree species classification. The best result for the fused data was 95.8% accuracy in separating the coniferous and deciduous classes, and the best overall tree species classification reached 83.5% accuracy for the best tested fused feature combination. The respective results for paired structural features derived from the laser point cloud were 90.5% for the coniferous-deciduous separation and 65.4% for the species classification; with paired hyperspectral reflectance features they were 90.5% and 62.4%. The results are among the first of their kind and show that fused data collected from a mobile platform outperformed single-sensor data in both classification tests by a significant margin.
Keywords: classification; data fusion; forestry; hyperspectrum; mobile laser scanning
Year: 2011 PMID: 22163894 PMCID: PMC3231383 DOI: 10.3390/s110505158
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The Sensei measurement system in its mobile mapping configuration, mounted on a car. The sensors are as follows: (A) a Specim V10H line spectrometer and a mirror for viewing the Spectralon™ reference panel (not shown in the figure); (B) an AVT Pike F-421C CCD camera (not used in this study); (C) a NovAtel 702 GG GPS receiver; (D) an Ibeo Lux laser scanner; (E) a NovAtel SPAN-CPT inertial measurement unit.
Figure 2. An indicative top-down schematic of the measurement geometry and principle of the Ibeo Lux laser scanner (not to scale). The different colours show the different layers of the laser beam and the points measured by these layers. The colours in the figure are not related to the hyperspectral data.
Figure 3. The location of the Vanttila test area is shown on a map of Finland (a) and an overview of the test area is presented in (b). The overview image is drawn from the measured point cloud. The red and magenta objects are the determined trees used in the classifications: magenta objects were included both in the tree species classification and in the coniferous-deciduous tree separation, while red objects were used only in the coniferous-deciduous tree separation. Blue areas represent the rest of the data (the map of Finland was retrieved from Wikipedia, created by user Care).
Figure 4. Data fusion process. (a) A part of the laser point cloud presenting a single tree specimen (Sorbus hybrida) set at the origin. (b) The same tree specimen after manual determination: the blue points represent the situation after 2D determination and the red points the outcome after an accurate 3D determination. These points were used to derive the height statistics of the tree specimen used in the LiDAR-derived feature classification (Section 4.1). (c) A fused point cloud. The overlap between each determined laser point and the hyperspectral pixels was tested, and every overlapping laser point was given an individual colour spectrum. The average of all mapped spectra was used in the hyperspectral classification (Section 4.2).
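The pixel-overlap and spectrum-averaging step described in (c) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the nearest-neighbour matching, the ground-projected pixel centres, and the 0.1 m tolerance are all assumptions.

```python
# Hedged sketch of the fusion step in Figure 4(c). Assumes the laser points
# are already georeferenced into the same frame as the hyperspectral pixels;
# the matching tolerance below is an illustrative assumption.
import numpy as np

def fuse_point_cloud_with_spectra(points, pixel_centers, pixel_spectra, tol=0.1):
    """Assign each laser point the spectrum of the nearest overlapping
    hyperspectral pixel, then return the mean spectrum over matched points.

    points:        (N, 3) laser points (x, y, z) of one tree specimen
    pixel_centers: (M, 2) ground-projected pixel centres (x, y)
    pixel_spectra: (M, B) reflectance spectra with B spectral bands
    """
    matched = []
    for p in points:
        d = np.linalg.norm(pixel_centers - p[:2], axis=1)
        i = np.argmin(d)
        if d[i] <= tol:                      # point overlaps a pixel footprint
            matched.append(pixel_spectra[i])
    if not matched:
        return None                          # no overlap found for this tree
    return np.mean(matched, axis=0)          # average spectrum used in classification
```

The averaged spectrum plays the same role as the per-tree mean spectrum described in the caption: one spectral feature vector per determined tree specimen.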
Classified tree species and their specimen counts. All listed tree specimens were used in the coniferous-deciduous tree separation. The tree species with the specimen count in bold were also used in the individual tree species classification. Five of the specimens labelled as unidentified were deciduous and two were coniferous.
| Tree species | Specimens |
| --- | --- |
| Finnish Whitebeam | |
| Swedish Whitebeam | |
| European Rowan | |
| Common Whitebeam | |
| American Mountain-ash | |
| Pedunculate Oak | |
| Norway Maple | 4 |
| Apple | 3 |
| Hungarian Lilac | 1 |
| Common Alder | 4 |
| Camperdown Elm | |
| Crack Willow | |
| Colorado Blue Spruce | |
| Black Spruce | 4 |
| White Fir | 2 |
| Siberian Fir | |
| Balsam Fir | 2 |
| Common Juniper | 2 |
| European Yew | 2 |
| Northern Whitecedar | 1 |
| Common Douglas-fir | 1 |
| Silver Birch | 1 |
| Scots Pine | 1 |
| Unidentified tree species | 7 |
| Total number of trees | 168 |
LiDAR-derived tree point cloud height distribution features and their descriptions.
| PR, hN < 0.33 | PR, hN > 0.2 | hq 30 |
| PR, 0.33 < hN < 0.67 | PR, hN > 0.3 | hq 40 |
| PR, hN > 0.67 | PR, hN > 0.4 | hq 50 |
| PR, 0.1 < hN < 0.2 | PR, hN > 0.5 | hq 60 |
| PR, 0.2 < hN < 0.3 | PR, hN > 0.6 | hq 70 |
| PR, 0.3 < hN < 0.4 | PR, hN > 0.7 | hq 80 |
| PR, 0.4 < hN < 0.5 | PR, hN > 0.8 | hq 90 |
| PR, 0.5 < hN < 0.6 | PR, hN > 0.9 | Max |
| PR, 0.6 < hN < 0.7 | Skewness | Mean |
| PR, 0.7 < hN < 0.8 | Kurtosis | CV |
| PR, 0.8 < hN < 0.9 | hq 10 | |
| PR, hN > 0.1 | hq 20 | |
PR(hN) = proportion of laser hits within the shown normalized height interval of a tree specimen; Skewness = skewness of the height distribution of a tree specimen point cloud; Kurtosis = kurtosis of the height distribution of a tree specimen point cloud; hq n = n-th height quantile in percent, measured from the base of the tree; Max = maximum height of the laser hits in a tree specimen; Mean = mean height of the laser hits in a tree specimen; CV = coefficient of variation of the laser hit heights in a tree specimen.
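As an illustration, the height-distribution features listed above can be computed from a single tree's point heights roughly as follows. This is a sketch, not the authors' implementation: normalizing by the maximum point height and using population moments for skewness and kurtosis are assumptions, since the exact definitions are not given here.

```python
# Illustrative computation of the LiDAR-derived height features in the table
# above (interval proportions PR, height quantiles hq, skewness, kurtosis,
# max, mean, and coefficient of variation).
import numpy as np

def height_features(z):
    """Height-distribution features for one tree's point heights z (metres above the base)."""
    z = np.asarray(z, dtype=float)
    h_n = z / z.max()                    # normalized heights in [0, 1] (assumed normalization)
    mu, sigma = z.mean(), z.std()
    feats = {
        "PR, hN < 0.33": np.mean(h_n < 0.33),                     # share of hits in lower third
        "PR, 0.33 < hN < 0.67": np.mean((h_n > 0.33) & (h_n < 0.67)),
        "PR, hN > 0.67": np.mean(h_n > 0.67),
        "Skewness": np.mean((z - mu) ** 3) / sigma ** 3,          # population skewness
        "Kurtosis": np.mean((z - mu) ** 4) / sigma ** 4 - 3.0,    # excess kurtosis
        "Max": z.max(),
        "Mean": mu,
        "CV": sigma / mu,                                         # coefficient of variation
    }
    for q in range(10, 100, 10):                                  # height quantiles hq10 ... hq90
        feats[f"hq{q}"] = np.percentile(z, q)
    return feats
```

The remaining PR intervals in the table follow the same pattern and are omitted for brevity.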
The mean coniferous-deciduous tree separation results of the selected LiDAR-derived and hyperspectral feature pairs, and of all the feature quadruples formed from them. The number of LiDAR-derived feature pairs was 73, the number of hyperspectral feature pairs was 754, and the number of selected feature quadruples was 55,042.
| LiDAR-derived pair | | Hyperspectral pair | | Fused quadruple | |
| Mean (%) | Std | Mean (%) | Std | Mean (%) | Std |
| 92.3 | 2.7 | 93.9 | 2.0 | 94.1 | 1.8 |
| 75.0 | 5.8 | 68.9 | 4.8 | 83.6 | 3.6 |
| --- | --- | --- | --- | --- | --- |
| 86.9 | 3.6 | 86.2 | 2.9 | 90.9 | 2.4 |
The average species classification results of the selected LiDAR-derived and hyperspectral feature pairs, and of all the feature quadruples formed from them. The species indexing is given in Table 1. The number of LiDAR-derived feature pairs was 73, the number of hyperspectral feature pairs was 786, and the number of selected feature quadruples was 48,732.
| LiDAR-derived pair | | Hyperspectral pair | | Fused quadruple | |
| Mean (%) | Std | Mean (%) | Std | Mean (%) | Std |
| 22.6 | 20.7 | 33.6 | 22.8 | 77.7 | 12.6 | |
| 0.0 | 0.0 | 48.2 | 6.8 | 53.0 | 11.0 | |
| 39.5 | 11.3 | 60.4 | 12.1 | 57.2 | 9.5 | |
| 12.9 | 13.7 | 25.6 | 18.9 | 14.8 | 17.5 | |
| 83.9 | 6.1 | 37.0 | 14.5 | 84.3 | 7.0 | |
| 83.6 | 4.3 | 93.3 | 6.7 | 98.3 | 3.4 | |
| 45.2 | 33.4 | 28.8 | 19.3 | 34.1 | 29.0 | |
| 61.3 | 13.5 | 77.2 | 15.7 | 70.2 | 16.5 | |
| 16.1 | 16.5 | 0.0 | 0.7 | 17.3 | 15.6 | |
| 93.2 | 4.9 | 72.0 | 6.4 | 95.1 | 3.3 | |
| --- | --- | --- | --- | --- | --- |
| 61.0 | 9.1 | 56.7 | 11.2 | 72.2 | 9.0 | |
The error matrix of the species-wise classification result of the best selected feature quadruple (the LiDAR-derived features were the 90% height quantile (hq90) and the mean height of a single tree specimen, and the hyperspectral features were the channels centered at 428 nm and 982 nm). Bolded numbers on the diagonal are the numbers of correctly classified tree specimens. All classification accuracies are given in percent.
| 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 71.4 | ||
| 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 71.4 | ||
| 0 | 1 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 75.0 | ||
| 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 62.5 | ||
| 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 95.7 | ||
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 100.0 | ||
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 75.0 | ||
| 0 | 0 | 0 | 0 | 0 | 0 | 2 | 1 | 0 | 72.7 | ||
| 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 50.0 | ||
| 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 2 | 87.9 | ||
| 83.3 | 62.5 | 71.4 | 55.6 | 95.7 | 100.0 | 60.0 | 100.0 | 20.0 | 96.7 | ||
Classification accuracy comparison between the results of a Linear Discriminant Analysis (LDA) classifier and LibSVM. The best result of each case is reported.
| | LDA (%) | LibSVM (%) |
| Coniferous-deciduous separation | | |
| LiDAR-derived feature pair | 86.3 | 90.5 |
| Hyperspectral feature pair | 81.0 | 90.5 |
| Fused feature quadruple | 94.1 | 95.8 |
| Tree species classification | | |
| LiDAR-derived feature pair | 54.9 | 65.4 |
| Hyperspectral feature pair | 54.1 | 62.4 |
| Fused feature quadruple | 79.7 | 83.5 |
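A minimal sketch of such an LDA-versus-LibSVM accuracy comparison, using scikit-learn (whose SVC wraps LibSVM) on synthetic stand-in data; the feature values, class sizes, and five-fold cross-validation below are illustrative assumptions, not the authors' exact protocol.

```python
# Compare a Linear Discriminant Analysis classifier against an SVM (LibSVM
# backend via scikit-learn's SVC) on synthetic stand-in data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def compare_classifiers(X, y, cv=5):
    """Return the mean cross-validated accuracy of LDA and an RBF-kernel SVM."""
    classifiers = {"LDA": LinearDiscriminantAnalysis(),
                   "SVM": SVC(kernel="rbf", C=1.0)}
    return {name: cross_val_score(clf, X, y, cv=cv).mean()
            for name, clf in classifiers.items()}

# Synthetic stand-in for a fused feature quadruple (2 structural + 2 spectral
# features) with two classes, e.g. coniferous vs deciduous.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (60, 4)), rng.normal(1.5, 1.0, (60, 4))])
y = np.repeat([0, 1], 60)
print(compare_classifiers(X, y))
```

With real per-tree features, the same loop reproduces the structure of the comparison above: one accuracy per classifier per feature set.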
Figure 5. Classification accuracy comparison between different four-feature classification parameter sets. Bars A and B were obtained from the feature quadruples formed in Section 4.3: bar A shows the best classification result of the feature quadruples, while bar B represents their average classification result and its standard deviation (see Table 4). Bar C is the overall classification result obtained with four forward-selected hyperspectral features. Bar D is the overall classification result of four forward-selected shape features. Bar E is the best overall classification result obtained with two forward-selected LiDAR-derived shape features and two hyperspectral features.