| Literature DB >> 22389586 |
Tim De Roeck, Tim Van de Voorde, Frank Canters.
Abstract
Since 2008, more than half of the world's population has been living in cities, and urban sprawl is continuing. Because of these developments, the mapping and monitoring of urban environments and their surroundings is becoming increasingly important. In this study, two object-oriented approaches for high-resolution mapping of sealed surfaces are compared: a standard non-hierarchic approach and a full hierarchic approach, using both multi-layer perceptrons and decision trees as learning algorithms. Both methods outperform the standard nearest-neighbour classifier, which is used as a benchmark scenario. For the multi-layer perceptron approach, applying a hierarchic classification strategy substantially increases classification accuracy. For the decision tree approach, a one-against-all hierarchic classification strategy does not improve classification accuracy compared to the standard all-against-all approach. The best results are obtained with the hierarchic multi-layer perceptron strategy, producing a kappa value of 0.77. A simple shadow reclassification procedure based on characteristics of neighbouring objects further increases the kappa value to 0.84.
Keywords: Urban mapping; decision trees; hierarchic classification; multi-layer perceptron; sealed surfaces
Year: 2009 PMID: 22389586 PMCID: PMC3280732 DOI: 10.3390/s90100022
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. False colour infrared image of the study area (bottom) showing the spatial extent of the three ground truth sites (top).
Total number of validation pixels for each target class, relative weight of each class in the validation set, and colour used for representing each class in the maps.
| Class 1 | Class 2 | Class 3 | Class 4 | Class 5 | Class 6 (shadow) | Total |
| 74,868 | 436,972 | 195,490 | 47,842 | 440,182 | 68,436 | 1,263,790 |
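The relative class weights mentioned in the caption follow directly from these pixel counts; a minimal sketch:

```python
# Validation pixel counts per class, from the table above
counts = [74868, 436972, 195490, 47842, 440182, 68436]
total = sum(counts)                    # 1,263,790 validation pixels in total
weights = [c / total for c in counts]  # relative weight of each class
```

The two largest classes together account for roughly 69% of the validation set, which is why per-class accuracies matter alongside the overall PCC.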
Figure 2. Segmentation of non-red surfaces for a subset of the study area (step two of the segmentation process).
Overview of the 29 spectral and textural variables used as input for the variable selection in each step of the DT and MLP classification scenarios.
| Variable | Description |
| Mean (per band) | average of the values in the spectral band, taken over all pixels within the segment |
| Ratio (per band) | ratio between the mean of the spectral band and the sum of the mean values in every spectral band within the segment |
| Standard deviation (per band) | standard deviation of all pixel values in the spectral band within the segment |
| GLCM angular second moment (per band) | angular second moment of the grey-level co-occurrence matrix: reflects the degree of homogeneity present in the spectral band within the segment |
| GLCM contrast (per band) | contrast of the grey-level co-occurrence matrix: reflects the contrasts present in the spectral band within the segment |
| GLCM entropy (per band) | entropy of the grey-level co-occurrence matrix: reflects the randomness in the spatial arrangement of spectral band values within the segment |
| Green/blue ratio | average of the green band divided by average of the blue band |
| Red/blue ratio | average of the red band divided by average of the blue band |
| Red/green ratio | average of the red band divided by average of the green band |
| Brightness | sum of the mean values in every spectral band |
| NDVI | (nir - red) / (nir + red) |
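All of the variables above can be computed per segment from the pixel values it contains. A minimal sketch with toy data (the band values and grey levels are hypothetical, and the GLCM is shown for a single horizontal offset only):

```python
import math

# Hypothetical pixel values for one image segment, per spectral band
segment = {
    "blue":  [30, 32, 31, 29],
    "green": [45, 48, 44, 46],
    "red":   [50, 52, 49, 51],
    "nir":   [90, 95, 88, 92],
}

def mean(vals):
    return sum(vals) / len(vals)

def stddev(vals):
    m = mean(vals)
    return math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))

means = {b: mean(v) for b, v in segment.items()}
brightness = sum(means.values())                        # sum of band means
ratios = {b: m / brightness for b, m in means.items()}  # per-band ratio
green_blue = means["green"] / means["blue"]
red_blue   = means["red"] / means["blue"]
red_green  = means["red"] / means["green"]
ndvi = (means["nir"] - means["red"]) / (means["nir"] + means["red"])

# Texture measures from the grey-level co-occurrence matrix (GLCM),
# computed on quantised grey levels within the segment
def glcm(pixels, levels, dx=1, dy=0):
    """Normalised co-occurrence matrix for one pixel offset."""
    p = [[0.0] * levels for _ in range(levels)]
    n = 0
    for y in range(len(pixels)):
        for x in range(len(pixels[0])):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < len(pixels[0]) and 0 <= y2 < len(pixels):
                p[pixels[y][x]][pixels[y2][x2]] += 1
                n += 1
    return [[v / n for v in row] for row in p]

p = glcm([[0, 0, 1], [0, 0, 1], [2, 2, 2]], levels=3)
asm      = sum(v * v for row in p for v in row)           # homogeneity
contrast = sum((i - j) ** 2 * p[i][j]
               for i in range(3) for j in range(3))       # local contrast
entropy  = -sum(v * math.log(v)                           # randomness
                for row in p for v in row if v > 0)
```

The paper lists 29 variables, which is consistent with the six per-band features over four bands (24) plus the five band-combination features.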
Figure 3. Classification strategies: two-step classification strategy (left) and full hierarchic classification strategy (right).
Selected input variables for each phase in the two-step and the full hierarchic classification strategy for DT and MLP classification. In the two-step strategy, ‘red’ stands for the red surfaces class and ‘rest’ for the other classes, which are classified in one step in this approach.
| mean nir | mean green | |
| mean blue | ratio blue | |
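The one-against-all hierarchic strategy can be thought of as a cascade of binary classifiers, each separating one class from everything that remains. A hedged sketch, where the class names and threshold rules are purely illustrative stand-ins for the paper's trained DT/MLP models:

```python
def hierarchic_classify(features, stages):
    """One-against-all cascade: each stage is (class_name, binary_test).
    The first stage that fires assigns its class; later stages only
    ever see segments rejected by all earlier ones."""
    for cls, is_cls in stages:
        if is_cls(features):
            return cls
    return "unclassified"

# Illustrative stages: peel off red surfaces first (as in the two-step
# strategy), then separate the remaining classes one by one
stages = [
    ("red surfaces", lambda f: f["red_green"] > 1.4),
    ("vegetation",   lambda f: f["ndvi"] > 0.3),
    ("sealed",       lambda f: f["brightness"] < 200),
]
```

For example, a segment with a high red/green ratio is assigned to "red surfaces" immediately and never reaches the later stages, which is what lets each stage use its own selected input variables.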
Per class user's accuracies, PCC and kappa values for the five classification scenarios.
| | NN | DT two-step | DT full hierarchic | MLP two-step | MLP full hierarchic |
| Class 1 | 0.79 | 0.89 | 0.76 | 0.89 | 0.76 |
| Class 2 | 0.95 | 0.93 | 0.93 | 0.94 | 0.93 |
| Class 3 | 0.97 | 0.99 | 0.97 | 0.99 | 0.98 |
| Class 4 | 0.46 | 0.60 | 0.65 | 0.70 | 0.83 |
| Class 5 | 0.82 | 0.86 | 0.86 | 0.84 | 0.88 |
| Class 6 (shadow) | 0.19 | 0.27 | 0.25 | 0.26 | 0.30 |
| PCC | 76.2% | 80.4% | 78.2% | 79.3% | 82.5% |
| Kappa | 0.69 | 0.74 | 0.71 | 0.73 | 0.77 |
Figure 4. Full hierarchic multi-layer perceptron classification results: original classification (left) and result after post-classification shadow re-assignment (right).
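The shadow re-assignment is described only as "based on characteristics of neighbouring objects"; one plausible reading is a majority vote over adjacent segments. A hedged sketch, where the adjacency structure and the voting rule are assumptions rather than the paper's exact procedure:

```python
from collections import Counter

def reassign_shadows(labels, neighbours):
    """Relabel each 'shadow' segment to the most frequent non-shadow
    class among its adjacent segments; keep it as shadow if it has
    no non-shadow neighbours."""
    out = dict(labels)
    for seg, cls in labels.items():
        if cls != "shadow":
            continue
        votes = Counter(labels[n] for n in neighbours.get(seg, [])
                        if labels[n] != "shadow")
        if votes:
            out[seg] = votes.most_common(1)[0][0]
    return out
```

For instance, a shadow segment surrounded by two sealed segments and one vegetation segment would be relabelled as sealed, which matches the intuition that shadow mostly falls on whatever class dominates its immediate surroundings.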
Confusion matrix for the full hierarchic multi-layer perceptron classification result before shadow reclassification.
| | Class 1 | Class 2 | Class 3 | Class 4 | Class 5 | Class 6 (shadow) | Total | UA |
| Class 1 | 64763 | 221 | 464 | 25 | 17985 | 1892 | 85350 | 0.76 |
| Class 2 | 282 | 362442 | 5458 | 2167 | 11834 | 5951 | 388134 | 0.93 |
| Class 3 | 0 | 2539 | 177895 | 0 | 323 | 615 | 181372 | 0.98 |
| Class 4 | 1375 | 2305 | 757 | 43838 | 4482 | 209 | 52966 | 0.83 |
| Class 5 | 4386 | 31051 | 1672 | 736 | 343980 | 10114 | 391939 | 0.88 |
| Class 6 (shadow) | 4062 | 38414 | 9244 | 1076 | 61578 | 49655 | 164029 | 0.30 |
| Total | 74868 | 436972 | 195490 | 47842 | 440182 | 68436 | 1263790 | |
| PA | 0.87 | 0.83 | 0.91 | 0.92 | 0.78 | 0.73 | | PCC: 82.5% |
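The PCC and kappa values reported for this scenario can be reproduced from the confusion matrix. The diagonal entries below are not printed in the record as extracted; they are inferred arithmetically as each row total minus its published off-diagonal counts, and they check out exactly against the column totals, user's and producer's accuracies:

```python
# Full hierarchic MLP confusion matrix (rows: classified, cols: reference);
# diagonals derived from the published row totals and off-diagonal counts
cm = [
    [64763,    221,    464,    25,  17985,  1892],
    [  282, 362442,   5458,  2167,  11834,  5951],
    [    0,   2539, 177895,     0,    323,   615],
    [ 1375,   2305,    757, 43838,   4482,   209],
    [ 4386,  31051,   1672,   736, 343980, 10114],
    [ 4062,  38414,   9244,  1076,  61578, 49655],
]
n = sum(map(sum, cm))
po = sum(cm[i][i] for i in range(6)) / n              # overall accuracy (PCC)
rows = [sum(r) for r in cm]
cols = [sum(cm[i][j] for i in range(6)) for j in range(6)]
pe = sum(rows[i] * cols[i] for i in range(6)) / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)                          # Cohen's kappa
```

This reproduces PCC = 82.5% and kappa = 0.77; the same computation applied to the post-reclassification matrix below yields kappa = 0.84, matching the abstract.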
Confusion matrix for the full hierarchic multi-layer perceptron classification result after shadow reclassification.
| | Class 1 | Class 2 | Class 3 | Class 4 | Class 5 | Total | UA |
| Class 1 | 64048 | 228 | 482 | 26 | 20343 | 85127 | 0.75 |
| Class 2 | 1520 | 392315 | 7198 | 9655 | 23470 | 434158 | 0.90 |
| Class 3 | 2 | 4388 | 184960 | 1018 | 1287 | 191655 | 0.97 |
| Class 4 | 1412 | 7597 | 1009 | 51436 | 11826 | 73280 | 0.70 |
| Class 5 | 7886 | 36571 | 1841 | 7748 | 425524 | 479570 | 0.89 |
| Total | 74868 | 441099 | 195490 | 69883 | 482450 | 1263790 | |
| PA | 0.86 | 0.89 | 0.95 | 0.74 | 0.88 | | PCC: 88.5% |