Giuseppe Riccardo Leone, Davide Moroni, Gabriele Pieri, Matteo Petracca, Ovidio Salvetti, Andrea Azzarà, Francesco Marino.
Abstract
Smart cities demand solutions for improved traffic efficiency, in order to guarantee optimal access to the mobility resources available in urban areas. Intelligent video analytics deployed directly on board embedded sensors offers great opportunities to gather highly informative data about traffic and transport, allowing the reconstruction of a neat real-time picture of urban mobility patterns. In this paper, we present a visual sensor network in which each node embeds computer vision logic for analyzing urban traffic in real time. The nodes in the network share their perceptions and build a global, comprehensive interpretation of the analyzed scenes in a cooperative and adaptive fashion. This is made possible by a specially designed Internet of Things (IoT) compliant middleware that encompasses in-network event composition as well as full support for Machine-to-Machine (M2M) communication mechanisms. The potential of the proposed cooperative visual sensor network is shown with two sample applications in urban mobility, connected to the estimation of vehicular flows and to parking management. Besides providing detailed results for each key component of the proposed solution, the validity of the approach is demonstrated by extensive field tests, which proved the suitability of the system in providing a scalable, adaptable and extensible data collection layer for managing and understanding mobility in smart cities.
Keywords: IoT middleware; embedded vision; intelligent transportation systems; internet of things; real time image processing; smart cities; visual sensor networks
Year: 2017 PMID: 29125535 PMCID: PMC5713638 DOI: 10.3390/s17112588
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. System architecture.
Figure 2. (A) RoIs for a set of parking lots are set up manually with the help of a graphic tool; small rectangles on the driveway define the samples for asphalt detection. (B) Output of the Canny edge detector. (C) White regions represent areas where asphalt is detected. (D) Current input image with augmented-reality overlay displaying the status. (E) Background image. (F) Frame differencing to detect major status changes.
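The caption above outlines the per-slot vision pipeline: edge content and luminance inside each parking RoI are compared against asphalt samples taken on the driveway. A minimal sketch of such a test follows, assuming OpenCV; the function name, thresholds, and the exact way edge density and asphalt luminance are combined are illustrative assumptions, not the authors' parameters.

```python
import cv2
import numpy as np

def roi_is_occupied(gray_frame, roi, asphalt_samples,
                    edge_thresh=0.08, lum_thresh=25):
    """roi and asphalt_samples are (x, y, w, h) rectangles (Figure 2A)."""
    x, y, w, h = roi
    patch = gray_frame[y:y + h, x:x + w]

    # (B) Canny edges: an empty slot is mostly smooth asphalt, so a
    # high edge density inside the RoI hints at a parked vehicle.
    edges = cv2.Canny(patch, 50, 150)
    edge_density = np.count_nonzero(edges) / edges.size

    # (C) Mean luminance of the driveway samples gives a reference for
    # "visible asphalt" that tracks global illumination changes.
    asphalt_mean = np.mean([
        gray_frame[sy:sy + sh, sx:sx + sw].mean()
        for sx, sy, sw, sh in asphalt_samples
    ])
    lum_gap = abs(float(patch.mean()) - asphalt_mean)

    return edge_density > edge_thresh or lum_gap > lum_thresh
```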
Figure 3. Flow chart of the traffic flow monitoring algorithm.
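Figure 3 summarizes traffic flow monitoring as a flow chart. A compact sketch of one plausible realization follows, assuming OpenCV's MOG2 background subtractor and a single detection RoI per lane; the busy/free hysteresis and the busy_ratio threshold are illustrative assumptions rather than the paper's exact algorithm.

```python
import cv2
import numpy as np

def count_transits(frames, roi, busy_ratio=0.2):
    """Count vehicle transits through one detection RoI (x, y, w, h)."""
    x, y, w, h = roi
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    transits, was_busy = 0, False

    for frame in frames:
        fg = subtractor.apply(frame)          # foreground mask
        fg_roi = fg[y:y + h, x:x + w]
        busy = np.count_nonzero(fg_roi) / fg_roi.size > busy_ratio

        # A falling edge (busy -> free) marks one completed transit.
        if was_busy and not busy:
            transits += 1
        was_busy = busy

    return transits
```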
Figure 4. Middleware architecture.
Figure 5. T-Res interface.
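T-Res exposes in-network processing tasks as CoAP resources that can be reconfigured at run time. The sketch below shows how a task might be wired up from a host machine, assuming the task sub-resources for input sources ("is"), processing function ("pf") and output destinations ("od") described in the T-Res literature; the node addresses, resource names and payloads are hypothetical.

```python
import asyncio
from pathlib import Path
from aiocoap import Context, Message, PUT

NODE = "coap://[fd00::1]"  # hypothetical sensor node address

async def configure_task():
    ctx = await Context.create_client_context()
    steps = [
        # input source: the resource the task observes
        (f"{NODE}/tasks/parking/is", b"coap://[fd00::2]/camera/slots"),
        # processing function: compiled script executed in-node
        (f"{NODE}/tasks/parking/pf", Path("parking_pf.pyc").read_bytes()),
        # output destination: where result events are pushed
        (f"{NODE}/tasks/parking/od", b"coap://[fd00::3]/gw/events"),
    ]
    for uri, payload in steps:
        resp = await ctx.request(Message(code=PUT, uri=uri,
                                         payload=payload)).response
        print(uri, "->", resp.code)

asyncio.run(configure_task())
```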
Figure 6. Example of parking lot analysis: background model at time t (a) and real-time output at time t (b).
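Figure 6 pairs a background model with the live output. A minimal sketch of running-average background maintenance with change masking follows, assuming OpenCV's accumulateWeighted; the learning rate and change threshold are illustrative assumptions.

```python
import cv2
import numpy as np

def update_background(background, frame, alpha=0.02, change_thresh=40):
    """background: float32 model; frame: current grayscale frame (uint8)."""
    # (F) Frame differencing against the model flags major status changes.
    diff = cv2.absdiff(frame, cv2.convertScaleAbs(background))
    changed = diff > change_thresh

    # Only stable pixels are folded into the running average, so a
    # newly parked car is not absorbed into the background too quickly.
    mask = np.where(changed, 0, 255).astype(np.uint8)
    cv2.accumulateWeighted(frame.astype(np.float32), background,
                           alpha, mask=mask)
    return background, changed
```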
Performance of parking lot monitoring for 5 separate devices.
| Monitored Slots | Total Frames | False Hit Events | Missed Events | Total FP | Total FN | Error Rate |
|---|---|---|---|---|---|---|
| 23 | 5357 | 10 | 24 | 238 | 594 | |
| 22 | 5145 | 8 | 22 | 285 | 693 | |
| 17 | 5260 | 7 | 14 | 156 | 396 | |
| 16 | 5225 | 6 | 8 | 222 | 269 | |
| 15 | 5305 | 6 | 5 | 211 | 197 | |
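The Error Rate cells above are blank in this record. As a worked example only, the snippet below recomputes a per-device rate under the assumption that error rate = (FP + FN) / (monitored slots × total frames), i.e. errors over all per-slot observations; this definition is an assumption, not the paper's stated formula.

```python
devices = [  # (monitored slots, total frames, total FP, total FN)
    (23, 5357, 238, 594),
    (22, 5145, 285, 693),
    (17, 5260, 156, 396),
    (16, 5225, 222, 269),
    (15, 5305, 211, 197),
]
for slots, frames, fp, fn in devices:
    # Assumed definition: errors over all per-slot observations.
    rate = 100.0 * (fp + fn) / (slots * frames)
    print(f"{slots} slots: {rate:.2f}% error")
```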
Comparison of related work.
| Reference | Error Rate (%) | Features |
|---|---|---|
| Wu et al., 2007 | | |
| Sastre et al., 2007 | | |
| Bong et al., 2008 | | |
| Ichihashi et al., 2009 | | |
| Huang and Wang, 2010 | | |
| de Almeida et al., 2015 | | |
| Amato et al., 2016 | | |
| Proposed method | | |
Figure 7. Traffic flow analysis: view from the sensor test set-up and an example of vehicle transit in the sensor's field of view, which may cause occlusion of the upper lane.
Figure 8. Traffic flow analysis: detected vehicle from the sensor test set-up (a) and the same frame processed with the RoIs highlighted (b).
Classification performance of the traffic flow monitoring.
| | Total | Lower Lane | Upper Lane |
|---|---|---|---|
| Vehicle transits (ground truth) | 124 | 70 | 54 |
| Correctly detected | 118 (95.2%) | 69 (98.6%) | 49 (90.7%) |
| False positives | 3 (2.4%) | 1 (1.4%) | 2 (4%) |
Example of classification with respect to length (ℓ). Analysis performed on lower-lane data.
| Length Class | Ground Truth | Correct Class. | False Positive | Efficiency |
|---|---|---|---|---|
| | 8 | 8 | 0 | 100% |
| | 57 | 56 | 1 | 98.2% |
| | 5 | 5 | 0 | 100% |
| Total | 70 | 69 | 1 | 98.6% |
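The length-class boundaries were lost from the first column of this record. A purely illustrative classifier with hypothetical 5 m and 10 m thresholds shows how a measured length ℓ could be mapped onto the three classes above.

```python
def length_class(length_m: float) -> str:
    """Hypothetical thresholds: the paper's actual class boundaries
    are not preserved in this record."""
    if length_m < 5.0:
        return "short"
    if length_m < 10.0:
        return "medium"
    return "long"
```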
Comparison of performances and computational power for different algorithms.
| Algorithm | Performance | Hardware/Processing Notes |
|---|---|---|
| | 600 | |
| | 454 | |
Figure 9. Experimental setup in the laboratory testbed.
Event notification latency.
| Message Size [Bytes] | Number of Messages | Sensor to GSCL [ms] |
|---|---|---|
| 104 | 2 | 176.01 ± 0.27 |
| 155 | 3 | 227.79 ± 0.42 |
| 228 | 5 | 287.06 ± 0.38 |
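For reproducing measurements like those above, a simplified sketch using aiocoap follows. It times CoAP round trips from the measuring host, whereas the table reports one-way sensor-to-GSCL notification latency (which requires synchronized clocks); the URI and sample count are hypothetical.

```python
import asyncio
import time
from aiocoap import Context, Message, GET

async def measure(uri="coap://[fd00::10]/gscl/events", n=100):
    ctx = await Context.create_client_context()
    samples = []
    for _ in range(n):
        t0 = time.monotonic()
        # Round-trip time of one CoAP request, in milliseconds.
        await ctx.request(Message(code=GET, uri=uri)).response
        samples.append((time.monotonic() - t0) * 1000.0)
    mean = sum(samples) / len(samples)
    print(f"mean round-trip: {mean:.2f} ms over {n} requests")

asyncio.run(measure())
```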
Figure 10. Picture showing the final field test installation.
Cooperative monitoring results.
| | CAM A | CAM B | COOP. |
|---|---|---|---|
| Errors | 4870 | 3674 | 804 |
| % | | | |
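A minimal sketch of per-slot decision fusion consistent with the error reduction above: each camera reports an occupancy confidence for the slots it sees, and overlapping views are combined so that single-view errors (occlusions, glare) can be outvoted. The weighted-average rule and thresholds are illustrative assumptions, not the paper's exact fusion logic.

```python
def fuse_slot(conf_a, conf_b, w_a=0.5, w_b=0.5, occupied_thresh=0.5):
    """conf_*: occupancy confidence in [0, 1] from each camera, or
    None when the slot is outside that camera's field of view."""
    views = [(c, w) for c, w in ((conf_a, w_a), (conf_b, w_b))
             if c is not None]
    if not views:
        raise ValueError("slot not visible from any camera")
    # Confidence-weighted average over the cameras that see the slot.
    score = sum(c * w for c, w in views) / sum(w for _, w in views)
    return score > occupied_thresh
```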
Figure 11. Percentage of parking occupancy on December 14.
Figure 12. Aggregated vehicles-per-hour data by time slot for the last three weeks of December 2015 (reduced for better readability).