Mostafa Elhashash, Hessah Albanwan, Rongjun Qin.
Abstract
The evolution of mobile mapping systems (MMSs) has gained increasing attention in the past few decades. MMSs have been widely used to provide valuable assets in different applications. This has been facilitated by the wide availability of low-cost sensors, advances in computational resources, the maturity of mapping algorithms, and the need for accurate and on-demand geographic information system (GIS) data and digital maps. Many MMSs combine hybrid sensors that complement each other to provide a more informative, robust, and stable solution. In this paper, we present a comprehensive review of modern MMSs, focusing on: (1) the types of sensors and platforms, discussing their capabilities and limitations and providing a comprehensive overview of recent MMS technologies available on the market; (2) the general workflow for processing MMS data; (3) different use cases of mobile mapping technology, reviewing some of the common applications; and (4) a discussion of the benefits and challenges, sharing our views on potential research directions.
Keywords: LiDAR; mobile mapping; positioning
Year: 2022 PMID: 35684883 PMCID: PMC9185250 DOI: 10.3390/s22114262
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. An example of an MMS: a vehicle-mounted mobile mapping platform consisting of different positioning and data collection sensors to generate an accurate georeferenced 3D map of the environment. Shown here are the main sensors of the Leica Pegasus: Two Ultimate as an example. Photo courtesy of Leica Geosystems [13].
Positioning sensors overview.
| Sensor | Description | Benefits | Limitations |
|---|---|---|---|
| GNSS | The GNSS receiver uses signals from orbiting satellites to compute position, velocity, and elevation. Examples include GPS, GLONASS, Galileo, and BeiDou. | No/little accumulation of errors due to its dependence on external signals. Data are collected in a global reference coordinate system (e.g., WGS84). | Signal inaccessible in complex urban regions, e.g., near tall buildings, trees, tunnels, and in indoor environments. Requires post-processing using DGPS and RTK-GPS to minimize errors from receiver noise, pseudo-range, carrier phase, Doppler shifts, atmospheric delays, etc. |
| IMU | An egocentric sensor that records the relative orientation and directional accelerations of the host platform. | Capable of navigating in all environments, such as indoors, outdoors, tunnels, and caves. A necessary supplemental data source in urban environments where GPS is unstable. | Requires consistent calibration and an external reference to avoid drift from the true position. Limited to short-range navigation. |
| DMI | A supplementary positioning sensor that measures the distance traveled by the platform, i.e., information derived from a speedometer. | Provides additional data points to alleviate the accumulated errors of IMU sensors. | Requires calibration and provides only distance information (1 degree of freedom). |
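The IMU drift noted in the table can be made concrete with a short sketch. The following is our illustrative example (not from the paper): a constant accelerometer bias, double-integrated in a dead-reckoning loop, produces a position error that grows roughly quadratically with time, which is why IMU-only navigation is limited to short ranges and benefits from GNSS or DMI corrections.

```python
def dead_reckoning_drift(bias_mps2: float, duration_s: float, dt: float = 0.01) -> float:
    """Double-integrate a constant accelerometer bias with Euler steps.

    The position error from an uncorrected bias b grows as ~0.5 * b * t**2.
    """
    velocity = 0.0
    position = 0.0
    for _ in range(round(duration_s / dt)):
        velocity += bias_mps2 * dt   # bias integrates into a velocity error...
        position += velocity * dt    # ...which integrates into position drift
    return position

# A modest (hypothetical) 0.01 m/s^2 bias after only 60 s of dead reckoning:
print(f"{dead_reckoning_drift(0.01, 60.0):.1f} m drift")  # roughly 18 m
```

This quadratic growth is the reason the table pairs the IMU with GNSS outdoors and a DMI for additional distance constraints.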
Specifications of different LiDAR sensors.
| Company | Model | Range (m) | Range Accuracy (cm) | Number of Beams | Horizontal FoV (°) | Vertical FoV (°) | Horizontal Resolution (°) | Vertical Resolution (°) | Points Per Second | Refresh Rate (Hz) |
|---|---|---|---|---|---|---|---|---|---|---|
| RIEGL | VQ-250 | 1.5–500 | 0.1 | — | 360 | — | — | — | 300,000 | — |
| RIEGL | VQ-450 | 1.5–800 | 0.8 | — | 360 | — | — | — | 550,000 | — |
| Trimble | MX50 laser scanner | 0.6–80 | 0.2 | — | 360 | — | — | — | 960,000 | — |
| Trimble | MX9 laser scanner | 1.2–420 | 0.5 | — | 360 | — | — | — | 1,000,000 | — |
| Velodyne | HDL-64E | 120 | ±2 | 64 | 360 | 26.9 | 0.08 to 0.35 | 0.4 | 1,300,000 | 5 to 20 |
| Velodyne | HDL-32E | 100 | ±2 | 32 | 360 | 41.33 | 0.08 to 0.33 | 1.33 | 695,000 | 5 to 20 |
| Velodyne | Puck | 100 | ±3 | 16 | 360 | 30 | 0.1 to 0.4 | 2.0 | 300,000 | 5 to 20 |
| Velodyne | Puck LITE | 100 | ±3 | 16 | 360 | 30 | 0.1 to 0.4 | 2.0 | 300,000 | 5 to 20 |
| Velodyne | Puck Hi-Res | 100 | ±3 | 16 | 360 | 20 | 0.1 to 0.4 | 1.33 | 300,000 | 5 to 20 |
| Velodyne | Puck 32MR | 120 | ±3 | 32 | 360 | 40 | 0.1 to 0.4 | 0.33 (min) | 600,000 | 5 to 20 |
| Velodyne | Ultra Puck | 200 | ±3 | 32 | 360 | 40 | 0.1 to 0.4 | 0.33 (min) | 600,000 | 5 to 20 |
| Velodyne | Alpha Prime | 245 | ±3 | 128 | 360 | 40 | 0.1 to 0.4 | 0.11 (min) | 2,400,000 | 5 to 20 |
| Ouster | OS2-32 | 1 to 240 | ±2.5 to ±8 | 32 | 360 | 22.5 | 0.18 | 0.7 | 655,000 | 10, 20 |
| Ouster | OS2-64 | 1 to 240 | ±2.5 to ±8 | 64 | 360 | 22.5 | 0.18 | 0.36 | 1,311,000 | 10, 20 |
| Ouster | OS2-128 | 1 to 240 | ±2.5 to ±8 | 128 | 360 | 22.5 | 0.18 | 0.18 | 2,621,000 | 10, 20 |
| Hesai | PandarQT | 0.1 to 60 | ±3 | 64 | 360 | 104.2 | 0.6 | 1.45 | 384,000 | 10 |
| Hesai | PandarXT | 0.05 to 120 | ±1 | 32 | 360 | 31 | 0.09, 0.18, 0.36 | 1 | 640,000 | 5, 10, 20 |
| Hesai | Pandar40M | 0.3 to 120 | ±5 to ±2 | 40 | 360 | 40 | 0.2, 0.4 | 1, 2, 3, 4, 5, 6 | 720,000 | 10, 20 |
| Hesai | Pandar64 | 0.3 to 200 | ±5 to ±2 | 64 | 360 | 40 | 0.2, 0.4 | 1, 2, 3, 4, 5, 6 | 1,152,000 | 10, 20 |
| Hesai | Pandar128E3X | 0.3 to 200 | ±8 to ±2 | 128 | 360 | 40 | 0.1, 0.2, 0.4 | 0.125, 0.5, 1 | 3,456,000 | 10, 20 |
| Luminar | IRIS | Up to 600 | — | 640 lines/s | 120 | 0–26 | 0.05 | 0.05 | 300 points/square degree | 1 to 30 |
| Innoviz | InnovizOne | 250 | — | — | 115 | 25 | 0.1 | 0.1 | — | 5 to 20 |
| Innoviz | InnovizTwo | 300 | — | 8000 lines/s | 125 | 40 | 0.07 | 0.05 | — | 10 to 20 |
| LeddarTech | Pixell | Up to 56 | ±3 | — | 117.5 ± 2.5 | 16.0 ± 0.5 | — | — | — | 20 |
| Continental | HFL110 | 50 | — | — | 120 | 30 | — | — | — | 25 |
“—” indicates that the specifications were not mentioned in the product datasheet.
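As a sanity check on the table above, the point rate of a spinning LiDAR can be approximated from its geometry: beams × (horizontal FoV / horizontal resolution) firings per revolution × revolutions per second. A minimal sketch (our approximation; actual datasheet figures also depend on the return mode):

```python
def lidar_point_rate(num_beams: int, h_fov_deg: float,
                     h_res_deg: float, refresh_hz: float) -> float:
    """Approximate single-return points per second for a spinning LiDAR."""
    firings_per_rev = h_fov_deg / h_res_deg  # azimuth steps per revolution
    return num_beams * firings_per_rev * refresh_hz

# Velodyne Puck at 10 Hz (0.2 deg azimuth resolution at that spin rate):
# 16 x (360 / 0.2) x 10 = 288,000, close to the ~300,000 listed above.
print(lidar_point_rate(16, 360, 0.2, 10))
```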
Camera sensor overview.
| Type | Description | Benefits | Limitations |
|---|---|---|---|
| Monocular | Single-lens camera. | Low cost. Provides a series of single RGB images to collect high-resolution, geotagged images or panoramas. | Cannot recover 3D scale without additional sensors. Camera networks are suboptimal for generating highly accurate 3D points. |
| Binocular | Two collocated cameras with known relative orientation capturing overlapping, synchronized images. | Can provide the depth and scale of objects in the scene. Provides better accuracy when integrated with a LiDAR sensor. | Performance and accuracy may depend on the algorithm used to compute the 3D information. |
| RGB-D | Cameras that capture RGB and depth images at the same time. | Simultaneous data acquisition. Provides high accuracy when integrated with LiDAR. | Depth image sensitive to occlusions. Low range. The depth image may include some uncertainties and errors. |
| Multi-camera system | A spherical camera system with multiple cameras that can provide a 360° field of view. | Panoramic view showing the entire scene. Suitable for street mapping applications. | Requires large storage to save images in real time. Must be properly calibrated to ensure image alignment and minimal distortion. |
| Fisheye | Spherical-lens camera with more than a 180° field of view. | Provides wide coverage of the scene, allowing capture with fewer images. | Lens distortions. Non-projective transformation. Requires rigorous calibration. |
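The scale advantage of the binocular setup in the table comes from triangulation: with a calibrated baseline, depth follows directly from disparity as Z = f·B/d. A minimal sketch with hypothetical rig parameters (the focal length, baseline, and disparity below are illustrative, not from the paper):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, 0.3 m baseline, 25 px disparity:
print(stereo_depth(1000, 0.3, 25))  # about 12 m
```

A monocular camera yields the same geometry only up to an unknown scale factor, which is why the table notes that 3D scale cannot be recovered without additional sensors.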
Specifications of different MMSs.
| Platform | System | Release Year | Indoor | Outdoor | Camera | LiDAR/Max. Range | IMU | GPS | Accuracy * | Applications |
|---|---|---|---|---|---|---|---|---|---|---|
| Vehicle-mounted | Leica Pegasus: Two Ultimate | 2018 | 🗶 | 🗸 | 360° FoV | ZF9012 profiler 360° × 41.33°/100 m | 🗸 | 🗸 | 2 cm horizontal accuracy | Urban 3D modeling. Road asset management. Change detection analysis. Creating HD maps. Generating geolocated panoramic images. |
| Vehicle-mounted | Teledyne Optech Lynx HS600-D | 2017 | 🗶 | 🗸 | 360° FoV | 2 Optech sensors/130 m | 🗸 | 🗸 | ±5 cm absolute accuracy | |
| Vehicle-mounted | Topcon IP-S3 HD1 | 2015 | 🗶 | 🗸 | 360° FoV | Velodyne HDL-32E LiDAR/100 m | 🗸 | 🗸 | 0.1 cm road surface accuracy (1 sigma) | |
| Vehicle-mounted | Hi-Target HiScan-C | 2017 | 🗶 | 🗸 | 360° FoV | 650 m | 🗸 | 🗸 | 5 cm at 40 m range | |
| Vehicle-mounted | Trimble MX7 | — | 🗶 | 🗸 | 360° FoV | 🗶 | 🗸 | 🗸 | — | |
| Vehicle-mounted | Trimble MX50 | 2021 | 🗶 | 🗸 | 90% of a full sphere | 2 MX50 laser scanners/80 m | 🗸 | 🗸 | 0.2 cm (laser scanner) | |
| Vehicle-mounted | Trimble MX9 | 2018 | 🗶 | 🗸 | 1 spherical + 2 side-looking + 1 backward/downward camera | MX9 laser scanner/up to 420 m | 🗸 | 🗸 | 0.5 cm (laser scanner) | |
| Vehicle-mounted | Viametris vMS3D | 2016 | 🗶 | 🗸 | FLIR Ladybug5+ | Velodyne VLP-16 + Velodyne HDL-32E | 🗸 | 🗸 | 2–3 cm relative accuracy | |
| Handheld | HERON LITE Color | 2018 | 🗸 | 🗸 | 360° × 360° FoV | 1 Velodyne Puck/100 m | 🗸 | 🗶 | 3 cm relative accuracy | Mapping enclosed and complex spaces and cultural heritage. Forest surveying. Building Information Modeling. |
| Handheld | GeoSLAM Zeb Go | 2020 | 🗸 | 🗶 | Can be added as an accessory | Hokuyo UTM-30LX laser scanner/30 m | 🗶 | 🗶 | 1 to 3 cm relative accuracy | |
| Handheld | GeoSLAM Zeb Revo RT | 2015 | 🗸 | 🗶 | Can be added as an accessory | Hokuyo UTM-30LX laser scanner/30 m | 🗶 | 🗶 | 0.6 cm relative accuracy | |
| Handheld | GeoSLAM Zeb Horizon | 2018 | 🗸 | 🗸 | Can be added as an accessory | Velodyne Puck VLP-16/100 m | 🗶 | 🗶 | 0.6 cm relative accuracy | |
| Handheld | Leica BLK2GO | 2018 | 🗸 | 🗸 | 3-camera system, 300° × 150° FoV | 360° × 270°/up to 25 m | 🗶 | 🗶 | ±1 cm in an indoor environment with a scan duration of 2 min | |
| Wearable | Leica Pegasus: Backpack | 2017 | 🗸 | 🗸 | 360° × 200° FoV | Dual Velodyne VLP-16/100 m | 🗸 | 🗸 | 2 to 3 cm relative accuracy | |
| Wearable | HERON MS Twin | 2020 | 🗸 | 🗸 | 360° × 360° FoV | Dual Velodyne Puck/— | 🗸 | 🗶 | 3 cm relative accuracy | |
| Wearable | NavVis VLX | 2021 | 🗸 | 🗸 | 360° FoV | Dual Velodyne Puck LITE/100 m | 🗸 | 🗶 | 0.6 cm absolute accuracy at 68% confidence | |
| Wearable | Viametris BMS3D-HD | 2019 | 🗸 | 🗸 | FLIR Ladybug5+ | 16-beam LiDAR + 32-beam LiDAR | 🗸 | 🗸 | 2 cm relative accuracy | |
| Trolley | NavVis M6 | 2018 | 🗸 | 🗶 | 360° FoV | 6 Velodyne Puck LITE/100 m | 🗸 | 🗶 | 0.57 cm absolute accuracy at 68% confidence | Indoor mapping for government buildings, airports, and train stations. Tunnel inspection. Measuring asphalt roughness. Building Information Modeling. |
| Trolley | Leica ProScan | 2017 | 🗸 | 🗸 | 🗶 | Leica ScanStation P40, P30, or P16 | 🗸 | 🗸 | 0.12 cm (range accuracy for Leica ScanStation P40) | |
| Trolley | Trimble Indoor | 2015 | 🗸 | 🗶 | 360° FoV | Trimble TX-5; FARO Focus X-130, X-330, S-70-A, S-150-A, S-350-A | 🗸 | 🗶 | 1 cm relative accuracy when combined with FARO Focus X-130 | |
| Trolley | FARO Focus Swift | 2020 | 🗸 | 🗶 | HDR camera | FARO Focus laser scanner with a FARO ScanPlan 2D mapper | 🗸 | 🗶 | 0.2 cm relative accuracy at 10 m range | |
* Accuracy as reported by the manufacturers; whether the measure is relative or absolute is unknown when not stated. The "—" symbol indicates that the specification was not mentioned in the product datasheet.
Figure 2. Leica Pegasus: Two Ultimate vehicle-mounted system. Photo courtesy of Leica Geosystems [13].
Figure 3. Handheld and wearable systems: (a) HERON LITE Color; (b) HERON MS Twin. Photos courtesy of Gexcel srl [87].
Figure 4. Leica ProScan trolley-based MMS. Photo courtesy of Leica Geosystems [13].
Figure 5. The standard processing pipeline for MMS data.
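At the heart of such a pipeline is direct georeferencing: each range measurement in the scanner frame is rotated by the platform attitude (from the IMU) and translated by the platform position (from GNSS) into world coordinates. A minimal 2D sketch of this transform (our illustration; real systems work in 3D and additionally apply boresight and lever-arm calibration):

```python
import math

def georeference_2d(point_sensor, heading_rad, platform_xy):
    """world = R(heading) @ point_sensor + platform_position (2D case)."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    x, y = point_sensor
    px, py = platform_xy
    # Rotate the sensor-frame point by the platform heading, then translate
    # by the GNSS-derived platform position.
    return (c * x - s * y + px, s * x + c * y + py)

# A return 10 m ahead of a scanner with a 90-degree heading, platform at (100, 200):
print(georeference_2d((10.0, 0.0), math.pi / 2, (100.0, 200.0)))  # ~(100.0, 210.0)
```

The quality of the final map therefore depends on both the positioning sensors (pose) and the mapping sensors (ranges), which is why the platform tables above report accuracy per system rather than per sensor.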
An overview of the selected mobile mapping applications.
| Selected Applications | Selected Studies | Highlights |
|---|---|---|
| Road asset management and condition assessment | Extraction of road assets [ | Vehicle-mounted systems regularly operating on the road. More efficient than manual inspection. Leveraging deep learning to facilitate the inspection process. |
| BIM | Low-cost MMS for BIM of archeological reconstruction [ | Data are collected with portable systems. Useful for maintenance and renovation planning. Rich database for better information management. |
| Emergency and disaster response | Network-based GIS for disaster response [ | Timely and accurate disaster response. Facilitates the decision-making process. Effective training and simulations. |
| Vegetation mapping and detection | Mapping and monitoring riverine vegetation [ | Accurate and automatic measurements. Reduces occlusions for 3D urban models. |
| Digital heritage conservation | Mapping a complex heritage site using handheld MMS [ | Utilizes the flexibility of portable platforms. Enables virtual tourism. Digital recording of cultural sites. |