Yi Zhou1,2,3,4, Lulu Liu1,3,5,6, Haocheng Zhao1,2,3,4, Miguel López-Benítez4,7, Limin Yu2, Yutao Yue1,3,6.
Abstract
With recent developments, the performance of automotive radar has improved significantly. The next generation of 4D radar can achieve imaging capability in the form of high-resolution point clouds. In this context, we believe that the era of deep learning for radar perception has arrived. However, studies on radar deep learning are spread across different tasks, and a holistic overview is lacking. This review paper attempts to provide a big picture of the deep radar perception stack, including signal processing, datasets, labelling, data augmentation, and downstream tasks such as depth and velocity estimation, object detection, and sensor fusion. For these tasks, we focus on explaining how the network structure is adapted to radar domain knowledge. In particular, we summarise three overlooked challenges in deep radar perception, including multi-path effects, uncertainty problems, and adverse weather effects, and present some attempts to solve them.
Keywords: automotive radars; autonomous driving; deep learning; multi-sensor fusion; object detection; radar signal processing
Year: 2022 PMID: 35684831 PMCID: PMC9185239 DOI: 10.3390/s22114208
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Characteristics of typical radars and LiDARs.
| | Conventional Radar 1 | 4D Radar | 16-Beam LiDAR | 32-Beam LiDAR | Solid-State LiDAR |
|---|---|---|---|---|---|
| Max Range | f: 250 m, n: 70 m | 300 m | 100 m | 200 m | 200 m |
| FoV (H/V) | f: 20°, n: 120°/✗ | 120°/30° | 360°/30° | 360°/40° | 120°/25° |
| Ang Res (H/V) | f: 1.5°, n: 4°/✗ | 1°/1° | 0.1°/2° | 0.1°/0.3° | 0.2°/0.2° |
| Doppler Res | 0.1 m/s | 0.1 m/s | ✗ | ✗ | ✗ |
| Point Density | Low | Medium | High | High | High |
| All Weather | ✓ | ✓ | ✗ | ✗ | ✗ |
| Power | 5 W | 5 W | 8 W | 10 W | 15 W |
| Expected Cost | Low | Low | Medium | High | Medium |
1 A typical 77 GHz 4Tx-6Rx automotive radar: 2Tx-6Rx for the far-range (f) mode and 2Tx-6Rx for the near-range (n) mode.
Figure 1. Point clouds of a 4D radar and a 16-beam LiDAR from the Astyx dataset [2].
Figure 2. Overview of the deep radar perception framework.
Figure 3. Radar Tx/Rx signals and the resulting range-Doppler map.
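The 2D-FFT processing behind a range-Doppler map can be sketched in a few lines. The minimal simulation below is illustrative only (all parameter values are hypothetical, not taken from the paper): a single target's beat signal is synthesised, a fast-time FFT recovers range, and a slow-time FFT across chirps recovers Doppler velocity.

```python
import numpy as np

c, fc = 3e8, 77e9
lam = c / fc
B, Tc = 150e6, 50e-6        # sweep bandwidth, chirp duration (illustrative)
S = B / Tc                  # chirp slope
Ns, Nc = 256, 128           # samples per chirp, chirps per frame
fs = Ns / Tc                # ADC rate chosen so one chirp fills Ns samples

R_true, v_true = 30.0, 5.0  # target range (m) and radial velocity (m/s)
fb = 2 * S * R_true / c     # beat frequency encodes range
fd = 2 * v_true / lam       # Doppler frequency encodes velocity

t = np.arange(Ns) / fs                     # fast time within a chirp
n = np.arange(Nc)[:, None]                 # slow time (chirp index)
sig = np.exp(2j * np.pi * (fb * t[None, :] + fd * n * Tc))

rd = np.fft.fft(sig, axis=1)                              # range FFT
rd = np.fft.fftshift(np.fft.fft(rd, axis=0), axes=0)      # Doppler FFT

r_bin = int(np.argmax(np.abs(rd).max(axis=0)))
r_est = r_bin * c / (2 * B)                # bin size equals range resolution
d_bin = int(np.argmax(np.abs(rd[:, r_bin])))
v_est = (d_bin - Nc // 2) / (Nc * Tc) * lam / 2
```

With these values the peak lands at the 30 m range bin and near 5 m/s in Doppler, up to one bin of quantisation.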
Figure 4. MIMO radar principles. (a) Virtual array configuration of a 2Tx-4Rx MIMO radar. (b) In TDM mode, Tx1 and Tx2 transmit signals in turns. (c) In DDM mode, a Doppler shift is added to Tx2.
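The virtual-array idea in (a) can be illustrated numerically. Assuming the common convention of Rx elements spaced λ/2 and Tx elements spaced N_Rx·λ/2 (an assumption here, not a detail from the paper), a 2Tx-4Rx MIMO radar synthesises an 8-element uniform virtual array:

```python
import numpy as np

lam = 3e8 / 77e9                  # 77 GHz wavelength
d = lam / 2                       # Rx spacing: half a wavelength
rx = np.arange(4) * d             # 4 physical Rx element positions
tx = np.arange(2) * 4 * d         # 2 Tx elements spaced Nrx*d apart
# Each Tx-Rx pair acts like a virtual element at the sum of positions.
virtual = (tx[:, None] + rx[None, :]).ravel()
```

The 8 virtual elements are uniformly spaced at λ/2, which is why MIMO multiplies the effective aperture without adding physical receivers.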
Equations for radar performance.
| Definition | Equation |
|---|---|
| Max Unambiguous Range | $R_{\max} = \frac{c\,B_{IF}}{2S}$ |
| Max Unambiguous Velocity | $v_{\max} = \frac{\lambda}{4T_c}$ |
| Max Unambiguous Angle | $\theta_{\max} = \arcsin\left(\frac{\lambda}{2d}\right)$ |
| Range Resolution | $\Delta R = \frac{c}{2B}$ |
| Velocity Resolution | $\Delta v = \frac{\lambda}{2N_c T_c} = \frac{\lambda}{2T_f}$ |
| Angular Resolution | $\Delta\theta = \frac{\lambda}{N_{Rx}\,d\cos\theta}$ |
| 3 dB Beamwidth | $\theta_{3\,\mathrm{dB}} = 2\arcsin\left(\frac{1.4\lambda}{\pi A}\right)$ |
The parameters keep a consistent meaning throughout this section; refer to Table A1 in Appendix A for a quick check of their definitions.
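As a quick sanity check, the standard FMCW relations in the table above can be evaluated for a hypothetical 77 GHz configuration (all numeric values below are illustrative, not from the paper):

```python
import numpy as np

c = 3e8
fc = 77e9                  # carrier frequency
lam = c / fc               # wavelength, about 3.9 mm
B = 300e6                  # sweep bandwidth (Hz)
Tc = 40e-6                 # chirp duration (s)
Nc = 256                   # chirps per frame
d = lam / 2                # inter-antenna spacing

range_res = c / (2 * B)                            # range resolution: c / 2B
v_max = lam / (4 * Tc)                             # max unambiguous velocity
v_res = lam / (2 * Nc * Tc)                        # velocity resolution
theta_max = np.degrees(np.arcsin(lam / (2 * d)))   # +/-90 deg at lambda/2 spacing
```

This configuration yields a 0.5 m range resolution, roughly ±24 m/s of unambiguous velocity at about 0.19 m/s resolution, and the full ±90° unambiguous angle that half-wavelength spacing provides.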
Typical automotive radar parameters [17].
| Parameter | Range |
|---|---|
| Transmit power (dBm) | 10–13 |
| TX/RX antenna gain (dBi) | 10–25 |
| Receiver noise figure (dB) | 10–20 |
| Target RCS (dBsm) | (−10)–20 |
| Receiver sensitivity (dBm) | (−120)–(−115) |
| Minimum SNR (dB) | 10–20 |
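Combining these parameter ranges with the standard radar range equation gives a rough link-budget check. The sketch below uses mid-range values from the table and assumes equal Tx and Rx gains; it is an illustrative back-of-the-envelope calculation, not a calibrated model.

```python
import math

Pt_dBm = 12.0        # transmit power (mid-range from the table)
G_dBi = 20.0         # antenna gain, assumed identical for Tx and Rx
rcs_dBsm = 0.0       # mid-range target RCS
sens_dBm = -118.0    # receiver sensitivity
lam = 3e8 / 77e9     # 77 GHz wavelength (m)
R = 100.0            # target range (m)

# Radar range equation, Pr = Pt * G^2 * lam^2 * sigma / ((4*pi)^3 * R^4),
# rewritten in dB so the table's dBm/dBi/dBsm values can be used directly:
Pr_dBm = (Pt_dBm + 2 * G_dBi + 20 * math.log10(lam) + rcs_dBsm
          - 30 * math.log10(4 * math.pi) - 40 * math.log10(R))
detectable = Pr_dBm >= sens_dBm   # above the receiver sensitivity floor
```

At 100 m the returned power comes out around −109 dBm, comfortably above the −118 dBm sensitivity; the 40·log10(R) term shows why echo power collapses quickly with range.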
Radar datasets.
| Name | Year | Task | Radar Type | Data | Doppler | Range | Other Sensors | Scenarios | Weather | Annotations | Size |
|---|---|---|---|---|---|---|---|---|---|---|---|
| nuScenes [ | 2020 | DT | LR | PC | ✓ | SV | CLO | USH | ✓ | 3D, T | L |
| PixSet [ | 2021 | DT | LR | PC | ✓ | MR | CLO | USP | ✓ | 3D, T | M |
| RadarScenes [ | 2021 | DTS | HR | PC | ✓ | SV | CO | USHT | ✓ | P | L |
| Pointillism [ | 2020 | D | 2LR | PC | ✓ | MR | CL | U | ✓ | 3D | M |
| Zendar [ | 2020 | D | SAR | ADC, RD, PC | ✓ | MR | CLO | U | ✗ | P | S |
| Dense [ | 2020 | D | LR | PC | ✓ | LR | CLO | USHT | ✓ | 3D | L |
| RADIATE [ | 2020 | LDT | SP | RA | ✗ | SV | CLO | USHP | ✓ | 2D, T, P | M |
| CARRADA [ | 2020 | DTS | LR | RAD | ✓ | SR | C | R | ✗ | 2D, P | M |
| RADDet [ | 2021 | D | LR | RAD | ✓ | SR | C | US | ✗ | 2D | M |
| CRUW [ | 2021 | D | LR | RAD | ✓ | USR | C | USHP | ✗ | P | L |
| RaDICaL [ | 2021 | L | LR | ADC | ✓ | USR, SR | C | USHIP | ✗ | 2D | L |
| Ghent VRU [ | 2020 | DS | LR | RAD | ✓ | SR | CL | U | ✗ | M | M |
| Astyx [ | 2019 | D | HR | PC | ✓ | MR | CL | SH | ✗ | 3D | S |
| View-of-Delft [ | 2022 | DT | HR | PC | ✓ | SR | CLO | U | ✗ | 3D, T | S |
| RADIal [ | 2021 | DS | HR | ADC, RAD, PC | ✓ | MR | CLO | USH | ✗ | P | M |
| TJ4DRadSet [ | 2022 | DT | HR | PC | ✓ | LR | CLO | U | ✗ | 3D, T | M |
| Oxford [ | 2020 | L | SP | RA | ✗ | SV | CLO | U | ✓ | P | L |
| Mulran [ | 2020 | L | SP | RA | ✗ | SV | LO | US | ✗ | P | M |
| Boreas [ | 2022 | LD | SP | RA | ✗ | SV | CLO | S | ✓ | P | L |
| EU Long-term [ | 2020 | L | LR | PC | ✓ | LR | CLO | U | ✓ | P | M |
| Endeavour [ | 2021 | L | LR | PC | ✓ | 5LR | LO | S | ✗ | P | M |
| ColoRadar [ | 2021 | L | HR, LR | ADC, PC | ✓ | 2USR | LO | SIT | ✓ | P | M |
| PREVENTION [ | 2019 | DT | LR | PC | ✓ | 1LR, 2SR | CLO | UH | ✓ | 2D, T | L |
| SCORP [ | 2020 | S | LR | ADC, RAD | ✓ | USR | C | P | ✗ | M | S |
| Ghost [ | 2021 | DS | LR | PC | ✗ | LR | CLO | S | ✗ | P | M |
Task: D, T, L, and S stand for detection, tracking, localisation, and segmentation. Type: LR, HR, SP, and SAR stand for low-resolution, high-resolution, spinning, and synthetic-aperture radar. Range: SV, LR, MR, SR, and USR stand for surrounding view, long range (<250 m), middle range (<180 m), short range (<50 m), and ultra-short range (<25 m). Other Sensors: C, C (RGBD), L, and O stand for camera, RGBD camera, LiDAR, and odometry. Scenarios: U, S, H, P, T, R, and I stand for urban (city), suburban, highway, parking lot, tunnel, race track, and indoors. Size: L, M, and S stand for large, medium, and small. Weather stands for adverse weather. Annotations: 2D, 3D, T, P, and M stand for 2D bounding box, 3D bounding box, track ID, pointwise labels (pointwise detection, object-level point, or pose), and segmentation mask.
Figure 5. Two types of radar calibration targets [64,65]. The front board is made of styrofoam; the red triangle is a radar corner reflector.
Figure 6. Radar data augmentation techniques. The Doppler velocity measured by radar is a scalar (the radial speed), so locally rotating a radar detection causes a misalignment between the measured Doppler velocity and the object's true velocity; global translation and rotation are free from such misalignment. When augmenting a radar RA map, the background area must be interpolated and the intensity of the detections compensated.
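The misalignment argument can be verified numerically: the radial (Doppler) speed is invariant under a global rotation of positions and velocities about the sensor, but not under a local rotation that moves a detection's position alone. A minimal numpy sketch with synthetic detections:

```python
import numpy as np

def radial(p, v):
    # Scalar Doppler speed: velocity projected onto the line of sight.
    return np.sum(v * p, axis=1) / np.linalg.norm(p, axis=1)

rng = np.random.default_rng(0)
p = rng.normal(size=(5, 2)) * 10        # detection positions, sensor at origin
v = rng.normal(size=(5, 2))             # true 2D velocities per detection
vr = radial(p, v)                       # measured radial speeds

a = np.deg2rad(30.0)
R = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])
# Global rotation: rotate positions AND velocities about the sensor.
vr_global = radial(p @ R.T, v @ R.T)    # Doppler unchanged
# Local rotation: rotate positions only -> Doppler no longer consistent.
vr_local = radial(p @ R.T, v)
```

`vr_global` matches `vr` exactly, while `vr_local` does not, which is why global transforms are the safe augmentation for radar point clouds.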
Figure 7. Radar range measurements. Off-the-shelf radars return detections on a 2D radar plane. The detections are sparsely spread over objects due to specular reflection. Because of multi-path propagation, radar can see through occlusions, but the same effect can also cause noisy detections.
Figure 8. (a) Radar detection expansion techniques. (b) Extend radar detections in height. (c) Build a probabilistic map, where dark/light blue indicates a cell with a high/low confidence threshold. (d) Apply strict filtering according to the bounding box, where only detections corresponding to the frontal surface are retained.
Figure 9. Radar motion models. (a) The linear motion model needs multiple detections on the object. (b) The curvilinear motion model requires either two radars observing the same object or determination of the vehicle boundary and rear axle.
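For the two-radar case in (b), two radial speeds measured along different lines of sight determine the object's full 2D velocity by inverting a 2×2 linear system. A small sketch with a hypothetical geometry (positions and velocity chosen for illustration):

```python
import numpy as np

obj = np.array([20.0, 5.0])             # object position
radars = np.array([[0.0, 0.0],          # radar 1 at the ego origin
                   [4.0, 0.0]])         # radar 2 mounted 4 m to the side
v_true = np.array([-10.0, 2.0])         # the object's full 2D velocity

# Unit line-of-sight vectors from each radar to the object
U = np.stack([(obj - r) / np.linalg.norm(obj - r) for r in radars])
vr = U @ v_true                         # the two measured radial (Doppler) speeds
v_est = np.linalg.solve(U, vr)          # invert the 2x2 system
```

The inversion only works when the two lines of sight differ enough for `U` to be well conditioned, which is exactly why a single radar's scalar Doppler cannot recover the full velocity on its own.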
Figure 10. Overview of radar detection frameworks: blue boxes indicate classical radar detection modules; orange boxes represent AI-based substitutions.
Figure 11. Overview of radar and camera fusion frameworks. We classify the fusion frameworks into input fusion, ROI fusion, feature-map fusion, and decision fusion. For ROI fusion, we further investigate two architectures: cascade fusion, which projects radar proposals into the image view, and parallel fusion, which fuses radar ROIs with visual ROIs.
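Input fusion typically begins by projecting radar detections into the image plane. A minimal sketch with a hypothetical pinhole intrinsic matrix `K`, with extrinsics omitted for brevity (i.e., radar and camera assumed co-located and axis-aligned, which a real pipeline would not assume):

```python
import numpy as np

K = np.array([[1000.0,    0.0, 640.0],   # hypothetical camera intrinsics
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
pts = np.array([[ 2.0, 0.5, 20.0],       # radar detections (x right, y down,
                [-1.0, 0.2, 10.0]])      #  z forward), in metres
uv = (K @ pts.T).T
uv = uv[:, :2] / uv[:, 2:3]              # perspective divide -> pixel coords
```

The resulting pixel coordinates are where radar channels (e.g., depth and Doppler) would be painted onto the image before being fed to the detector.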
Figure 12. Multi-path effect: the solid orange box is the real object; the dotted boxes are ghost objects caused by multi-path propagation.
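Under a simple single-bounce mirror model (an idealising assumption; real multi-path is richer), a ghost produced by reflection off a planar surface appears at the real object's mirror image across that surface:

```python
import numpy as np

# Flat reflector along the line x = w (e.g., a guardrail); sensor at origin.
w = 3.0
real = np.array([1.0, 20.0])                  # real object position (x, y)
ghost = np.array([2 * w - real[0], real[1]])  # mirror image across x = w
# The ghost lies behind the reflector at a different bearing and a longer
# propagation path, so it appears as a separate (dotted) detection.
```

This mirror construction is the basis of several ghost-removal heuristics: candidate detections that coincide with the reflection of a confirmed object across a known surface can be flagged as multi-path artefacts.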
Parameters used in the radar signal processing section.
| Parameter | Meaning | Parameter | Meaning |
|---|---|---|---|
| $c$ | Light speed (m/s) | $N_{Tx}$ | Number of Tx antennas |
| $\lambda$ | Wavelength (m) | $N_{Rx}$ | Number of Rx antennas |
| $f_c$ | Carrier frequency (Hz) | $d$ | Inter-antenna spacing (m) |
| $B$ | Sweep bandwidth (Hz) | $A$ | Array aperture (m) |
| $S$ | Chirp slope (Hz/s) | $P_t$ | Transmit power (dBW) |
| $N_c$ | Number of chirps | $G$ | Antenna gain (dB) |
| $T_c$ | Chirp duration (s) | $P_{\min}$ | Minimum detectable power (dBW) |
| $T_f$ | Frame duration (s) | $\sigma$ | Radar cross-section (dBsm) |
| $B_{IF}$ | IF bandwidth (Hz) | $\mathrm{SNR}$ | Signal-to-noise ratio |