Alessandro Riboni, Nicolò Ghioldi, Antonio Candelieri, Matteo Borrotti.
Abstract
Automated driving systems (ADS) have improved significantly in recent years. ADS, and more precisely self-driving car technologies, will change the way we perceive and use transportation systems in terms of user experience, mode choices and business models. The emerging field of Deep Learning (DL) has been applied successfully to the development of innovative ADS solutions. However, singling out the best deep neural network architecture and tuning its hyperparameters are expensive processes, both in terms of time and computational resources. In this work, Bayesian optimization (BO) is used to optimize the hyperparameters of a Spatiotemporal-Long Short Term Memory (ST-LSTM) network, with the aim of obtaining an accurate model for the prediction of the steering angle in an ADS. Within a limited number of trials, BO was able to identify a model, namely BO_ST-LSTM, which proved to be the most accurate on a public dataset when compared with classical end-to-end driving models.
Year: 2022 PMID: 35610247 PMCID: PMC9130256 DOI: 10.1038/s41598-022-12509-6
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.996
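The GP-based BO loop the abstract refers to can be sketched as follows. This is a minimal illustration on a hypothetical one-dimensional objective (a stand-in for the network's validation error as a function of one hyperparameter); the kernel, the expected-improvement acquisition and the budget of trials are assumptions for the sketch, not the paper's exact setup.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical stand-in for the expensive objective (e.g. validation MSE
# of the ST-LSTM as a function of the log10 learning rate).
def objective(x):
    return (x + 3.0) ** 2 * 0.1 + 0.05 * np.sin(5 * x)

bounds = (-5.0, -1.0)                              # assumed search interval
X = rng.uniform(*bounds, size=3).reshape(-1, 1)    # initial design points
y = np.array([objective(x[0]) for x in X])

candidates = np.linspace(*bounds, 200).reshape(-1, 1)

for _ in range(10):                                # BO trials
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)                                   # fit the GP surrogate
    mu, sigma = gp.predict(candidates, return_std=True)
    # Expected Improvement acquisition (minimization form)
    imp = y.min() - mu
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = candidates[np.argmax(ei)]             # most promising point
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

best_x = X[np.argmin(y)][0]
print(f"best hyperparameter: {best_x:.3f}, objective: {y.min():.4f}")
```

Each trial fits the surrogate to all evaluations so far, so the acquisition function balances exploring uncertain regions against exploiting the current best, which is what keeps the number of expensive network trainings small.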
Figure 1. Overview of the BO_ST-LSTM system. Images in the "Input: camera images" box come from the SullyChen dataset [43].
Figure 2. Images from the SullyChen dataset [43]: (a) straight road with trees and shadow, (b) road with a left curve with an obstacle (car), (c) road with a right curve, (d) road with a gradual curve to the left with a traffic island.
Settings for all networks.
| PilotNet | J-Net | ST-LSTM |
|---|---|---|
| Normalization layer | Normalization layer | Normalization layer |
| Conv2D | Conv2D | ConvLSTM2D |
| Conv2D | MaxPooling2D | BatchNormalization |
| Conv2D | Conv2D | ConvLSTM2D |
| Conv2D | MaxPooling2D | BatchNormalization |
| Conv2D | Conv2D | ConvLSTM2D |
| Flatten | MaxPooling2D | BatchNormalization |
| Dense | Flatten | ConvLSTM2D |
| Dense | Dense | BatchNormalization |
| Dense | | Conv3D |
| | | MaxPooling3D |
| | | Flatten |
| | | Dense |
| Output value | Output value | Output value |
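The table lists layer types but not kernel sizes. As a quick sanity check on the PilotNet column, the spatial size reaching Flatten can be traced with the standard valid-convolution formula; the kernel/stride values and the 66×200 input resolution below are assumptions taken from the original NVIDIA PilotNet design, not stated in the table.

```python
def conv2d_out(size, kernel, stride):
    # "valid" convolution output length: floor((size - kernel) / stride) + 1
    return (size - kernel) // stride + 1

h, w = 66, 200  # assumed PilotNet input resolution (original NVIDIA design)
# (kernel, stride) per Conv2D row: three 5x5 stride-2, then two 3x3 stride-1
layers = [(5, 2)] * 3 + [(3, 1)] * 2
for k, s in layers:
    h, w = conv2d_out(h, k, s), conv2d_out(w, k, s)
print(h, w)  # spatial size fed to the Flatten layer
```

With these assumed settings the five convolutions shrink 66×200 down to 1×18, which is what makes the subsequent Flatten–Dense head cheap.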
Parameters of the ST-LSTM architecture for Bayesian optimization.
| Parameter name | Domain space | Domain type |
|---|---|---|
| ConvLSTM | | Discrete |
| ConvLSTM | | Discrete |
| ConvLSTM | | Discrete |
| ConvLSTM | | Discrete |
| Conv3D: Num. feature maps | | Discrete |
| FC: Num. of neurons | | Discrete |
| Dropout | | Continuous |
| Learning rate (Adam optimizer) | | Discrete |
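A mixed discrete/continuous search space like the one in the table can be represented and sampled as shown below. The concrete domain values here are purely illustrative assumptions (the paper's actual ranges are not reproduced in this record); only the discrete-vs-continuous split mirrors the table.

```python
import random

random.seed(0)

# Illustrative domains only: lists are discrete choices, tuples are
# continuous intervals, matching the "Domain type" column above.
search_space = {
    "convlstm_feature_maps": [8, 16, 32, 64],       # discrete
    "conv3d_feature_maps":   [16, 32, 64],          # discrete
    "fc_neurons":            [64, 128, 256, 512],   # discrete
    "dropout":               (0.0, 0.5),            # continuous
    "learning_rate":         [1e-4, 1e-3, 1e-2],    # discrete
}

def sample(space):
    """Draw one configuration: uniform over lists, uniform over intervals."""
    cfg = {}
    for name, domain in space.items():
        if isinstance(domain, tuple):
            cfg[name] = random.uniform(*domain)
        else:
            cfg[name] = random.choice(domain)
    return cfg

cfg = sample(search_space)
print(cfg)
```

In a BO setting, discrete choices are typically handled by rounding or by one-hot encoding inside the surrogate; the continuous dropout rate can be modeled directly.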
Figure 3. Comparison among GP-based BO processes using three different acquisition functions.
Figure 4. Comparison between the distributions of the best seen values obtained with the different acquisition functions.
Figure 5. Structure of the BO_ST-LSTM architecture. Images used at the beginning of the structure come from the SullyChen dataset [43].
Average values of the prediction performance indicators on the training and validation sets for the four architectures.
| | | PilotNet | J-Net | ST-LSTM | BO_ST-LSTM |
|---|---|---|---|---|---|
| Training | MSE | 0.0209 | **0.0114** | 0.0405 | 0.1831 |
| | MAE | 0.0870 | | 0.1181 | 0.1971 |
| | St. AE | 0.1155 | | 0.1630 | 0.3798 |
| Validation | MSE | 0.6814 | 0.5842 | 0.6139 | **0.5019** |
| | MAE | 0.4409 | 0.4262 | 0.4710 | |
| | St. AE | 0.6979 | 0.6345 | 0.6263 | |
In bold, the best value for each indicator.
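The three indicators in the table can be computed as below; "St. AE" is read here as the standard deviation of the absolute error, which is an assumption since the record does not expand the abbreviation. The steering-angle values are toy data for illustration only.

```python
import numpy as np

def indicators(y_true, y_pred):
    """MSE, MAE and St. AE (taken as the std. dev. of the absolute error)."""
    err = y_true - y_pred
    abs_err = np.abs(err)
    return {
        "MSE": float(np.mean(err ** 2)),
        "MAE": float(np.mean(abs_err)),
        "St. AE": float(np.std(abs_err)),
    }

# Toy steering-angle targets and predictions, purely illustrative
y_true = np.array([0.00, 0.10, -0.20, 0.05])
y_pred = np.array([0.02, 0.08, -0.25, 0.00])
scores = indicators(y_true, y_pred)
print(scores)
```

Reporting the spread of the absolute error alongside MAE is useful here because a steering model with a low average error but occasional large deviations is still unsafe.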
Bias-variance tradeoff decomposition.
| | Training MSE | Bias | Variance | Validation MSE | Bias | Variance |
|---|---|---|---|---|---|---|
| PilotNet | 0.0209 | 0.0004 | 0.0205 | 0.6814 | 0.0350 | 0.6464 |
| J-Net | 0.0114 | 0.0002 | 0.0112 | 0.5842 | 0.0440 | 0.5402 |
| ST-LSTM | 0.0405 | 0.0001 | 0.0404 | 0.6139 | 0.0755 | 0.5384 |
| BO_ST-LSTM | 0.1831 | 0.0002 | 0.1829 | 0.5019 | 0.0130 | 0.4881 |
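The decomposition in the table satisfies MSE ≈ Bias + Variance row by row (so the Bias column presumably already reports the squared bias term, an assumption based on the numbers adding up). A quick check against the published values:

```python
# (MSE, bias, variance) per architecture, for training and validation,
# copied from the bias-variance table above.
rows = {
    "PilotNet":   ((0.0209, 0.0004, 0.0205), (0.6814, 0.0350, 0.6464)),
    "J-Net":      ((0.0114, 0.0002, 0.0112), (0.5842, 0.0440, 0.5402)),
    "ST-LSTM":    ((0.0405, 0.0001, 0.0404), (0.6139, 0.0755, 0.5384)),
    "BO_ST-LSTM": ((0.1831, 0.0002, 0.1829), (0.5019, 0.0130, 0.4881)),
}

for name, splits in rows.items():
    for mse, bias, var in splits:
        # the identity should hold up to rounding of the published values
        assert abs(mse - (bias + var)) < 1e-3, name
print("MSE = Bias + Variance holds (up to rounding) for every row")
```

The check also makes the headline result concrete: BO_ST-LSTM trades a higher training variance for the lowest validation bias and variance of the four models.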
Average values of the prediction performance indicators on the training and test sets.
| | | PilotNet | J-Net | ST-LSTM | BO_ST-LSTM |
|---|---|---|---|---|---|
| Training | MSE | 0.2917 | 0.0159 | 0.0442 | 0.0204 |
| | MAE | 0.2464 | 0.0793 | 0.1187 | 0.0888 |
| | St. AE | 0.4806 | 0.0982 | 0.1734 | 0.1117 |
| Test | MSE | 0.2810 | 0.3204 | 0.2758 | 0.2700 |
| | MAE | 0.3781 | 0.3918 | 0.3866 | 0.3848 |
| | St. AE | 0.3715 | 0.4086 | 0.3555 | 0.3492 |