| Literature DB >> 35957170 |
Rodayna Hmede, Frédéric Chapelle, Yuri Lapusta.
Abstract
Shape memory materials are smart materials that stand out because of several remarkable properties, including their shape memory effect. Shape memory alloys (SMAs) are widely used members of this family and have been innovatively employed in various fields, such as sensors, actuators, robotics, aerospace, civil engineering, and medicine. Many conventional, unconventional, experimental, and numerical methods have been used to study the properties of SMAs, their models, and their different applications. These materials exhibit nonlinear behavior. This fact complicates the use of traditional methods, such as the finite element method, and increases the computing time necessary to adequately model their different possible shapes and usages. Therefore, a promising solution is to develop new methodological approaches based on artificial intelligence (AI) that aim at efficient computation times and accurate results. AI has recently demonstrated some success in efficiently modeling SMA features with machine- and deep-learning methods. Notably, artificial neural networks (ANNs), which underlie deep learning, have been applied to characterize SMAs. The present review highlights the importance of AI in SMA modeling and introduces the deep connection between ANNs and SMAs in the medical, robotic, engineering, and automation fields. After summarizing the general characteristics of ANNs and SMAs, we analyze various ANN types used for modeling the properties of SMAs according to their shapes, e.g., a wire as an actuator, a wire with a spring bias, wire systems, magnetic and porous materials, bars and rings, and reinforced concrete beams. The description focuses on the techniques used for NN architectures and learning.
Keywords: SMA actuators; SMA properties; SMA sensors; SMA shapes; artificial neural network; shape memory alloy
Year: 2022 PMID: 35957170 PMCID: PMC9370891 DOI: 10.3390/s22155610
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. (a) The one-way shape memory effect of an SMA wire. (b) Superelastic effect at a constant temperature. Ms and Mf are the start and finish martensitic phase temperatures, respectively; As and Af are the start and finish austenitic phase temperatures, respectively.
Figure 2. (a) Biological neuron architecture. (b) Artificial neural network architecture. In (a), the labeled main parts of the neuron trace the passage of a message from the input dendrites to the output synapses. In (b), the main parts of an artificial neural network are labeled for the usual three layers (input layer, hidden layer, and output layer).
Figure 3. Network functioning. Xi is the input parameter; wi is the weight; Σ is the transformation function, which can be the identity; b is the bias; and φ is the activation function.
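The single-neuron computation described above (weighted sum of inputs plus a bias, passed through an activation function) can be sketched as follows; the sigmoid activation and the numeric values are illustrative assumptions, not taken from the review.

```python
import math

def neuron(x, w, b):
    """Single artificial neuron: z = sum(wi * xi) + b, output = phi(z)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # transformation Σ plus bias b
    return 1.0 / (1.0 + math.exp(-z))             # activation φ (sigmoid, for illustration)

# Example: two inputs with hypothetical weights and bias
output = neuron(x=[0.5, -1.0], w=[0.8, 0.2], b=0.1)
```

Any of the activation functions listed in the table below could be substituted for the sigmoid in the last line of `neuron`.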
Activation functions of the ANNs.

| Activation Function | Equation |
|---|---|
| Step function | $f(x)=\begin{cases}0, & x<0\\ 1, & x\ge 0\end{cases}$ |
| Linear function | $f(x)=x$ |
| Rectified linear (ReLU) | $f(x)=\max(0,x)$ |
| Hyperbolic tangent | $f(x)=\tanh(x)=\dfrac{e^{x}-e^{-x}}{e^{x}+e^{-x}}$ |
| Radial basis function | $f(x)=e^{-x^{2}}$ |
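The standard activation functions in the table above can be written directly in code; this is a minimal sketch using their textbook definitions (the radial basis function is assumed to be the Gaussian form centered at zero).

```python
import math

def step(x):
    """Step function: 0 below the threshold, 1 at or above it."""
    return 0.0 if x < 0 else 1.0

def linear(x):
    """Linear (identity) function."""
    return x

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return max(0.0, x)

def tanh(x):
    """Hyperbolic tangent, bounded in (-1, 1)."""
    return math.tanh(x)

def rbf(x):
    """Gaussian radial basis function centered at 0."""
    return math.exp(-x * x)
```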
Classification of SMA applications with respect to their properties.

| Type of Property | Description |
|---|---|
| Mechanical property | Roughness |
| Thermal property | Austenite-finish temperature |
| Chemical property | Reactant particle size |
| Electrical property | Servo voltage |
| Dimensional property | Length of the wire |
| Magnetic property | |
Classification of studies as a function of SMA form and NN characteristics.

| SMA Form | Application Type | NN Type | Training Method |
|---|---|---|---|
| | Position actuator | Multilayer NN | Levenberg–Marquardt (LM) algorithm |
| | Position actuator | NN estimator | 3000 training epochs |
| | Magnetic actuator | Takagi–Sugeno fuzzy | MBFA and GDA algorithms |
| | Rotatory manipulator: actuator | NN direct control with online learning | BP algorithm |
| | Rotatory manipulator: self-sensor | Shallow NN | LM algorithm |
| | Linear actuator with a spring bias | Nonlinear autoregressive exogenous (NARX) NN | Jordan–Elman and Jordan-plus-Elman algorithms |
| | Linear actuator with a spring bias | Proportional–integral–differential (GRPID) NN | Backpropagation algorithm |
| | Linear actuator with a spring bias | Functional link artificial intelligent NN (FLANN) | Particle-swarm optimization |
| | Self-sensing with a spring bias | Shallow ANN | Extended Kalman filter |
| | Antagonistic system: actuator | LSTM | |
| | Antagonistic system: self-sensor | DNN (two LSTM) | |
| | Conventional machining | General regression NN | Forwarded |
| | Conventional machining | Multilayer normal feed | VIKOR fuzzy |
| | Medical | Multilayer normal feed | Batch backpropagation |
| | Earthquake civil damping, self-centering | Feedforward backpropagation | Incremental backpropagation |
| | Vibrational control | | Quick Prop (QP) algorithm |
| | Aircraft | BPNN | Backpropagation algorithm |
| | Civil damping | Neuro-fuzzy model | |
| | Aircraft | BPNN | Genetic algorithm |
Definition of neural networks used for modelling SMAs.

| NN | Definition | Domain |
|---|---|---|
| Full feedforward NN | It passes information in only one direction, "forward", from the input nodes through the hidden nodes (if any) to the output nodes, without cycles or loops in the network. | Clustering |
| Long short-term memory NN (LSTM) | It has feedback connections. Its unit consists of a cell, an input gate, an output gate, and a forget gate. Each gate acts as a threshold that helps the NN decide whether to use the identity connections across the stacked layers. | Prediction |
| Multilayer normal feed NN | It is a full feedforward NN, but with multiple computational (hidden) layers. | Clustering |
| Nonlinear autoregressive exogenous NN (NARX) | It is a recurrent NN that has loop connections between the nodes. | Time series |
| NN estimator | It is an NN based on an estimator, i.e., a technique or method that calculates an accurate result from actual observations. | Prediction |
| General regression NN (GRNN) | It has a radial-basis-function layer and a linear layer. | Regression |
| Proportional–integral–differential NN (PIDNN) | It is a dynamic feedforward network combining neural networks with the PID control concept. | Controlling |
| Takagi–Sugeno fuzzy neural network (TSFNN) | It is a fuzzy system model that needs fewer inputs but cannot handle online data. | Clustering |
| Functional link artificial intelligent NN (FLANN) | It is a single-layer, higher-order class of ANN. | Pattern recognition |
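The full feedforward pass defined in the table (input layer through hidden layer to output layer, with no loops) can be sketched as follows; the layer sizes, tanh hidden activation, and weight values are illustrative assumptions, not a trained SMA model.

```python
import math

def dense(x, W, b):
    """One fully connected layer: y_j = sum_i W[j][i] * x[i] + b[j]."""
    return [sum(wji * xi for wji, xi in zip(row, x)) + bj
            for row, bj in zip(W, b)]

def forward(x, W1, b1, W2, b2):
    """Input -> hidden (tanh) -> output (linear), one direction only."""
    h = [math.tanh(z) for z in dense(x, W1, b1)]  # hidden layer activations
    return dense(h, W2, b2)                        # linear output layer

# Hypothetical weights for a 2-input, 2-hidden, 1-output network
W1 = [[0.5, -0.3], [0.1, 0.8]]
b1 = [0.0, 0.1]
W2 = [[1.0, -1.0]]
b2 = [0.2]
y = forward([1.0, 2.0], W1, b1, W2, b2)
```

Adding further `dense` + activation stages between the input and output layers turns this into the "multilayer normal feed" variant from the same table.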