Hui Yu, Chuang Chen, Ningyun Lu, Cunsong Wang.
Abstract
Prognostics and health management (PHM), with failure prognosis and maintenance decision-making at its core, is an advanced technology for improving the safety, reliability, and operational economy of engineering systems. However, studies of failure prognosis and maintenance decision-making have been conducted separately over the past years, and key challenges remain open when the joint problem is considered. The aim of this paper is to develop an integrated strategy for dynamic predictive maintenance scheduling (DPMS) based on a deep auto-encoder and a deep forest-assisted failure prognosis method. The proposed DPMS method covers the complete process from failure prognosis to maintenance decision-making. The first step is to extract representative features reflecting system degradation from raw sensor data by using a deep auto-encoder. Then, the features are fed into the deep forest to compute the failure probabilities over moving time horizons. Finally, an optimal maintenance-related decision is made by quickly evaluating the costs of different decisions with the failure probabilities. Verification was performed on NASA's open datasets of aircraft engines, and the experimental results show that the proposed DPMS method outperforms several state-of-the-art methods, enabling more precise maintenance decisions and reduced maintenance costs.
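The three-step pipeline described in the abstract (feature extraction, failure-probability estimation, cost-based decision) can be sketched as follows. This is a minimal illustration, not the authors' implementation: a single-hidden-layer reconstruction network stands in for the deep auto-encoder, a random forest stands in for the deep forest, and the data and cost values are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "sensor" data: 200 inspection windows x 21 sensors, 3 health states.
X = rng.normal(size=(200, 21))
y = rng.integers(0, 3, size=200)  # 0 = healthy, 1 = degraded, 2 = near failure

# Step 1: auto-encoder stand-in -- train a network to reconstruct its input,
# then use the hidden-layer activations as low-dimensional features.
ae = MLPRegressor(hidden_layer_sizes=(12,), activation="relu",
                  max_iter=2000, random_state=0).fit(X, X)
hidden = np.maximum(0.0, X @ ae.coefs_[0] + ae.intercepts_[0])  # shape (200, 12)

# Step 2: deep-forest stand-in -- estimate the state probabilities.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(hidden, y)
proba = clf.predict_proba(hidden)  # columns: P(healthy), P(degraded), P(failure)

# Step 3: pick the action with the lowest expected cost.
# Illustrative cost matrix: rows = do nothing / order spare / maintain now,
# columns = true system state (all values hypothetical).
costs = np.array([[0.0, 5.0, 50.0],
                  [1.0, 1.0, 20.0],
                  [10.0, 2.0, 2.0]])
expected = proba @ costs.T            # (200, 3) expected cost per action
decision = expected.argmin(axis=1)    # chosen action per inspection window
```

The key design point mirrored here is that the classifier outputs calibrated state probabilities rather than a hard label, so the expected cost of each maintenance action can be computed directly.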
Keywords: deep auto-encoder; deep forest; failure prognosis; maintenance cost; maintenance decision-making
Year: 2021 PMID: 34960474 PMCID: PMC8706898 DOI: 10.3390/s21248373
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Proposed predictive maintenance framework.
Figure 2. Deep auto-encoder architecture.
Figure 3. Structure diagram of the deep forest prognosis model.
Figure 4. Structure diagram of class vector generation in the deep forest.
Figure 5. Structure diagram of the simulated engine [33].
Description of 21 sensor variables [33].
| ID | Symbol | Description | Units |
|---|---|---|---|
| 1 | T2 | Total temperature at fan inlet | °R |
| 2 | T24 | Total temperature at LPC outlet | °R |
| 3 | T30 | Total temperature at HPC outlet | °R |
| 4 | T50 | Total temperature at LPT outlet | °R |
| 5 | P2 | Pressure at fan inlet | psia |
| 6 | P15 | Total pressure in bypass-duct | psia |
| 7 | P30 | Total pressure at HPC outlet | psia |
| 8 | Nf | Physical fan speed | rpm |
| 9 | Nc | Physical core speed | rpm |
| 10 | epr | Engine pressure ratio (P50/P2) | -- |
| 11 | Ps30 | Static pressure at HPC outlet | psia |
| 12 | phi | Ratio of fuel flow to Ps30 | pps/psi |
| 13 | NRf | Corrected fan speed | rpm |
| 14 | NRc | Corrected core speed | rpm |
| 15 | BPR | Bypass ratio | -- |
| 16 | farB | Burner fuel–air ratio | -- |
| 17 | htBleed | Bleed enthalpy | -- |
| 18 | Nf_dmd | Demanded fan speed | rpm |
| 19 | PCNfR_dmd | Demanded corrected fan speed | rpm |
| 20 | W31 | HPT coolant bleed | lbm/s |
| 21 | W32 | LPT coolant bleed | lbm/s |
Sample run-to-failure data from an engine case.
| Operating Cycle | Sensor #1 (°R) | Sensor #2 (°R) | Sensor #3 (°R) | … | Sensor #21 (lbm/s) |
|---|---|---|---|---|---|
| 1 | 518.67 | 641.82 | 1589.70 | … | 23.42 |
| 2 | 518.67 | 642.15 | 1591.82 | … | 23.42 |
| 3 | 518.67 | 642.35 | 1587.99 | … | 23.34 |
| … | … | … | … | … | … |
| 192 | 518.67 | 643.54 | 1601.41 | … | 22.96 |
Prognostic accuracies of different feature dimensions on the cross-validation set.
| Feature Dimension | Prognostic Accuracy (%) |
|---|---|
| 4 | 97.23 |
| 8 | 97.73 |
| 12 | 98.13 |
| 16 | 97.55 |
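The table above reflects a standard model-selection loop over the bottleneck size. A hedged sketch of that loop on synthetic data, with PCA standing in for the auto-encoder bottleneck and a random forest standing in for the deep forest (all names and data here are illustrative, not the paper's setup):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 21))       # synthetic 21-sensor windows
y = rng.integers(0, 3, size=300)     # synthetic 3-state labels

scores = {}
for dim in (4, 8, 12, 16):
    # Reduce to `dim` features, then score a classifier by 5-fold CV.
    Z = PCA(n_components=dim, random_state=1).fit_transform(X)
    clf = RandomForestClassifier(n_estimators=50, random_state=1)
    scores[dim] = cross_val_score(clf, Z, y, cv=5).mean()

best_dim = max(scores, key=scores.get)
```

In the paper's experiment the same kind of sweep selects 12 dimensions, which gave the highest cross-validation accuracy (98.13%).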
Figure 6. Confusion matrices (%) of different prognostic models on the test engines: (a) LSTM network; (b) Bi-LSTM network; (c) deep forest; and (d) deep auto-encoder + deep forest.
Decision results of some test engines at different inspection moments.
| Running Cycle | Real RUL | Deg1 (%) | Deg2 (%) | Deg3 (%) | Order | Stock | Maintenance |
|---|---|---|---|---|---|---|---|
| 90 | 45 | 96.64 | 3.19 | 0.17 | 0 | 0 | 0 |
| 100 | 35 | 97.01 | 2.85 | 0.14 | 0 | 0 | 0 |
| 110 | 25 | 76.51 | 22.47 | 1.01 | 1 | 0 | 0 |
| 120 | 15 | 37.31 | 58.54 | 4.15 | 1 | 0 | 0 |
| 130 | 5 | 2.29 | 15.10 | 82.61 | 1 | 1 | 1 |
| … | … | … | … | … | … | … | … |
| 300 | 41 | 94.75 | 5.02 | 0.23 | 0 | 0 | 0 |
| 310 | 31 | 91.33 | 8.28 | 0.39 | 0 | 0 | 0 |
| 320 | 21 | 26.81 | 65.99 | 7.20 | 1 | 0 | 0 |
| 330 | 11 | 6.57 | 39.58 | 53.85 | 1 | 0 | 1 |
| … | … | … | … | … | … | … | … |
| 110 | 45 | 99.95 | 0.05 | 0.00 | 0 | 0 | 0 |
| 120 | 35 | 99.90 | 0.09 | 0.01 | 0 | 0 | 0 |
| 130 | 25 | 72.29 | 26.24 | 1.47 | 1 | 0 | 0 |
| 140 | 15 | 42.48 | 53.87 | 3.65 | 1 | 0 | 0 |
| 150 | 5 | 1.72 | 11.24 | 87.04 | 1 | 1 | 1 |
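The Order/Stock/Maintenance flags in the table can be read as decisions driven by the predicted degradation-state probabilities. The sketch below reproduces the table's flags with simple probability thresholds; the threshold values are hypothetical illustrations chosen to fit these rows, not the paper's cost-optimised policy:

```python
def decide(p_deg2, p_deg3, order_th=0.20, maint_th=0.50, stock_th=0.80):
    """Map degradation-state probabilities to (order, stock, maintenance) flags.

    p_deg2, p_deg3 : probabilities of the intermediate and severe
                     degradation states (Deg2 and Deg3 in the table).
    Thresholds are illustrative, not the paper's optimised values.
    """
    order = int(p_deg2 + p_deg3 >= order_th)   # order a spare part early
    maintenance = int(p_deg3 >= maint_th)      # schedule maintenance
    stock = int(p_deg3 >= stock_th)            # keep the spare in stock
    return order, stock, maintenance
```

For example, `decide(0.2247, 0.0101)` reproduces the (1, 0, 0) row at cycle 110, and `decide(0.1510, 0.8261)` reproduces the (1, 1, 1) row at cycle 130.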
Figure 7. Average maintenance cost rates using different failure prognosis methods for 20 test engines.