Songpu Ai, Antorweep Chakravorty, Chunming Rong.
Abstract
Progress in energy and IoT technologies has led to an increasingly complex electricity environment in low-voltage local microgrids, along with the spread of electric vehicles, micro-generation, and local storage. A home energy management system (HEMS) is required to efficiently integrate and manage household energy micro-generation, consumption, and storage, in order to realize decentralized local energy systems at the community level. Domestic power demand prediction is of great importance for a HEMS in realizing load balancing as well as other smart energy solutions supported by IoT techniques. Artificial neural networks of various network types (e.g., DNN, LSTM/GRU-based RNN) and other configurations are widely used for energy prediction. However, the network configuration in each study is generally selected case by case through empirical or enumerative approaches. Moreover, the commonly used network initialization methods assign parameter values from random numbers, which causes diversity in model performance, including learning efficiency and forecast accuracy. In this paper, an evolutionary ensemble neural network pool (EENNP) method is proposed to automatically obtain a population of well-performing networks with proper combinations of configuration and initialization. In the experimental study, power demand prediction for multiple households is explored in three application scenarios: optimizing the potential network configuration set, forecasting single-household power demand, and refilling missing data. The impacts of the evolutionary parameters on model performance are also investigated. The experimental results illustrate that the proposed method achieves better solutions in the considered scenarios: the potential network configuration set optimized using EENNP achieves a result similar to manual optimization, and the household demand predictions and missing-data refilling outperform the naïve and simple predictors.
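The abstract's evaluate-select-refill idea can be illustrated with a small sketch. This is not the authors' implementation: the configuration space (layer counts, unit choices), the pool and survivor sizes, and the stand-in `evaluate` function are all assumptions for illustration; in the actual method, fitness would come from training each network on household demand data.

```python
import random

# Network types named in the paper; layer/unit ranges below are assumptions.
NETWORK_TYPES = ["DNN", "LSTM-RNN", "GRU-RNN"]

def random_config(max_layers=10, unit_choices=(16, 32, 64, 128)):
    """Sample one network configuration from a potential configuration set."""
    n_layers = random.randint(1, max_layers)
    return {
        "type": random.choice(NETWORK_TYPES),
        "layers": [random.choice(unit_choices) for _ in range(n_layers)],
        "seed": random.randrange(2**32),  # initialization seed, a source of diversity
    }

def evaluate(config):
    """Stand-in fitness: the real method would train the network on demand
    data and return a validation error (lower is better)."""
    # Toy proxy so the sketch runs; slightly penalizes very deep networks.
    return random.random() + 0.05 * len(config["layers"])

def evolve_pool(pool_size=20, survivors=5, generations=3):
    """Evolve a pool of configurations by repeated evaluate-select-refill."""
    pool = [random_config() for _ in range(pool_size)]
    for _ in range(generations):
        scored = sorted(pool, key=evaluate)            # lower error = fitter
        elite = scored[:survivors]                     # keep the best performers
        refill = [random_config() for _ in range(pool_size - survivors)]
        pool = elite + refill                          # next generation
    return sorted(pool, key=evaluate)[:survivors]
```

The surviving population, rather than a single network, is what the ensemble uses for prediction.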
Keywords: HEMS; artificial neural network; demand prediction; ensemble learning; evolutionary algorithm; gated recurrent unit; long short-term memory; machine learning; missing data; smart sensor
Year: 2019 PMID: 30744206 PMCID: PMC6387375 DOI: 10.3390/s19030721
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The architecture of special units in an RNN: (a) an LSTM unit; (b) a GRU.
Figure 2. An overview of the proposed method.
Figure 3. Workflow of the evolutionary ensemble method.
Figure 4. The distribution of the interval between adjacent rows.
A rough interval distribution of the rows with missing data.
| Interval (s) | Proportion (%) |
|---|---|
| [15,60) | 45.8 |
| [60,120) | 4.2 |
| [120,180) | 44.8 |
| >180 | 5.2 |
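The interval proportions in the table above can be computed by binning the gaps between adjacent sample timestamps. This is a sketch of that bookkeeping, not the paper's code; the 15 s lower cutoff is an assumption inferred from the table's first bin.

```python
from bisect import bisect_right

def interval_proportions(timestamps, edges=(15, 60, 120, 180)):
    """Bin the gaps (in seconds) between adjacent timestamps into the
    intervals from the table: [15,60), [60,120), [120,180), >180.
    Gaps below 15 s (assumed nominal sampling period) are ignored."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:]) if b - a >= 15]
    labels = ["[15,60)", "[60,120)", "[120,180)", ">180"]
    counts = {label: 0 for label in labels}
    for g in gaps:
        idx = bisect_right(edges, g) - 1   # index of the containing bin
        counts[labels[min(idx, 3)]] += 1
    total = len(gaps)
    # Return percentages, matching the table's "Proportion (%)" column.
    return {k: 100 * v / total for k, v in counts.items()} if total else counts
```

For example, timestamps `[0, 15, 45, 110, 250, 500]` give gaps of 15, 30, 65, 140, and 250 s, yielding 40% in [15,60) and 20% in each remaining bin.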
Figure 5. Power demand of a day.
Figure 6. The performance of 500 randomly initialized networks generated from the potential network configuration set: (a) distribution of training stopping epochs; (b) distribution of network performance.
Number of layers of the surviving networks.
| Network Type | 1-Layer | 2-Layer | 3-Layer | 4-Layer | 5-Layer | 6-Layer | 7-Layer | 8-Layer | 9-Layer | 10-Layer |
|---|---|---|---|---|---|---|---|---|---|---|
| DNN | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| LSTM-based RNN | 3 | 3 | 4 | 2 | 0 | 0 | 0 | 0 | 0 | 0 |
| GRU-based RNN | 8 | 4 | 1 | 4 | 2 | 0 | 0 | 0 | 0 | 0 |
Figure 7. The performance of EENNP using different potential network configuration sets: (a) the performance of two runs of EENNP with the optimized potential network configuration set, and of two networks within the final surviving population; (b) the performance of four runs of EENNP, two using the optimized potential network configuration set and two using the unoptimized set; (c) a detailed comparison of the four EENNP runs.
Figure 8. Performance of EENNP on different households.
Figure 9. The impact of evolutionary parameters on model performance.