Enhancing Diesel Backup Power Forecasting With LSTM, GRU, and Autoencoder-based Input Encoding

Authors

DOI:

https://doi.org/10.23887/janapati.v14i1.92079

Keywords:

Diesel Backup Power Forecasting, Deep Learning, Autoencoder, Bayesian Optimization, Long Short-Term Memory, Gated Recurrent Unit

Abstract

Ensuring a reliable electricity supply is crucial for Indonesia's development. This study applies deep learning to forecast diesel backup power output. One challenge in such predictions is balancing the input sequence length against the number of features: overly long input sequences may degrade model performance. To address this, we used an autoencoder to compress the input sequence, improving prediction accuracy. In addition, because hyper-parameter optimization in deep learning is time-consuming, we employed Bayesian optimization to streamline the search for optimal hyper-parameter settings. The study compares a General Regression Neural Network (GRNN) optimized by the Fruit Fly Optimization Algorithm (FOA) with Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models optimized by a Gaussian Process (GP). Results show that LSTM and GRU models with encoded inputs outperform their non-encoded counterparts. The GRU, combined with an autoencoder and Bayesian-optimized hyper-parameters, achieves the lowest prediction error, demonstrating superior forecasting capability. The dataset, obtained from evaluated feeders in Kapuas District, Central Kalimantan, covers hourly power generation and distribution from October 2017 to September 2018. The data was split into 11 months for training and 1 month for testing, with the training set further divided into 70% for training and 30% for validation. The best-performing model achieved RMSE and MAE values of 27.5824 and 14.9804, respectively. Future research may explore further optimization, feature selection techniques, and extended dataset variations.
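The windowing trade-off and the error metrics described in the abstract can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: `make_windows`, its `seq_len` and `horizon` parameters, and the toy series values are all hypothetical names chosen here; only the general sliding-window technique and the standard RMSE/MAE formulas come from the text.

```python
import math

def make_windows(series, seq_len, horizon=1):
    """Slice a univariate series into (input window, target) pairs.

    seq_len is the input sequence length the abstract discusses: longer
    windows carry more history but enlarge the model input, which is the
    trade-off the autoencoder compression is meant to ease.
    """
    X, y = [], []
    for i in range(len(series) - seq_len - horizon + 1):
        X.append(series[i : i + seq_len])            # input window
        y.append(series[i + seq_len + horizon - 1])  # value to forecast
    return X, y

def rmse(actual, predicted):
    """Root mean squared error, as reported in the study."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mae(actual, predicted):
    """Mean absolute error, as reported in the study."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Toy hourly readings (the paper's real data is Oct 2017 - Sep 2018 feeder output).
series = [10.0, 12.0, 11.5, 13.0, 12.5, 14.0, 13.5, 15.0]
X, y = make_windows(series, seq_len=3)
print(len(X), X[0], y[0])   # 5 [10.0, 12.0, 11.5] 13.0
print(round(rmse([1.0, 2.0], [1.0, 4.0]), 4))  # 1.4142
print(round(mae([1.0, 2.0], [1.0, 4.0]), 4))   # 1.0
```

In the paper's pipeline, the windows produced this way would be compressed by the autoencoder before being fed to the LSTM or GRU forecaster, and the resulting RMSE/MAE on the held-out month would score each hyper-parameter candidate during Bayesian optimization.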

Author Biographies

Prof. Leu Yungho, National Taiwan University of Science and Technology

Department of Information Management, School of Management, National Taiwan University of Science and Technology

Mr. Khabib Mustofa, Universitas Gadjah Mada

Department of Computer Science and Electronics, Faculty of Mathematics and Natural Science, Universitas Gadjah Mada

Mr. Mardhani Riasetiawan, Universitas Gadjah Mada

Department of Computer Science and Electronics, Faculty of Mathematics and Natural Science, Universitas Gadjah Mada

References

A. Gensler, J. Henze, B. Sick, and N. Raabe, “Deep Learning for Solar Power Forecasting – An Approach Using Autoencoder and LSTM Neural Networks,” in 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2016), 2016, pp. 2858–2865.

B. Santoso, W. Anggraeni, H. Pariaman, and M. H. Purnomo, “RNN-Autoencoder Approach for Anomaly Detection in Power Plant Predictive Maintenance Systems,” Int. J. Intell. Eng. Syst., vol. 15, no. 4, pp. 363–381, 2022, doi: 10.22266/ijies2022.0831.33.

S. Grubwinkler and M. Lienkamp, “Energy Prediction for EVs Using Support Vector Regression Methods,” in 7th IEEE International Conference Intelligent Systems IS’2014, 2015, vol. 322, pp. 769–780, doi: 10.1007/978-3-319-11313-5.

D. Kryukov, M. Agafonova, and A. Arestova, “Comparison of Regression and Neural Network Approaches to Forecast Daily Power Consumption,” in IFOST-2016: Power Engineering and Renewable Energy Technologies, 2016, no. 4, pp. 247–250.

J. Gonzalvez, E. Lezmi, T. Roncalli, and J. Xu, “Financial Applications of Gaussian Processes and Bayesian Optimization,” pp. 1–42, 2019, [Online]. Available: https://dx.doi.org/10.2139/ssrn.3344332.

Y. Fu, Z. Li, H. Zhang, and P. Xu, “Using Support Vector Machine to Predict Next Day Electricity Load of Public Buildings with Sub-metering Devices,” in 9th Int. Symp. on Heating, Ventilation and Air Conditioning and 3rd Int. Conf. on Building Energy and Environment, Procedia Eng., vol. 121, pp. 1016–1022, 2015, doi: 10.1016/j.proeng.2015.09.097.

M. Syafruddin, L. Hakim, and D. Despa, “Linear Regression Method for Predicting Long-Term Electric Energy Needs (Case Study of Lampung Province),” J. Inform. dan Tek. Elektro, no. 1, 2014, [Online]. Available: http://journal.eng.unila.ac.id/index.php/jitet/article/download/237/228.

G. Oğcu, O. F. Demirel, and S. Zaim, “Forecasting Electricity Consumption with Neural Networks and Support Vector Regression,” in 8th International Strategic Management Conference, 2012, vol. 58, pp. 1576–1585, doi: 10.1016/j.sbspro.2012.09.1144.

S. Hochreiter and J. Schmidhuber, “Long Short-Term Memory,” Neural Comput., vol. 9, no. 8, pp. 1735–1780, 1997.

K. Greff, R. K. Srivastava, J. Koutník, B. R. Steunebrink, and J. Schmidhuber, “LSTM: A Search Space Odyssey,” IEEE Trans. Neural Networks Learn. Syst., vol. 28, no. 10, pp. 2222–2232, 2017, [Online]. Available: http://arxiv.org/abs/1503.04069.

M. L. Ashari and M. Sadikin, “Prediksi Data Transaksi Penjualan Time Series Menggunakan Regresi LSTM,” J. Nas. Pendidik. Tek. Inform. JANAPATI, vol. 9, no. 1, pp. 1–10, Apr. 2020, doi: 10.23887/janapati.v9i1.19140.

L. Wiranda and M. Sadikin, “Penerapan Long Short Term Memory pada Data Time Series untuk Memprediksi Penjualan Produk PT. Metiska Farma,” J. Nas. Pendidik. Tek. Inform. JANAPATI, vol. 8, no. 3, pp. 184–196, Jan. 2020, doi: 10.23887/janapati.v8i3.19139.

I. N. Yulita, A. Helen, and M. Suryani, “Machine Learning Prediction of Time Series Covid-19 Data in West Java, Indonesia,” J. Nas. Pendidik. Tek. Inform., vol. 12, no. 2, pp. 174–183, 2023, doi: 10.23887/janapati.v12i2.58505.

V. F. Hakim and D. Riana, “Analysis of User Complaints for Telecommunication Brands on X (Twitter) using IndoBERT and Deep Learning,” J. Nas. Pendidik. Tek. Inform., vol. 13, no. 2, pp. 270–279, 2024, doi: 10.23887/janapati.v13i2.76497.

Q. Fournier and D. Aloise, “Empirical comparison between autoencoders and traditional dimensionality reduction methods,” Proc. - IEEE 2nd Int. Conf. Artif. Intell. Knowl. Eng. AIKE 2019, pp. 211–214, 2019, doi: 10.1109/AIKE.2019.00044.

Y. Yaxin, X. Liang, X. Li, Z. Huang, and Y. Zhang, “An Automated Data Mining Framework Using Autoencoders for Feature Extraction and Dimensionality Reduction.”

W. Yu, I. Y. Kim, and C. Mechefske, “Analysis of different RNN autoencoder variants for time series classification and machine prognostics,” Mech. Syst. Signal Process., vol. 149, p. 107322, 2021, doi: 10.1016/j.ymssp.2020.107322.

S. Demir, K. Mincev, K. Kok, and N. G. Paterakis, “Data augmentation for time series regression: Applying transformations, autoencoders and adversarial networks to electricity price forecasting,” Appl. Energy, vol. 304, p. 117695, 2021, doi: 10.1016/j.apenergy.2021.117695.

N. Srivastava, E. Mansimov, and R. Salakhutdinov, “Unsupervised Learning of Video Representations using LSTMs,” Feb. 2015, [Online]. Available: http://arxiv.org/abs/1502.04681.

J. Brownlee, “A Gentle Introduction to LSTM Autoencoders,” 2018. https://machinelearningmastery.com/lstm-autoencoders/.

E. Brochu, V. M. Cora, and N. De Freitas, “A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modelling and Hierachical Reinforcement Learning,” 2010.

P. I. Frazier, “A Tutorial on Bayesian Optimization,” pp. 1–22, 2018, [Online]. Available: http://arxiv.org/abs/1807.02811.

Scikit-Optimize, “skopt module.” https://scikit-optimize.github.io/.

N. P. N. P. Dewi and R. A. Nugroho, “Optimasi General Regression Neural Network dengan Fruit Fly Optimization Algorithm untuk Prediksi Pemakaian Arus Listrik pada Penyulang,” KOMPUTASI J. Ilm. Ilmu Komput. dan Mat., vol. 18, no. 1, pp. 1–12, 2021, doi: 10.33751/komputasi.v18i1.2144.

Keras, “Keras: The Python Deep Learning library.” https://keras.io/.

Neupy, “Neupy-Neural Networks in Python.” http://neupy.com/apidocs/neupy.algorithms.rbfn.grnn.html#neupy.algorithms.rbfn.grnn.GRNN.

D. P. Kingma and J. L. Ba, “Adam: A Method for Stochastic Optimization,” in International Conference on Learning Representations (ICLR), 2015, pp. 1–15, [Online]. Available: http://arxiv.org/abs/1412.6980.

M. Abumohsen, A. Y. Owda, and M. Owda, “Electrical Load Forecasting Using LSTM, GRU, and RNN Algorithms,” Energies, vol. 16, no. 5, 2023, doi: 10.3390/en16052283.

S. Emshagin, W. K. Halim, and R. Kashef, “Short-term Prediction of Household Electricity Consumption Using Customized LSTM and GRU Models,” pp. 11–30, 2022.

J. Chung, C. Gulcehre, K. Cho, and Y. Bengio, “Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling,” 2014.

Published

2025-03-31

How to Cite

Dewi, N. P. N. P., Leu, Y., Mustofa, K., & Riasetiawan, M. (2025). Enhancing Diesel Backup Power Forecasting With LSTM, GRU, and Autoencoder-based Input Encoding. Jurnal Nasional Pendidikan Teknik Informatika : JANAPATI, 14(1). https://doi.org/10.23887/janapati.v14i1.92079

Issue

Section

Articles