FEDERATED LEARNING WITH STOCHASTIC GRADIENT DESCENT FOR SMART METER ENERGY FORECASTING

Bharat Khushalani

Abstract


Background. Smart meters are widely used to monitor household energy consumption and help improve energy efficiency. However, collecting this data in a centralized location raises privacy concerns, as detailed consumption records can reveal sensitive household behavior. Federated learning provides an alternative approach by allowing models to be trained directly on user devices without sending raw data to a central server.

Materials and Methods. We developed a simulation-based framework for evaluating federated learning on short-term electricity-load forecasting. We generated synthetic hourly energy-consumption data for 100 simulated households, incorporating daily usage cycles and household-specific patterns. A small neural network was trained locally on each household's data using stochastic gradient descent (SGD), and the resulting model updates, rather than raw readings, were sent to a central server and aggregated into a shared global model.
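
To make the training procedure concrete, the sketch below illustrates a federated loop of the kind described above. It is a hedged illustration rather than the authors' implementation: it assumes a small PyTorch feed-forward network, synthetic hourly series built from a daily sine cycle plus a household-specific level and noise, one-step-ahead forecasting from a 24-hour input window, and server-side weight averaging in the style of federated averaging; all names and sizes are illustrative.

# Minimal sketch of the federated loop described above (illustrative only,
# not the authors' code). Assumes PyTorch; all sizes and names are made up.
import numpy as np
import torch
import torch.nn as nn

HOUSEHOLDS, HOURS, WINDOW = 100, 24 * 60, 24      # 100 homes, 60 days, 24-hour input window

def synthetic_load(seed):
    """Hourly consumption: daily sine cycle + household-specific level + noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(HOURS)
    daily = 1.0 + 0.5 * np.sin(2 * np.pi * t / 24)
    return daily + 0.5 * rng.random() + rng.normal(0.0, 0.1, HOURS)

def make_windows(series):
    """Past WINDOW hours -> next-hour target pairs for one household."""
    x = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
    y = series[WINDOW:]
    return (torch.tensor(x, dtype=torch.float32),
            torch.tensor(y, dtype=torch.float32).unsqueeze(1))

def make_model():
    return nn.Sequential(nn.Linear(WINDOW, 32), nn.ReLU(), nn.Linear(32, 1))

def local_sgd(global_state, data, epochs=1, lr=0.01):
    """Copy the global model and run a few SGD epochs on one household's data."""
    model = make_model()
    model.load_state_dict(global_state)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    x, y = data
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.mse_loss(model(x), y).backward()
        opt.step()
    return model.state_dict()

clients = [make_windows(synthetic_load(seed)) for seed in range(HOUSEHOLDS)]
global_model = make_model()

for _ in range(50):                                # communication rounds
    updates = [local_sgd(global_model.state_dict(), d) for d in clients]
    # Server step: average client weights into the new global model.
    averaged = {k: torch.stack([u[k] for u in updates]).mean(dim=0) for k in updates[0]}
    global_model.load_state_dict(averaged)

In a loop of this shape only model weights leave the device; the raw consumption series never does, which is the privacy property the abstract emphasizes.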

Results and Discussion. The federated model achieved forecasting accuracy nearly equal to that of a traditional centralized model while keeping raw data on the device. The key factors affecting performance were the amount of local training performed before each round of model sharing and the number of households participating in each training round. The approach remained accurate even when only half of the households contributed updates in any given round. Compared with non-collaborative models trained independently by each household, the federated approach improved prediction accuracy substantially. These findings show that strong performance can be achieved while protecting user privacy, using simple models suitable for low-power devices.
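
As a hedged illustration of the two sensitivity factors just mentioned, the sketch below shows how local training effort and per-round participation could be exposed as experiment parameters; the names (FedConfig, select_clients) and default values are hypothetical and not taken from the paper.

# Hypothetical experiment knobs for the sensitivity factors discussed above.
from dataclasses import dataclass
import random

@dataclass
class FedConfig:
    rounds: int = 50            # communication rounds
    local_epochs: int = 1       # local SGD epochs before each upload
    participation: float = 0.5  # fraction of households selected per round
    num_households: int = 100

def select_clients(cfg: FedConfig, rng: random.Random):
    """Sample the households that contribute updates in one round."""
    k = max(1, int(cfg.participation * cfg.num_households))
    return rng.sample(range(cfg.num_households), k)

# Example: 50% participation selects roughly half of the 100 households each
# round, mirroring the partial-participation setting reported above.
round_clients = select_clients(FedConfig(), random.Random(0))

Sweeping these two parameters over a small grid is one way to reproduce the kind of sensitivity analysis the abstract describes.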

Conclusions. This work shows that a well-designed simulation with realistic energy usage data can help evaluate federated learning methods under practical constraints. Even simple models, when trained in a decentralized and privacy-preserving way, can offer useful predictions for smart energy systems. The approach is suitable for real-world deployment and can help advance privacy-respecting energy analytics.

Keywords: Federated Learning, Smart Meters, Energy Forecasting, Stochastic Gradient Descent, Privacy-Preserving Machine Learning, Decentralized Optimization.


DOI: http://dx.doi.org/10.30970/eli.30.2
