Power Optimization in Heterogeneous Cellular Networks Using Deep Reinforcement Learning Based Sleep Control Algorithm
DOI: https://doi.org/10.15662/IJEETR.2026.0802089

Keywords: Heterogeneous Cellular Networks (HetNets), Deep Reinforcement Learning (DRL), Energy Efficiency, Base Station Sleep Control, Small Cell Networks, Quality of Service (QoS)

Abstract
The rapid growth of mobile users and increasing data traffic have significantly increased energy consumption in modern cellular networks. Heterogeneous Cellular Networks (HetNets), which combine macro and small cells, are widely deployed to improve network coverage and capacity. However, dense base station deployment raises power consumption, and many base stations remain active even under low traffic conditions, wasting energy. Improving energy efficiency while maintaining network performance has therefore become an important research challenge. This paper proposes a Deep Reinforcement Learning (DRL) based sleep control algorithm to improve energy efficiency in heterogeneous cellular networks. The DRL agent continuously observes network parameters such as traffic load, the number of active users, and base station utilization, and on that basis decides whether each small cell base station should remain active or switch to sleep mode. During low traffic conditions, selected small cells are dynamically switched to sleep mode to reduce energy consumption; when traffic demand increases, the base stations are reactivated to maintain service quality and network coverage. The proposed approach aims to reduce overall network power consumption while ensuring Quality of Service (QoS) and maintaining system performance. Its effectiveness is evaluated against traditional algorithms such as Branch and Bound (BnB) and dynamic switching methods, and the results demonstrate that the DRL-based sleep control algorithm provides better energy efficiency and improved network performance in heterogeneous cellular networks.
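The sleep-control decision described above can be illustrated with a minimal sketch. This is not the paper's algorithm: it substitutes tabular Q-learning for the deep network, models a single small cell with traffic load discretized into four levels, and uses assumed power and QoS-penalty values (`P_ACTIVE`, `P_SLEEP`, `QOS_PENALTY` are all hypothetical). The reward trades power consumption against a penalty for sleeping while the cell is loaded, so the learned policy sleeps at low load and stays active at high load.

```python
import random

# Hypothetical toy model: one small cell, traffic load discretized into 4 levels.
P_ACTIVE, P_SLEEP = 10.0, 1.0   # assumed power draw (W) in each mode
QOS_PENALTY = 50.0              # assumed penalty for sleeping under high load

STATES = tuple(range(4))        # 0 = idle ... 3 = peak traffic
ACTIONS = (0, 1)                # 0 = stay active, 1 = sleep

def reward(state, action):
    """Negative power cost, plus a QoS penalty if we sleep while loaded."""
    if action == 1:  # sleep
        return -P_SLEEP - (QOS_PENALTY if state >= 2 else 0.0)
    return -P_ACTIVE

def train(episodes=5000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)  # observe current traffic level
        # Epsilon-greedy action selection: explore sometimes, else exploit.
        if rng.random() < eps:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        r = reward(s, a)
        s2 = rng.choice(STATES)  # traffic evolves randomly in this toy model
        # Standard Q-learning update toward the bootstrapped target.
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
    return q

q = train()
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
print(policy)  # sleep (1) at low load, stay active (0) at high load
```

In the full problem a deep network replaces the Q-table because the state (per-cell loads, user counts, utilization across many cells) is too large to enumerate, but the control loop has the same shape: observe, choose active/sleep, receive an energy-minus-QoS reward, update.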