Energy-Efficient Ultra-Dense Network with Deep Reinforcement Learning

Published 25 Dec 2021 in cs.IT and math.IT | (2112.13189v1)

Abstract: With the explosive growth in mobile data traffic, the ultra-dense network (UDN), in which a large number of small cells are densely deployed on top of macro cells, has received a great deal of attention in recent years. While UDN offers a number of benefits, the upsurge of energy consumption caused by the dense deployment of small cells has become a major bottleneck in achieving the primary goal of 5G+ and 6G, viz., a 100-fold increase in throughput. In recent years, an approach that reduces the energy consumption of base stations (BSs) by selectively turning off lightly loaded BSs, referred to as the sleep mode technique, has been suggested. However, determining the appropriate active/sleep modes of BSs is a difficult task owing to the huge computational overhead and the inefficiency caused by frequent BS mode conversion. The aim of this paper is to propose a deep reinforcement learning (DRL)-based approach to reduce the energy consumption of UDN. A key ingredient of the proposed scheme is a decision selection network that reduces the size of the action space. Numerical results show that the proposed scheme significantly reduces the energy consumption of UDN while ensuring the rate requirement of the network.
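To make the action-space-reduction idea concrete, the sketch below is a toy illustration, not the paper's DRL implementation: the candidate-selection step plays the role of the decision selection network (here a simple lightest-load heuristic rather than a learned network), and a brute-force search over the reduced 2^K action space stands in for the DRL agent. All names, the energy model, and the rate constraint are hypothetical and chosen only to show why shrinking the action space from 2^N to 2^K helps.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BS = 16  # densely deployed small cells
K = 4      # candidates kept by the decision-selection step

def candidate_selection(loads, k=K):
    """Stand-in for the decision selection network: keep only the k most
    lightly loaded BSs as sleep candidates, shrinking the action space
    from 2^N to 2^k."""
    return np.argsort(loads)[:k]

def energy(active_mask, p_active=1.0, p_sleep=0.1):
    """Toy energy model: active BSs burn p_active, sleeping BSs p_sleep."""
    return p_active * active_mask.sum() + p_sleep * (~active_mask).sum()

def rate_ok(active_mask, loads, capacity=1.2):
    """Toy rate requirement: remaining active BSs must absorb total load."""
    return loads.sum() <= capacity * active_mask.sum()

loads = rng.random(N_BS)           # per-BS traffic load
cand = candidate_selection(loads)  # sleep candidates only

all_on = np.ones(N_BS, dtype=bool)
best_mask, best_e = all_on, energy(all_on)
# Enumerate the reduced action space (2^K instead of 2^N actions);
# the paper replaces this search with a DRL agent.
for a in range(2 ** K):
    mask = all_on.copy()
    for j, bs in enumerate(cand):
        if (a >> j) & 1:
            mask[bs] = False       # put candidate BS to sleep
    if rate_ok(mask, loads) and energy(mask) < best_e:
        best_mask, best_e = mask, energy(mask)
```

With N = 16 the full action space has 65,536 active/sleep combinations, while the reduced space has only 16; the same reduction is what makes the DRL agent's decision tractable in the paper's setting.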

Citations (24)
