Main Authors: Gabriel Ferreira, Priscila Barreto, Marcos Caetano, Luis Pacheco - University of Brasilia

Focus Area: 

Artificial Intelligence

Who stands to benefit and how: 

Mobile network operators and end-users. Mobile networks benefit from improved spectral efficiency, providing better service while minimizing spectrum usage, which may translate into lower band licensing costs. End-users benefit from better service and lower costs.

Position Paper: 

5G networks promise Enhanced Mobile Broadband (eMBB), Massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC). Beyond these three scenarios, a fourth, "access to remote areas", can be included, in which a large cell is the main requirement. The number of mobile devices is increasing at an enormous rate with the advent of IoT, machine-to-machine communication and always-connected devices [1]. These connected devices have wildly different traffic patterns, and the infrastructure will have to support all kinds of traffic with different latency, throughput and packet-loss requirements. The already crowded radio spectrum is expected to become even more congested and must be used optimally. Radio resource allocation is essential to guarantee a minimum level of service: the channel is partitioned in both time and frequency into resource blocks, which are assigned to different users.
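To make the time-frequency partitioning concrete, the following minimal sketch models the channel as a grid of resource blocks, each assignable to at most one user. The grid dimensions and user names are illustrative assumptions, not values from the paper.

```python
# Time-frequency resource grid sketch (hypothetical sizes):
# each cell is one resource block (RB), assigned to at most one user.

NUM_SUBCHANNELS = 4   # frequency axis (assumed for illustration)
NUM_SLOTS = 5         # time axis (assumed for illustration)

# grid[f][t] holds the id of the user the RB is assigned to, or None.
grid = [[None] * NUM_SLOTS for _ in range(NUM_SUBCHANNELS)]

def assign(grid, user, freq, slot):
    """Assign one resource block to a user if it is still free."""
    if grid[freq][slot] is None:
        grid[freq][slot] = user
        return True
    return False

assign(grid, "ue1", 0, 0)
assign(grid, "ue2", 1, 0)
print(grid[0][0], grid[1][0])  # ue1 ue2
```

A scheduler's job is deciding, slot by slot, which user each free cell of this grid should go to.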

Optimizing radio resource allocation is an NP-hard problem [2] with multiple conflicting goals: latency, throughput, fairness and spectral efficiency. Several heuristics are already used in real systems to improve resource scheduling, but none of them comes close to the optimal solution, which requires knowledge of the future and cannot feasibly be computed in real time.
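One widely deployed heuristic of the kind mentioned above is proportional fair (PF) scheduling, which trades throughput against fairness by granting each resource block to the user with the highest ratio of instantaneous rate to average throughput. The sketch below uses invented numbers purely to illustrate the metric; it is not a scheduler from the paper.

```python
# Proportional fair (PF) metric sketch: choose the user maximizing
# instantaneous rate divided by historical average throughput.

def pf_schedule(inst_rate, avg_tput):
    """Return the index of the user with the highest PF metric."""
    metrics = [r / max(a, 1e-9) for r, a in zip(inst_rate, avg_tput)]
    return max(range(len(metrics)), key=metrics.__getitem__)

# Two users: user 0 has the better channel right now,
# but user 1 has been starved (low average throughput).
chosen = pf_schedule(inst_rate=[10.0, 6.0], avg_tput=[8.0, 1.0])
print(chosen)  # 1 -> fairness wins: 6/1 = 6.0 beats 10/8 = 1.25
```

The trade-off is visible in the example: a pure max-rate scheduler would pick user 0, while PF favors the starved user, improving fairness at some cost in aggregate throughput.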

A possible solution to the scheduling problem is to learn and exploit the temporal and geographical correlations of users and their traffic. Real traffic data can be used to train an artificial neural network (ANN) in the cloud, where computing power is abundant, so that it allocates resource blocks better than competing algorithms; the trained network can then run on the edge cloud, alongside the cloud RAN. The main counterarguments against using neural networks for radio resource scheduling focus on three questions: how long the neural network pipeline takes; whether a generic ANN can efficiently schedule resources across different scenarios; and whether the Key Performance Indicators (KPIs) established in the SLAs are met.

The entire neural network pipeline should take no more than a few milliseconds, so as not to add delay to high-priority traffic. A distributed execution framework suitable for real-time operation that includes a neural network pipeline is proposed in [3].

Guaranteeing the SLAs is of utmost importance, and there are resource schedulers that focus strictly on that [4], but those can result in poor service for non-guaranteed-bitrate services. The output of a QoS-based scheduler can serve as training data for supervised learning, or as a baseline against which a reinforcement learning reward is computed.

How efficiently a neural network can allocate resources depends on its architecture, the information that can be extracted from the input data, the temporal relations in that data, and the learning algorithm. The lack of publicly available mobile network traffic captures is a problem [5]. As a workaround, simulated traffic can be used to train and validate the neural network scheduling strategy, but real-world performance would still need to be validated with real traffic.

In the 5G-Range project we are working on an ANN-based scheduler for the ns-3 simulator, using a supervised training approach with simulated traffic as input and the output of the Channel and QoS-Aware scheduler as the ground truth. The implementation will be tested in a 5G remote-area scenario with opportunistic use of TV White Spaces (TVWS). Preliminary results produced by the trained scheduler are similar to the QoS-Aware ground truth. Future work will evaluate the use of an offline scheduler (e.g. a genetic algorithm) to verify whether it can improve the ANN results.
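The supervised ("imitation") setup described above can be sketched in miniature: a teacher scheduler labels which user should receive the next resource block, and a small network learns to copy those decisions. Everything here is an illustrative stand-in; the toy teacher, the two-feature state, and the one-layer softmax model are assumptions, not the project's actual Channel and QoS-Aware scheduler or ANN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher(state):
    # Toy stand-in for a QoS-aware scheduler: grant the RB to the
    # user with the best score (e.g. channel quality x queue delay).
    return int(np.argmax(state))

# Training data: random 2-user states, labeled by the teacher.
X = rng.normal(size=(500, 2))
y = np.array([teacher(s) for s in X])

# One-layer softmax "network" trained by full-batch gradient descent
# to imitate the teacher's allocation decisions.
W = np.zeros((2, 2))
for _ in range(300):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(2)[y]
    W -= 0.1 * X.T @ (p - onehot) / len(X)

accuracy = (np.argmax(X @ W, axis=1) == y).mean()
print(f"imitation accuracy: {accuracy:.2f}")
```

In the actual project the same idea scales up: the QoS-Aware scheduler's decisions on simulated ns-3 traffic provide the labels, and the trained network is then evaluated against that ground truth.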

[1] Ericsson. Mobility Report, June 2019. Available at:

[2] W. Cheng, X. Cheng, T. Znati, X. Lu and Z. Lu, “The Complexity of Channel Scheduling in Multi-Radio Multi-Channel Wireless Networks” in IEEE INFOCOM 2009 - The 28th Conference on Computer Communications. doi: 10.1109/INFCOM.2009.5062068

[3] R. Nishihara, I. Stoica, P. Moritz, S. Wang, A. Tumanov, W. Paul, J. Schleier-Smith, R. Liaw, M. Niknami and M. I. Jordan, “Real-Time Machine Learning” in Proceedings of the 16th Workshop on Hot Topics in Operating Systems – HotOS’17. doi: 10.1145/3102980.3102998

[4] P. Ameigeiras, J. Navarro-Ortiz, P. Andres-Maldonado, J. M. Lopez-Soler, J. Lorca, Q. Perez-Tarrero and R. Garcia-Perez, “3GPP QoS-based scheduling framework for LTE” in EURASIP Journal on Wireless Communications and Networking, December 2016. doi: 10.1186/s13638-016-0565-9

[5] D. Naboulsi, M. Fiore, S. Ribot and R. Stanica, "Large-Scale Mobile Traffic Analysis: A Survey" in IEEE Communications Surveys & Tutorials, vol. 18, no. 1, pp. 124-161, First quarter 2016. doi: 10.1109/COMST.2015.2491361