Both mobile edge cloud (MEC) and software-defined networking (SDN) are key technologies for next-generation mobile networks. In this paper, we simultaneously optimize energy consumption and quality of experience (QoE) for video streaming over software-defined mobile networks (SDMN) with MEC. Specifically, we propose to jointly consider buffer dynamics, video quality adaptation, edge caching, video transcoding, and transmission. We formulate two optimization problems, which can be depicted as a constrained Markov decision process (CMDP) and a Markov decision process (MDP). We then transform the CMDP problem into a regular MDP using the Lyapunov optimization technique. We apply the asynchronous advantage actor-critic (A3C) algorithm, a deep reinforcement learning (DRL) method, to solve the resulting MDP problems. Simulation results show that the proposed scheme achieves energy saving and QoE enhancement while satisfying the corresponding constraints.
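For readers unfamiliar with A3C, the sketch below illustrates the actor-critic network and the advantage-based update that the algorithm family is built on. It is a minimal, generic example and not the authors' implementation; the state encoding (e.g., buffer occupancy, channel quality, cache status), the action space (e.g., bitrate or transcoding level), and all dimensions are hypothetical placeholders.

```python
# Minimal actor-critic sketch in the spirit of A3C (illustrative only).
# State/action semantics and sizes are hypothetical, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ActorCritic(nn.Module):
    """Shared trunk with a policy head (actor) and a value head (critic)."""

    def __init__(self, state_dim: int, action_dim: int, hidden: int = 128):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.policy_head = nn.Linear(hidden, action_dim)  # action logits
        self.value_head = nn.Linear(hidden, 1)            # state value V(s)

    def forward(self, state: torch.Tensor):
        h = self.trunk(state)
        return self.policy_head(h), self.value_head(h)


def a3c_loss(logits, values, actions, returns,
             value_coef: float = 0.5, entropy_coef: float = 0.01):
    """Policy-gradient loss with advantage baseline, value regression,
    and entropy regularization, as in the A3C objective."""
    advantages = returns - values.squeeze(-1)             # A(s,a) = R - V(s)
    log_probs = F.log_softmax(logits, dim=-1)
    chosen = log_probs.gather(1, actions.unsqueeze(-1)).squeeze(-1)
    policy_loss = -(chosen * advantages.detach()).mean()
    value_loss = advantages.pow(2).mean()
    entropy = -(log_probs.exp() * log_probs).sum(dim=-1).mean()
    return policy_loss + value_coef * value_loss - entropy_coef * entropy


if __name__ == "__main__":
    # Hypothetical rollout: 16 transitions, 8-dim state, 4 candidate actions.
    net = ActorCritic(state_dim=8, action_dim=4)
    opt = torch.optim.Adam(net.parameters(), lr=1e-4)
    states = torch.randn(16, 8)
    actions = torch.randint(0, 4, (16,))
    returns = torch.randn(16)                             # discounted returns
    logits, values = net(states)
    loss = a3c_loss(logits, values, actions, returns)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the paper's setting, the reward could fold the Lyapunov drift-plus-penalty term into the per-step return so that the constrained problem is handled as a regular MDP; the sketch above leaves the reward definition open.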

Additional Metadata
Keywords Adaptive video streaming, Deep reinforcement learning, Mobile edge cloud, Software defined mobile networks
Persistent URL dx.doi.org/10.1109/GLOBECOM38437.2019.9013634
Conference 2019 IEEE Global Communications Conference, GLOBECOM 2019
Citation
Luo, J. (Jia), Yu, F.R., Chen, Q. (Qianbin), Tang, L. (Lun), & Zhang, Z. (Zhicai). (2019). Adaptive video streaming in software-defined mobile networks: A deep reinforcement learning approach. In 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings. doi:10.1109/GLOBECOM38437.2019.9013634