Abstract
With the proliferation of devices and the availability of computing servers, mobile-edge computing (MEC) has gained popularity for executing various computational tasks. MEC offers computing services at the network's edge, providing user equipment (UE) with reduced latency for their applications. However, determining a suitable offloading policy for UEs in MEC, considering wireless resource allocation and power management, is computationally demanding; the problem is NP-hard, making it challenging to find an optimal solution within a reasonable time frame. In this work, we propose a meta-reinforcement learning (MRL)-based computational task offloading and power control mechanism for UEs in a resource-constrained MEC network to tackle the NP-hardness of the problem. We first formulate an optimization problem that maximizes UE computation efficiency by minimizing the power UEs consume for local computing and uplink transmission in the MEC network. The system supports both binary offloading (full offloading or full local computing) and partial offloading schemes. Our proposed MRL algorithm can determine a suitable offloading policy for UEs within a short time. Unlike traditional deep reinforcement learning algorithms, our approach can quickly obtain proper solutions in a new environment. Extensive simulation results demonstrate the feasibility of our proposed work.
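The abstract describes the objective only in words; as a rough illustration, computation efficiency is commonly defined as computed bits per unit of energy. The LaTeX sketch below is an assumption based on that convention, with illustrative symbols ($\alpha_i$, $D_i$, $p_i$, $t_i$) rather than the paper's exact notation:

```latex
% Hedged sketch of a computation-efficiency objective (bits per joule).
% alpha_i is UE i's offloading ratio, D_i^loc / D_i^off the bits computed
% locally / at the edge, p_i and t_i the corresponding powers and times;
% all symbols are illustrative, not the paper's notation.
\begin{equation}
  \max_{\boldsymbol{\alpha},\,\mathbf{p}} \;
  \sum_{i=1}^{N}
  \frac{(1-\alpha_i)\, D_i^{\mathrm{loc}} + \alpha_i\, D_i^{\mathrm{off}}}
       {p_i^{\mathrm{loc}}\, t_i^{\mathrm{loc}} + p_i^{\mathrm{tx}}\, t_i^{\mathrm{tx}}},
  \qquad
  \alpha_i \in \{0,1\} \ \text{(binary)} \ \text{ or } \ \alpha_i \in [0,1] \ \text{(partial)}.
\end{equation}
```

The constraint on $\alpha_i$ captures the two offloading schemes the abstract names: binary offloading forces each task to run entirely locally or entirely at the edge, while partial offloading splits it between the two.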
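The abstract does not state which meta-RL algorithm is used, so the sketch below assumes a Reptile-style first-order meta-update over a REINFORCE policy gradient; the `OffloadPolicy` network, the `task.rollout` interface, and all hyperparameters are hypothetical, meant only to show how a meta-trained policy can adapt to a new MEC environment in a few gradient steps:

```python
# Minimal first-order meta-RL sketch (Reptile-style) for offloading
# policies, assuming a REINFORCE-style policy gradient per task. The
# environment interface (sample_task, rollout) and all hyperparameters
# are illustrative assumptions, not the paper's implementation.
import copy
import torch
import torch.nn as nn

class OffloadPolicy(nn.Module):
    """Maps UE state features to offloading/power action logits."""
    def __init__(self, state_dim: int = 8, hidden: int = 64, actions: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def inner_adapt(policy, task, steps=3, lr=1e-2):
    """Adapt a copy of the meta-policy to one MEC environment (task)."""
    adapted = copy.deepcopy(policy)
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    for _ in range(steps):
        states, actions, returns = task.rollout(adapted)   # assumed API
        logp = torch.log_softmax(adapted(states), dim=-1)
        chosen = logp.gather(1, actions.unsqueeze(1)).squeeze(1)
        loss = -(chosen * returns).mean()                  # REINFORCE loss
        opt.zero_grad(); loss.backward(); opt.step()
    return adapted

def meta_train(policy, sample_task, meta_iters=100, meta_lr=0.1):
    """Reptile outer loop: move meta-weights toward adapted weights."""
    for _ in range(meta_iters):
        adapted = inner_adapt(policy, sample_task())
        with torch.no_grad():
            for p, q in zip(policy.parameters(), adapted.parameters()):
                p += meta_lr * (q - p)   # interpolate toward task optimum
    return policy
```

After `meta_train`, a few calls to `inner_adapt` on a previously unseen environment yield an adapted policy; this fast adaptation is the property the abstract contrasts with conventional deep RL, which must be retrained for each new environment.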
Original language | English (US)
---|---
Pages (from-to) | 16722-16730
Number of pages | 9
Journal | IEEE Internet of Things Journal
Volume | 11
Issue number | 9
State | Published - May 1 2024
All Science Journal Classification (ASJC) codes
- Signal Processing
- Information Systems
- Hardware and Architecture
- Computer Science Applications
- Computer Networks and Communications
Keywords
- Computation efficiency
- edge computing
- meta-reinforcement learning (MRL)
- power control
- task offloading