TY - GEN
T1 - Cloud-Aided Interference Management with Cache-Enabled Edge Nodes and Users
AU - Shariatpanahi, Seyed Pooya
AU - Zhang, Jingjing
AU - Simeone, Osvaldo
AU - Khalaj, Babak Hossein
AU - Maddah-Ali, Mohammad Ali
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/7
Y1 - 2019/7
N2 - This paper considers a cloud-RAN architecture with cache-enabled multi-antenna Edge Nodes (ENs) that deliver content to cache-enabled end-users. The ENs are connected to a central server via limited-capacity fronthaul links, and, based on the information received from the central server and the cached contents, they transmit on the shared wireless medium to satisfy the users' requests. By leveraging cooperative transmission, as enabled by the ENs' caches and fronthaul links, as well as multicasting opportunities provided by the users' caches, a close-to-optimal caching and delivery scheme is proposed. As a result, the minimum Normalized Delivery Time (NDT), a high-SNR measure of delivery latency, is characterized to within a multiplicative constant gap of 3/2 under the assumptions of uncoded caching and fronthaul transmission and of one-shot linear precoding. This result demonstrates the interplay among the fronthaul link capacity, the ENs' caches, and the end-users' caches in minimizing the content delivery time.
AB - This paper considers a cloud-RAN architecture with cache-enabled multi-antenna Edge Nodes (ENs) that deliver content to cache-enabled end-users. The ENs are connected to a central server via limited-capacity fronthaul links, and, based on the information received from the central server and the cached contents, they transmit on the shared wireless medium to satisfy the users' requests. By leveraging cooperative transmission, as enabled by the ENs' caches and fronthaul links, as well as multicasting opportunities provided by the users' caches, a close-to-optimal caching and delivery scheme is proposed. As a result, the minimum Normalized Delivery Time (NDT), a high-SNR measure of delivery latency, is characterized to within a multiplicative constant gap of 3/2 under the assumptions of uncoded caching and fronthaul transmission and of one-shot linear precoding. This result demonstrates the interplay among the fronthaul link capacity, the ENs' caches, and the end-users' caches in minimizing the content delivery time.
UR - http://www.scopus.com/inward/record.url?scp=85073145094&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85073145094&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2019.8849688
DO - 10.1109/ISIT.2019.8849688
M3 - Conference contribution
AN - SCOPUS:85073145094
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 737
EP - 741
BT - 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE International Symposium on Information Theory, ISIT 2019
Y2 - 7 July 2019 through 12 July 2019
ER -