Abstract
While recent work has explored streaming volumetric content on-demand, little effort has been devoted to live volumetric video streaming, which has the potential to enable more exciting applications than its on-demand counterpart. To fill this critical gap, in this paper we propose MetaStream, which is, to the best of our knowledge, the first practical live volumetric content capture, creation, delivery, and rendering system for immersive applications such as virtual, augmented, and mixed reality. To address the key challenge of stringent latency requirements for processing and streaming massive amounts of 3D data, MetaStream integrates several innovations into a holistic system, including dynamic camera calibration, edge-assisted object segmentation, cross-camera redundant point removal, and foveated volumetric content rendering. We implement a prototype of MetaStream using commodity devices and extensively evaluate its performance. Our results demonstrate that MetaStream achieves low-latency live volumetric video streaming at close to 30 frames per second over WiFi networks. Compared to state-of-the-art systems, MetaStream reduces end-to-end latency by up to 31.7% while improving visual quality by up to 12.5%.
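Among the techniques the abstract names, cross-camera redundant point removal is the most self-contained to illustrate. The paper's actual algorithm is not reproduced here; the following is a minimal sketch of the general idea, assuming a simple voxel-hashing scheme in which points from overlapping camera views that fall into the same spatial voxel are collapsed to one. All names and parameters (`dedup_points`, `voxel_size`) are illustrative, not MetaStream's API.

```python
# Hedged sketch: cross-camera redundant point removal via voxel hashing.
# Points from multiple cameras that land in the same voxel are considered
# redundant; the first camera to cover a voxel contributes its point.

def dedup_points(camera_clouds, voxel_size=0.01):
    """Merge per-camera point lists, keeping one point per voxel.

    camera_clouds: iterable of point lists, one list of (x, y, z)
    tuples per camera, ordered by camera priority.
    voxel_size: edge length (in meters) of the deduplication grid.
    """
    seen = set()
    merged = []
    for cloud in camera_clouds:
        for x, y, z in cloud:
            # Quantize the point to its voxel index.
            key = (int(x // voxel_size),
                   int(y // voxel_size),
                   int(z // voxel_size))
            if key not in seen:  # first camera to cover this voxel wins
                seen.add(key)
                merged.append((x, y, z))
    return merged
```

A real system would additionally weigh per-camera confidence or viewing angle when choosing which point to keep; this sketch keeps only the ordering heuristic for brevity.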
| Original language | English (US) |
|---|---|
| Pages (from-to) | 425-439 |
| Number of pages | 15 |
| Journal | Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM |
| DOIs | |
| State | Published - 2023 |
| Event | 29th Annual International Conference on Mobile Computing and Networking, MobiCom 2023, Madrid, Spain. Duration: Oct 2 2023 → Oct 6 2023 |
All Science Journal Classification (ASJC) codes
- Computer Networks and Communications
- Hardware and Architecture
- Software
Keywords
- live volumetric video streaming
- multi-camera calibration
- point cloud synthesis
- volumetric content rendering