TY - GEN
T1 - Displacement error analysis of 6-DOF virtual reality
AU - Aksu, Ridvan
AU - Chakareski, Jacob
AU - Velisavljevic, Vladan
N1 - Publisher Copyright:
© 2019 Association for Computing Machinery.
PY - 2019/9/9
Y1 - 2019/9/9
N2 - Virtual view synthesis is a critical step in enabling Six-Degrees-of-Freedom (6DoF) immersion experiences in Virtual Reality (VR). It comprises the synthesis of virtual viewpoints for a user navigating the immersion environment, based on a small subset of captured viewpoints featuring texture and depth maps. We investigate the extreme values of the displacement error in view synthesis caused by depth map quantization, for a given 6DoF VR video dataset, particularly based on the camera settings, the scene properties, and the depth map quantization error. We establish a linear relationship between the displacement error and the quantization error, scaled by the sine of the angle formed at the reference camera location between the object and the virtual view in the 3D scene. In the majority of cases, the horizontal and vertical displacement errors induced at a pixel location of a reconstructed 360° viewpoint comprising the immersion environment are proportional to 3/5 and 1/5 of the respective quantization error, respectively. Moreover, the distance between the reference view and the synthesized view severely increases the displacement error. Following these observations, displacement error values can be predicted for given pixel coordinates and a given quantization error, which can serve as a first step towards modeling the relationship between the encoding rate of reference views and the quality of synthesized views.
AB - Virtual view synthesis is a critical step in enabling Six-Degrees-of-Freedom (6DoF) immersion experiences in Virtual Reality (VR). It comprises the synthesis of virtual viewpoints for a user navigating the immersion environment, based on a small subset of captured viewpoints featuring texture and depth maps. We investigate the extreme values of the displacement error in view synthesis caused by depth map quantization, for a given 6DoF VR video dataset, particularly based on the camera settings, the scene properties, and the depth map quantization error. We establish a linear relationship between the displacement error and the quantization error, scaled by the sine of the angle formed at the reference camera location between the object and the virtual view in the 3D scene. In the majority of cases, the horizontal and vertical displacement errors induced at a pixel location of a reconstructed 360° viewpoint comprising the immersion environment are proportional to 3/5 and 1/5 of the respective quantization error, respectively. Moreover, the distance between the reference view and the synthesized view severely increases the displacement error. Following these observations, displacement error values can be predicted for given pixel coordinates and a given quantization error, which can serve as a first step towards modeling the relationship between the encoding rate of reference views and the quality of synthesized views.
KW - 6DoF
KW - Depth-image-based rendering
KW - Omnidirectional video
KW - View synthesis
KW - Virtual Reality
UR - http://www.scopus.com/inward/record.url?scp=85073360938&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85073360938&partnerID=8YFLogxK
U2 - 10.1145/3349801.3349812
DO - 10.1145/3349801.3349812
M3 - Conference contribution
AN - SCOPUS:85073360938
T3 - ACM International Conference Proceeding Series
BT - ICDSC 2019 - 13th International Conference on Distributed Smart Cameras
PB - Association for Computing Machinery
T2 - 13th International Conference on Distributed Smart Cameras, ICDSC 2019
Y2 - 9 September 2019 through 11 September 2019
ER -